:::Now, to try just once more to get over my other point to you:

:::The energies of molecular rotation and vibration at room temperature ''are'' kinetic, ''are'' thermal, and ''are'' quantised -- and the fact of that quantisation matters, because it directly affects the heat capacity associated with the degree of freedom.

:::I'm not talking about electrons here. I'm not talking about photons. I'm talking about the kinetic rotation energy of the molecules themselves, which is interchanged when they bash into each other.

:::Talking about temperature as related to the average energy per degree of freedom might be okay when introducing the subject to children; but it's not a good long-term standpoint, because there are too many systems for which it is not true, either because the density of states is not perfectly quadratic, or (as above) because it's quantised, with energy gaps of a size that means the quantisation cannot be ignored. A better long-term strategy is to think of temperature as (1/T) = dS/dE, because that's more fundamental, more general, and encompasses equipartition as a special case when the density of states is smooth and quadratic. [[User:Jheald|Jheald]] ([[User talk:Jheald|talk]]) 14:31, 8 December 2012 (UTC)

The article is about the Boltzmann constant, but Jheald writes about:-

::: ''if you've got a '''probability distribution''', it's got an '''entropy'''.''

The article is about the Boltzmann constant, which is the energy of a single particle per Kelvin, but Jheald writes about:-

::: ''That's how it fixes your units of entropy.'' As if entropy were a constant like the Boltzmann constant!

The article is about the Boltzmann constant, but Jheald writes about:-

:::''I'm not talking about electrons here. I'm not talking about photons. I'm talking about the kinetic rotation energy of the molecules themselves, which is interchanged when they bash into each other.''

Oh dear! Electrons? Photons? The kinetic rotation energy of the molecules? Yes Jheald, I know what you're talking about, and it isn't the Boltzmann constant!

:::''The energies of molecular rotation and vibration at room temperature ''are'' kinetic, ''are'' thermal, and ''are'' quantised -- and the fact of that quantisation matters, because it directly affects the heat capacity associated with the degree of freedom.''

Oh dear! Where is the Boltzmann constant in this?

Now I know that you have nothing to contribute to the article on the Boltzmann constant, what a shame!

And now I feel free to restore my contribution after your deletion. --[[User:Damorbel|Damorbel]] ([[User talk:Damorbel|talk]]) 18:00, 8 December 2012 (UTC)

Revision as of 18:00, 8 December 2012

WikiProject Physics: B‑class, Top‑importance
The First Line in the Article
The first line of the article introduces bulk (macroscopic) concepts not immediately related to the Boltzmann constant; I suggest that this is not appropriate in an encyclopedia article. Currently the first line of the article reads "The Boltzmann constant (k or kB) is the physical constant relating energy at the individual particle level with temperature observed at the collective or bulk level. It is the gas constant R divided by the Avogadro constant N_A."
Since the Boltzmann constant will shortly become the recognised basic physical constant, replacing the Kelvin, this should, at the very least, be recognised in a competent encyclopedia.
The errors propagate through the article. Section 1 states "Boltzmann's constant, k, is a bridge between macroscopic and microscopic physics." Its least mistake is "Boltzmann's constant" -- it should be "the Boltzmann constant", with no apostrophe-s. But the major error is the 'bridge between macroscopic and microscopic physics', which is not correct. The proper 'bridge between macroscopic and microscopic physics' is Maxwell-Boltzmann statistics, which takes into account that, in a perfect gas with random exchange of momentum, the particles will have a distribution of energies with an average energy equal to the total energy divided by the number of particles.
It is an important concept of physics that particle interactions take place at particle level, and the nature of the interaction is strongly related to the energy of the individual particle, i.e. the particle temperature. It is of course not easy, perhaps impossible, to measure the temperature of an individual particle directly, but it may well be inferred from the intensity with which, let us say, a chemical reaction takes place. --Damorbel (talk) 08:42, 6 October 2011 (UTC)
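(As an editorial aside, not proposed article text: the Maxwell-Boltzmann claim above -- that the average energy works out to the total energy divided by the number of particles, i.e. (3/2)kT per particle for a monatomic gas -- can be checked numerically. A minimal sketch; the argon mass is just an illustrative choice.)

```python
import math
import random

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # temperature, K
m = 6.6335e-26       # mass of an argon atom, kg (illustrative choice)

random.seed(0)
sigma = math.sqrt(k_B * T / m)   # std. dev. of each velocity component
N = 200_000

# Average kinetic energy over N particles with Maxwell-Boltzmann velocities
avg_E = sum(
    0.5 * m * (random.gauss(0.0, sigma) ** 2
               + random.gauss(0.0, sigma) ** 2
               + random.gauss(0.0, sigma) ** 2)
    for _ in range(N)
) / N

ratio = avg_E / (1.5 * k_B * T)
print(ratio)   # close to 1: the average energy per particle is (3/2) k_B T
```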
- Don't be pedantic; from your comments above you are obviously a single-issue wonk. 86.135.142.245 (talk) 02:47, 15 October 2011 (UTC)
User:86.135.142.245, care to explain more clearly why objecting to "Boltzmann's constant, k, is a bridge between macroscopic and microscopic physics." is pedantic? Just saying so helps nobody and doesn't help the article either. --Damorbel (talk) 19:39, 15 October 2011 (UTC)
- Temperature is necessarily a "macroscopic quantity", which is what "temperature observed at the collective or bulk level" means to say. Temperature must NECESSARILY be collected at the collective or bulk level, so I hope the sentence isn't taken to apply otherwise. Perhaps it should be rewritten to say that. I'll put it in and see if anybody objects. The Boltzmann constant k is microscopic merely because it's R per atom. It's really Avogadro's constant N_A which is the bridge between microscopic and macroscopic physics! SBHarris 22:12, 15 October 2011 (UTC)
Sbharris, you write: "Temperature is necessarily a "macroscopic quantity"". But the Boltzmann constant is 'energy per degree of freedom' -- of which a single atom has three; that is why a single atom has energy 3 × kT/2, i.e. 3/2 × 1.380 6488(13)×10⁻²³ J/K × T. I can see nothing in this that links these figures to a 'macroscopic level'. To have any meaning at the macroscopic level the entropy of the (bulk, macroscopic) system would need to be known or defined; if the entropy were not at a maximum, i.e. when the particles making up the system did not have a Maxwell-Boltzmann distribution, then it is not possible to define a temperature at the macroscopic level, because there would not even be a statistical distribution supporting a macroscopic temperature, whereas it is quite reasonable to assign temperatures at the particle level.
Further, why should Avogadro's Number play a role in the definition of temperature? Avogadro's Number merely defines the size of the macroscopic system. It is painfully simple to have systems with 50% of AN, 25% of AN, or even 0.000001% of AN, all with exactly the same temperature. There is no reason why a temperature cannot be defined for a single particle or, if you try really hard, for a single degree of freedom -- and, wonder of wonders, you have the Boltzmann constant. --Damorbel (talk) 09:59, 17 October 2011 (UTC)
- No, you don't. You can't talk about the temperature of a single atom, any more than you can talk about wetness or viscosity of a single water molecule. Wetness and viscosity are bulk properties. They have nothing to do with Avogadro's number, but both of them assume enough atoms to get good statistical properties from the collection. This is indeed a number much smaller than Avogadro's number N, but it must be larger than 1.
You can talk about the amount of heat in a bulk which can be mentally "assigned" to each atom, or "thermal energy per atom," but this is a bit like saying the average American family has 2.3 children. The actual energy per atom is quantized in whole numbers like the number of children in a family. You got a fraction only by dividing two totals, one of which has a distribution. Heat is like that. Temperature is something like "national wealth." It can be expressed in terms of "GDP" or "GDP per capita." But a single wage earner gets a salary-- not a GDP per capita. Don't be confused that both are measured in dollars. GDP per capita never does apply to a single wage earner, and temperature never applies to single particles.
Both R and k have units of specific entropy or specific heat capacity, which is (thermal) energy per degree kelvin. The first constant is appropriate to use per mole, the second is appropriately used per atom, since R = k·N_A. Thus, neither unit can be used to "replace" temperature, since both units assume that a temperature already exists. Neither R nor k gives you a measure of "thermal energy" unless first multiplied by a temperature. Both R and k are merely mediator scaling-constants between thermal energy and temperature. The various degrees of freedom are small numbers with no units, and since they have no units, they are thus not part of either k or R. Of course k or R must be multiplied by effective degrees of freedom before you get an answer to a heat capacity, but these degrees of freedom don't have to be whole numbers for the bulk, only for the individual particles. At low temperatures, heat capacities for a mole of substance can fall to a small fraction of RT, which means a tiny fraction of kT per atom. It could be 0.01 kT per atom, but that doesn't mean any atom has 0.01 degrees of freedom any more than a family has 2.3 kids. It means that 1% of atoms are excited into a state where one of their degrees of freedom contains a quantum of energy, from heat. What fraction of degrees of freedom participate, what fraction of atoms are excited in any way, and how many in more than one way, is a quantum discussion. See the Einstein and Debye sections of the heat capacity article, at the end. As well as their own articles. SBHarris 22:41, 27 October 2011 (UTC)
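(Editorial aside: the "small fraction of atoms excited" picture above can be made concrete with a two-level Boltzmann calculation. A sketch only; the gap of 5 k_B·T is an arbitrary illustrative choice, not a value from the discussion.)

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # temperature, K
dE = 5 * k_B * T     # energy gap, a few times k_B*T (illustrative choice)

# Boltzmann occupation of the upper level of a two-level system
boltz = math.exp(-dE / (k_B * T))
p_excited = boltz / (1.0 + boltz)

# Average thermal energy per atom, as a fraction of k_B*T
frac_of_kT = dE * p_excited / (k_B * T)

print(p_excited)    # under 1% of atoms hold a quantum of this energy
print(frac_of_kT)   # yet no single atom holds that *average* fraction
```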
The First Line in the Article (2)
User:Sbharris: This is in danger of becoming a (bad) example of a Wiki War if you keep changing my contributions; please check your user page.
You write "You can't talk about the temperature of a single atom, any more than you can talk about wetness or viscosity of a single water molecule. Wetness and viscosity are bulk properties" Do you mean that the temperature of a collection of atoms (molecules) depends on the number of atoms (molecules) it contains? I ask again, at what number of atoms (molecules) does their temperature begin to deviate from that of a sample containing one mole i.e. Avogadro's number, AN of particles?
You write "The actual energy per atom is quantized". Quantization effects are seen at energy levels related to the Planck constant, 6.626 069 57(29)×10⁻³⁴ J·s, about 10⁻¹¹ times the size of the Boltzmann constant; this is not really significant at thermal energies and way below that of any significant fraction of AN.
When you write about "Both R and k have units of specific entropy" and "Thus, neither unit can be used to "replace" temperature", you must understand that the Boltzmann constant will not 'replace' the Kelvin in the new definition of fundamental constants; rather, the Kelvin will be defined in terms of the Boltzmann constant, because it is now possible to determine the Boltzmann constant much more accurately than the Kelvin. The Kelvin is defined by the triple point of water, which is of limited accuracy since it is a function of the isotopic balance of the water (D2O freezes at a different temperature than H2O). --Damorbel (talk) 11:45, 30 October 2011 (UTC)
- "Quantization effects... [are] not really significant at thermal energies." Can I suggest you read Equipartition theorem#History, and then perhaps revise your statement. Heat_capacity#Theory_of_heat_capacity treats the material at greater length, though it is perhaps over-wordy, and could use some pictures drawn from real data. Fermi–Dirac statistics also discusses quantum effects at very everyday temperatures. Jheald (talk) 12:03, 31 October 2011 (UTC)
- Jheald, if you have something to say about quantum effects on thermal processes, then perhaps it would be useful if you could say just what it is that you are thinking of, merely giving a few links will only waste our time as we try to identify what is raising your concerns.
- Perhaps you are thinking of statements like this: "In general, for this reason, specific heat capacities tend to fall at lower temperatures where the average thermal energy available to each particle degree of freedom is smaller, and thermal energy storage begins to be limited by these quantum effects. Due to this process, as temperature falls toward absolute zero, so also does heat capacity." from your link http://en.wikipedia.org/wiki/Heat_capacity#Theory_of_heat_capacity The effect referred to is one that is seen with very large temperature excursions and is often described as certain modes of vibration being 'locked out' because they appear to have a threshold below which they neither absorb nor release energy. Thanking you in advance. --Damorbel (talk) 17:52, 31 October 2011 (UTC)
- One important point is that the magnitude of the Planck and Boltzmann constants cannot be compared since they are measured using different units. Dauto (talk) 19:28, 31 October 2011 (UTC)
- Joules per Kelvin and Joules per Hertz - both are measures of energy density, both are a measure of particle energy, the Boltzmann constant is a measure of mechanical particle energy and the Planck constant the measure of photon (electromagnetic) energy - both are treated as particles because that is what experimental science shows them to be. --Damorbel (talk) 21:20, 31 October 2011 (UTC)
- That misses the point. In order to get units of energy, you must multiply h by frequency f (often a very large number for atoms and atom vibrations), to get energy k must only be multiplied by T (a very much smaller number).
In fact, if you want to know where a solid departs from the classical Dulong-Petit law limit of heat capacity of 3R per mole, you can examine a characteristic temperature at which quantum effects become very important. In the Einstein solid theory that is the so-called Einstein temperature: T_E = hf/k. Notice that the frequency f offsets the very large ratio of h/k. The departure from Dulong-Petit at room temperature is a function of the dimensionless ratio x = T_E/T. The factor by which this corrects Dulong-Petit is x^2 * [e^x/(e^x-1)^2]. Einstein temperatures vary from 85 K or so (lead) to thousands of degrees K for tightly-bonded solids like carbon. For beryllium, T_E is ~690 K. For beryllium at 300 K the factor comes out to about 0.65, which predicts Be heat capacity at room temp will be only ~65% of 3R (it's measured at about 60%-66% of 3R, depending on source). For diamond, whose Einstein temperature is well into the thousands of kelvins, you get roughly 18% of 3R by calculation (actual measured is 24% of 3R). So quantum effects here are extreme, cutting solid heat capacities at room temp to a fraction of most other solids. Room temperature is not a "very large temperature excursion."
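(Editorial aside: as a numerical check, here is the standard Einstein-model heat-capacity factor x^2 * e^x/(e^x-1)^2, with x = T_E/T, evaluated for the solids mentioned above. A sketch; the diamond Einstein temperature of ~1450 K is an assumed ballpark value, not quoted in the discussion.)

```python
import math

def einstein_factor(T, T_E):
    """Einstein-model heat capacity as a fraction of the Dulong-Petit 3R."""
    x = T_E / T
    return x ** 2 * math.exp(x) / (math.exp(x) - 1.0) ** 2

# Beryllium, using the T_E ~ 690 K quoted in the discussion
be = einstein_factor(300.0, 690.0)

# Diamond: T_E ~ 1450 K is an assumed, commonly quoted ballpark value
diamond = einstein_factor(300.0, 1450.0)

print(be)        # roughly 0.65 of 3R
print(diamond)   # roughly 0.19 of 3R
```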
Similarly, for diatomic gases at room temp the vibrational reduced temperature is so high (thousands of degrees) that nearly all the vibrational heat capacity is not seen. Oxygen shows only 13% of its theoretical vibrational heat capacity degree of freedom at room temp, so has a constant-volume heat capacity much closer to 5/2 R per mole than 7/2 R per mole. Damorbel is simply wrong -- quantization effects ARE important at room temp -- in fact for gases they are usually extreme, and for some solids composed of light, well-bonded atoms, too.
As for the issue that it's easier to determine the "Boltzmann constant" than the Kelvin, that depends on how you do it. The Kelvin in the future may well be rescaled using energy plus k_B, rather than scaling the Kelvin using gas pressures and the triple point of water. However, I don't see the point. You have to pick an energy scale and a temperature scale, and there will always be a constant that relates the two, if only to change the units: E = constant × T. That constant is some simple number times R or k_B. If you pick any two values, the third is determined, so it makes sense to pick the two things you can most easily measure, and let them determine the third. You "measure" k_B only if you have scaled the Kelvin and energy already, but that's overdetermined. If you haven't scaled the Kelvin, then you can measure the value of the Kelvin with regard to the joule, by fixing k_B at some arbitrary value (so it will no longer have a measurement variation). We've done exactly that with our standard for length, which is now determined by frequency (our time standard) and the speed of light (now fixed at an arbitrary exact value and no longer measured), since time is measurable to higher precision than the distance between marks on a bar of metal, and the speed of light is just a scaling constant between length and time. So? When this happens with temperature, the value of k_B (the Boltzmann constant) will be exact and fixed, as the speed of light is now. We will no longer "measure" it -- rather we will "measure" the Kelvin using our energy standard. SBHarris 21:32, 31 October 2011 (UTC)
The First Line in the Article (2)(Sbharris)
This new section is needed because the last contribution to the section "The First Line in the Article (2)" adds a lot of material that is really a long way from the fundamentals of the Boltzmann constant.
User:Sbharris; in the first line of your last contribution you write "[in] order to get units of energy, you must multiply h by frequency f (often a very large number for atoms and atom vibrations)". The Planck constant is about the vibration of electric charge, not atoms. Photons interact minimally with atoms having little or no dipole moment (a dipole moment is a manifestation of electric charge). Charge (or a dipole moment) is how the energy of photons is converted back and forth to/from mechanical energy, i.e. heat. These concepts do not require the Dulong-Petit law to understand them. The Dulong-Petit law is relevant at the bulk (macroscopic) level but not at the microscopic or particle level; the Boltzmann constant is not affected by the considerations of heat capacity that the Dulong-Petit law deals with, so the Dulong-Petit law is quite irrelevant to the Boltzmann constant.
User:Sbharris, you write "Damorbel is simply wrong-- quantization effects ARE important at room temp". But do they impact on the Boltzmann constant? Further you write "Oxygen only has 13% of its theoretical heat capacity at room temp, so has a constant volume heat capacity much closer to 5/2 R per mole", which may well be true but, once more, it is quite irrelevant to the Boltzmann constant and has no place in an article, or even a discussion, about the Boltzmann constant.
Yet further you write "As for the issue that it's easier to determine the "Boltzmann constant" than the Kelvin, that depends on how you do it." You may well have a good point, but are you not aware of the current progress in determining the Boltzmann constant? Check this link: Big Step Towards Redefining the Kelvin: Scientists Find New Way to Determine Boltzmann Constant. Wikipedia can surely record the changes that are taking place? The old method of standardising temperature with the triple point of water is known to have limitations; the better standard is now generally accepted to be the value of the Boltzmann constant.
The fact is that the International Committee for Weights and Measures (CIPM) is adopting the Boltzmann constant as the most accurate unit which will make the Kelvin a derived unit, see here Preparative Steps Towards the New Definition of the Kelvin in Terms of the Boltzmann Constant, all of this should be in Wikipedia, don't you think?
I suggest that if you can discover a more accurate method of defining the Kelvin then the situation will be reversed by CIPM. But until that happens I suggest the new situation should be reflected in the Wiki article on the Boltzmann constant. --Damorbel (talk) 22:34, 31 October 2011 (UTC)
- I am well aware of people attempting to use energy-dependent methods to redefine the temperature scale, for example this one. Yes, they should be mentioned in this article, since if any one of them is adopted to standardize the Kelvin, then the Boltzmann constant will be fixed and no longer "measured", like the speed of light (which is no longer measured, but has an exact value). However, that hasn't happened yet.
As to your bizarre ideas that Planck's constant only has to do with charge, you need to read some physics books. Here on WP, there is matter wave, which discusses interference of neutral atoms and even molecules, each of which has a wavelength set by h/momentum. As for the relevance of k_B and h to the heat capacity of solids, you're the one making the bizarre claims, not me. The Dulong-Petit law gives heat capacities in terms of R, which is N_A·k_B. The importance of k_B is not in the Dulong-Petit law, but in the way that heat capacities depart from it at low temperatures, and in different ways for different substances. Why don't you actually read Einstein solid and Debye model? See if you can derive either of them without using h or k_B. They have nothing to do with photons or with dipole moments of atoms, and would work equally well if atoms were little inert vibrating spheres with not a trace of charge.
Does the heat capacity of gases have any relevance to k_B? Of course. The entire kinetic theory of gases is based upon k_B (you do understand where this constant came from historically?), and gas heat capacity is only part of that larger theory. One cannot explain why the heat capacity of gases behaves as it does without invoking k_B, and this theory is a quantum theory at low temperatures (where you must invoke h), and yet it still has nothing at all to do with photons or dipole moments. In particular, the "freeze out" of heat capacity at lower temperatures (well on the way by room temperature) cannot be calculated without a theory that uses both h and k_B. Again, the quantum kinetic theory of gases would work just as well if the gases were composed of little ideal atoms with no charge, rotating around bonds, vibrating, and bouncing off each other and the container elastically, with each atom behaving like a tiny Superball on a spring (if bonded), with no charge at all. SBHarris 23:49, 31 October 2011 (UTC)
- The vibration modes of molecules actually give quite a good example for getting a feel for what kB is all about.
- For example, let's work through the numbers for the fundamental CO2 bending mode. It's got a characteristic frequency of 667 cm⁻¹ (in spectroscopists' units).
- That implies a corresponding energy gap of ΔE = hcν̃,
- i.e. 6.6 × 10⁻³⁴ × 3.0 × 10⁸ × 66700 = 1.3 × 10⁻²⁰ J
- This is the characteristic microscopic energy kT corresponding to a temperature that we can find by dividing through by k, i.e. 1.3 × 10⁻²⁰ / 1.38 × 10⁻²³ K, so about 1000 K.
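- (Editorial aside: for anyone following along, the arithmetic above can be reproduced directly. A sketch using CODATA constant values.)

```python
h = 6.62607015e-34    # Planck constant, J*s
c = 2.99792458e8      # speed of light, m/s
k_B = 1.380649e-23    # Boltzmann constant, J/K

nu_tilde = 667.0e2    # 667 cm^-1, converted to m^-1

E_gap = h * c * nu_tilde   # energy gap of the bending mode, J
T_char = E_gap / k_B       # characteristic temperature, K

print(E_gap)    # about 1.3e-20 J
print(T_char)   # roughly 960 K, i.e. the "about 1000 K" above
```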
- That tells us that at around 1000 K this vibration mode will be really starting to kick in, as shown in the schematic curve at Equipartition theorem#History. Much below this temperature most of the CO2 molecules will be in the ground state, so none of the thermal energy of the system will be in this vibration mode. At a much higher temperature, the various CO2 molecules will be spread in a distribution right up the ladder of vibrational states. This temperature corresponds to the cross-over between those two regimes, where a plurality of molecules in that CO2 gas are starting to get knocked into those first vibrational levels.
- This is the right way to think about how a particular microscopic energy can be related to a particular macroscopic temperature.
- But let's turn it around for a moment. Suppose you find a CO2 molecule in that first vibrational level, corresponding to an energy of 1.3 × 10⁻²⁰ J. Can you tell the temperature of the gas it came from? The answer is no, you can't, not with any great precision. The temperature of the gas might be quite low, but that molecule happens to be one of the few that has got the energy to get out of the ground state. Or the temperature of the gas might be quite high, but that molecule happens to be one of the ones with comparatively little energy compared to the average. This, I think, is the key point that SBHarris has been trying to make to you: for a single molecule, the temperature simply isn't well defined.
- We can take this further. In terms of classical thermodynamics, the zeroth law of thermodynamics says that two systems are at the same temperature if they would be in thermal equilibrium when put in thermal contact with each other -- i.e. if they were free to exchange energy with each other. This is why it makes no sense to talk about a single molecule with a particular energy to have a particular temperature -- because to have a temperature, the molecule must be free to exchange energy. You can (sometimes) talk about a molecule that has a particular distribution of energy over a period of time as having a particular temperature -- because it is constantly exchanging energy over that time, sometimes having more energy, sometimes having less energy. Or you can talk of a set of molecules that has a range of energies having a particular temperature -- if the set is sufficiently large that the amount of energy being exchanged hardly affects the overall distribution of molecular energies. But you can't talk about a single molecule with a particular energy having a particular temperature -- it doesn't, its temperature simply isn't well defined.
- We can also think in terms of statistical thermodynamics, where temperature is defined as 1/(dS/dE). But the statistical thermodynamic entropy S = - k Σ p ln p: it is a property of a distribution over which probabilities can be defined. A single molecule in an explicitly given state in fact has zero entropy: its state is exactly defined. It is only when we allow there to be a probability distribution for the molecule's state; or, alternatively, a large number of molecules with energy that can be spread between them in so many different ways, that it becomes possible to have a statistical entropy, and so any chance of a well-defined temperature.
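- (Editorial aside: the definition T = 1/(dS/dE) above can be checked numerically against the Gibbs entropy S = -k Σ p ln p. A sketch, assuming a Boltzmann distribution over a simple ladder of equally spaced levels, with the spacing borrowed from the CO2 example above.)

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
eps = 1.3e-20        # level spacing, J (roughly the CO2 bending mode above)

def entropy_and_energy(T, n_max=200):
    """Gibbs entropy S = -k sum p ln p and mean energy of a Boltzmann
    distribution over a ladder of equally spaced levels (truncated)."""
    weights = [math.exp(-n * eps / (k_B * T)) for n in range(n_max)]
    Z = sum(weights)
    probs = [w / Z for w in weights]
    S = -k_B * sum(p * math.log(p) for p in probs if p > 0.0)
    E = sum(n * eps * p for n, p in zip(range(n_max), probs))
    return S, E

# Numerically check that dS/dE = 1/T at T = 1000 K
T = 1000.0
S_lo, E_lo = entropy_and_energy(T * 0.999)
S_hi, E_hi = entropy_and_energy(T * 1.001)
slope = (S_hi - S_lo) / (E_hi - E_lo)

print(slope * T)   # close to 1: dS/dE really is 1/T
```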
- The right way to think about the connection between T and kT is therefore to think of kT as an energy that is in some way characteristic of a temperature T. The wrong way to think about the connection is to think of a particle having an energy E having an associated temperature E/k -- temperature is a property of distributions, of macroscopic (or at least mesoscopic) systems; not of single particles.
- Finally, you keep returning to the CIPM proposing to use k to formally define the Kelvin. That is certainly something we probably should mention in the article. But probably fairly well down the piece, because it is something that really only starts to be meaningful to somebody once they already have a pretty good grasp of what k is -- and what a Kelvin is. The CIPM is also currently considering formally defining the kilogram in terms of Planck's constant, the Josephson constant and the von Klitzing constant from the quantum Hall effect. But that is something we (rightly) only touch on at the very end of our article on Planck's constant. Jheald (talk) 01:41, 1 November 2011 (UTC)
Sbharris, you write: "I am well aware of people...". Do you consider the CIPM as just 'people'? Then these are the 'people' who chose the Kelvin defined by the triple point of water, and now they are the 'people' who will make the Kelvin a unit derived from the Boltzmann constant; the Boltzmann constant will now become the fundamental constant, not the Kelvin -- note the Wiki article is entitled 'Boltzmann constant'. Surely the fact that it is internationally accepted as (or may well become accepted as) a fundamental physical constant should be in the introduction.
You write further "The right way to think about the connection between T and kT is therefore to think of kT as an energy that is in some way characteristic of a temperature T". Do you mean by this 'some way' that E = kBT is somehow imprecise, and that the various parts of the equation -- E, kB and T -- are somehow incomplete or imprecise? E = kBT is complete; it applies to particles in every kind of system, not just to the single kind of system defined by the Maxwell-Boltzmann distribution that you keep introducing.
Again you write "We can take this further. In terms of classical thermodynamics, the zeroth law of thermodynamics says that two systems are at the same temperature". Why are you introducing 'systems'? Not just 'systems' but 'systems' in equilibrium, i.e. 'systems' to which a single temperature can be assigned. None of this has anything to do with the Boltzmann constant, which is 'energy per degree of freedom'; what you are introducing is the behaviour of systems of very many particles that are interacting in a random way that gives the Maxwell-Boltzmann distribution of particle energy. But these conditions exist only in a gas defined as being in thermal equilibrium, a very rare condition indeed. There are many particles in the universe that are not part of a gas and certainly do not have a single assignable temperature at the macroscopic level; the Boltzmann constant, properly defined, is equally relevant to these systems. The Boltzmann constant is also used in calculating the forward voltage and the reverse current of p-n junctions and their thermal voltage.
You further emphasise your macroscopic definition of temperature (which is correct at equilibrium) by stating "think in terms of statistical thermodynamics, where temperature is defined as 1/(dS/dE)." This is only true when the entropy S is at a maximum, which is another way of saying the system is in equilibrium. If the system (of particles) is not in equilibrium, what then does your definition of temperature T = 1/(dS/dE) give as 'the temperature T'? It doesn't exist, does it? This is the reason why macroscopic matters have only a minimal place in an article entitled 'Boltzmann constant'.
Introducing the fact that "The CIPM is also currently considering formally defining the kilogram in terms of Planck's constant, the Josephson constant and the von Klitzing constant from the quantum Hall effect" is quite irrelevant. Physical constants are just that, constant. Fundamental physical constants are called 'fundamental' because they can be independently defined, i.e. their size is unrelated to other constants, so why cite the CIPM activity on these matters as relevant to the Boltzmann constant? --Damorbel (talk) 10:16, 1 November 2011 (UTC)
- Actually, if you read what I was talking about above, I was talking about the thermal excitation of a vibration mode. That is something different to the distribution of molecular speeds which is given by the Maxwell-Boltzmann distribution.
- To pick up some of your points:
- The Boltzmann constant is "energy per degree of freedom". This (E = ½ kT) is only true when the equipartition of energy holds. As has been repeatedly pointed out to you, equipartition breaks down when modes get frozen out if the quantum energy level spacing is too big. So any general statement like this needs to be taken with considerable caution. Furthermore, equipartition is about average energy per degree of freedom. It applies when there is a probability distribution of energies -- either for a single particle over time, or for a collection of particles.
- Similarly, as I have explained above, temperature is a property of things in thermal equilibrium -- things which can exchange energy with their surroundings. A single molecule in a particular state does not have a temperature. Temperature is not a property of states of single molecules. You cannot just calculate the energy and divide by k. This is not temperature.
- Do you understand this? Jheald (talk) 11:41, 1 November 2011 (UTC)
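The distinction drawn in this post between a molecule's instantaneous kinetic energy and the temperature of the distribution can be illustrated numerically. A sketch, assuming an ideal monatomic gas (helium mass used for concreteness; the ratios are mass-independent) and sampling a Maxwell-Boltzmann velocity distribution with numpy:

```python
import numpy as np

k = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # temperature of the distribution, K
m = 6.646e-27      # mass of a helium atom, kg (approximate)

rng = np.random.default_rng(0)
# In a Maxwell-Boltzmann distribution each velocity component is Gaussian
# with variance kT/m.
v = rng.normal(0.0, np.sqrt(k * T / m), size=(100_000, 3))
ke = 0.5 * m * (v ** 2).sum(axis=1)   # kinetic energy of each molecule

# Individual molecules have wildly different kinetic energies...
print(ke.min() / (k * T), ke.max() / (k * T))
# ...but the *average* obeys equipartition: <KE> = (3/2) k T.
print(ke.mean() / (k * T))   # close to 1.5
```

One temperature characterises the whole distribution, while the per-molecule energies spread over orders of magnitude.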
- Further you wrote "The Boltzmann constant is "energy per degree of freedom". This (E = ½ kT) is only true when the equipartition of energy holds" By this do you mean that 'equipartition ... etc." is relevant to the Boltzmann constant? That the meaning and calculation of the Boltzmann constant is meaningless without equipartition? That the energy in atoms and molecules has nothing to do with the Boltzmann constant unless there is equipartition of energy? If you believe this, how can you do gas dynamics when the gas is accelerating and thus not in equilibrium? --Damorbel (talk) 15:09, 1 November 2011 (UTC) — Preceding unsigned comment added by Damorbel (talk • contribs)
This debate could go on forever, I don't think Damorbel will ever be convinced and it is pointless to continue trying to do so. It is not Wikipedia's job to teach Damorbel. Wikipedia is based on sources, not written by working through theory ourselves. There is no shortage of quality physics textbooks which specifically and directly state that the temperature of a single molecule is meaningless. Unless Damorbel is able to produce more reliable sources to contradict this it is pretty much settled by the sources - it can't go in the article, period. SpinningSpark 13:02, 1 November 2011 (UTC)
== The First Line in the Article (3) ==
Spinningspark, you wrote:- citing your Google search for 'temperature of a single molecule' "the temperature of a single molecule is meaningless " But you don't actually cite a single document of any kind!
- Just click on the first three or four results, they are all relevant. SpinningSpark 16:31, 1 November 2011 (UTC)
" Just click on the first three or four results, they are all relevant. But Spinningspark just giving the links is no contribution, you must also explain. Up until now I have not read anything relevant in what you write. Why are these links of yours relevant? I have looked at them and they do not have anything of particular interest to say. --Damorbel (talk) 17:00, 1 November 2011 (UTC)
- (ec) FFS. Did you notice at all the page after page of hits that SpinningSpark's search brings back?
- James Jeans (1921), The Dynamical Theory of Gases, p. 125. "It is of the utmost importance to notice that, for the kinetic theory, temperature is a statistical conception ; it is meaningless to talk of the temperature of a single molecule."
- W.C.M Lewis (1924), A system of physical chemistry, "The temperature, in fact, is determined by the average kinetic energy. It is therefore meaningless to speak of the temperature of a single molecule in a gas."
- Zhu (2003), Large-scale inhomogeneous thermodynamics, p. 88 "Since... the temperature of a single molecule is meaningless..."
- Gemmer et al (2004), Quantum thermodynamics: emergence of thermodynamic behavior within composite systems, p. 209. "It thus appears meaningless to talk about the temperature of an individual particle..."
- William A. Blanpied (1969), Physics: its structure and evolution, "It is quite meaningless to speak of the temperature of a single molecule..."
- Henry O. Hooper, Peter Gwynne (1977), Physics and the physical perspective, "The concept of temperature is plainly a statistical one that depends on large numbers of molecules. The idea of the temperature of a single molecule is meaningless;"
- Hugo O. Villar (1994), Advances in Computational Biology, "The temperature of two particles, for instance, is physically meaningless."
- A. D'Abro (1939), The decline of mechanism: (in modern physics), p. 42 "But when we pass to the level of molecular dimensions, the entire concept of temperature becomes meaningless. There is no sense in speaking of the temperature of a single molecule, for a large number of molecules in motion is necessary to give meaning to temperature."
- Henry Margenau (1982) "Physics and Reductionism", in Joseph Agassi, Robert Sonné Cohen (eds), Scientific philosophy today: essays in honor of Mario Bunge, p. 190. "the 'higher level' observables are meaningless at the lower level; entropy and temperature mean no more for a single molecule than volume does with respect to a plane figure".
- Robert L. Sells (1965) Elementary classical physics, "It is meaningless to speak of the "temperature" of a single molecule or of a small number of molecules"
- Charles Hirsch (2007) Numerical computation of internal and external flows Vol 1, p. 22. "... the temperature, or pressure, or entropy of an individual atom or molecule is not defined and generally meaningless".
- R.J.P. Williams, quoted in Gregory Bock, Jamie Goode (1998) The limits of reductionism in biology, p. 129 "Entropy is not reducible to a property of a molecule. Thus random kinetic energy of a large number of molecules is described by temperature, which is very different from kinetic energy of a single molecule."
- Meghnad Saha, B. N. Srivistava (1931, 1958) A text book of heat "It is meaningless to talk of the pressure exerted by a single molecule, or of the temperature of a single molecule."
- Paul Karlson, A. E. Fisher (1936), The world around us: a modern guide to physics, "Just as it was meaningless to apply the notion of temperature to a single molecule... "
- Alistair I.M. Rae (1994, 2004), Quantum physics, illusion or reality?, "just as meaningless... as it is to talk about the temperature of a single isolated particle."
- Michel A. Saad (1966), Thermodynamics for engineers, "It should be remarked that reference to properties, such as temperature or pressure, apply to a large number of molecules and it is meaningless to refer to such properties for a single molecule."
- etc, etc.
- Your original complaint was with the phrase "temperature, which must necessarily be observed at the collective or bulk level." These quotations more than justify that phrase. Jheald (talk) 17:08, 1 November 2011 (UTC)
Jheald, your citations are about entropy, pressure, or they do not refer to the Boltzmann constant. For example, "Entropy is not reducible to a property of a molecule". Of course it isn't. Of necessity, when referring to entropy, you are talking about a large number of molecules interacting in a random way, which, at equilibrium, have a Maxwell-Boltzmann distribution; this is not a requirement for the Boltzmann constant, it has no role in the definition of the Boltzmann constant, yet you do not appear to recognise the fact. Neither do you recognise the role of the Boltzmann constant in the redefinition of the Kelvin. You cite James Jeans "for the kinetic theory, temperature is a statistical conception" Yes of course it is, because Kinetic theory is about large numbers of particles interacting by random elastic collisions; the actual energy of the particles is not the concern of Kinetic theory. Kinetic theory relies on the particles having a Maxwell-Boltzmann distribution, which by definition assumes that the particles do not have the same temperature; this assumption does not mean that they can't have the same temperature, just as electrons in a beam do.
And with Lewis "W.C.M Lewis (1924), A system of physical chemistry, "The temperature, in fact, is determined by the average kinetic energy. It is therefore meaningless to speak of the temperature of a single molecule in a gas" Notice he says 'System', thus multiple interacting particles. Did you notice 'average kinetic energy'? And 'in a gas'? It is well known that the particles in a gas (at equilibrium) have the Maxwell-Boltzmann distribution of velocities; therefore, with their different velocities, they have all got different temperatures; temperatures that are changing when the molecules collide. Temperature is the measure of energy in an atom (molecule or degree of freedom). --Damorbel (talk) 18:05, 1 November 2011 (UTC)
- I'm sorry, but you haven't got a clue; it's become clear that it's a waste of time engaging with you, because you appear simply to be incapable of understanding when you haven't got a clue; and evidently you have absolutely no interest in acquiring a clue.
- For the last time, molecules don't have a temperature -- it is distributions of molecules that have a temperature. A Maxwell-Boltzmann distribution of molecular velocities is characterised by a particular temperature. Individual molecules are not. Asserting that the individual molecules in a Maxwell-Boltzmann distribution each have different temperatures because they have different velocities shows you simply don't get it (as the above citations make very clear).
- Now, as SpinningSpark noted above, and as I also put to you on the 15 September, it is not WP's job to teach you physics, nor -- see WP:TALK -- are the talk pages here to try to straighten out your personal misconceptions. If you can find what WP would consider a reliable source that contests this article's proposition that "temperature ... must necessarily be observed at the collective or bulk level", then bring it on. Otherwise this discussion is at an end. Jheald (talk) 19:48, 1 November 2011 (UTC)
Further you wrote "Unless Damorbel is able to produce more reliable sources to contradict this it is pretty much settled by the sources - it can't go in the article, period." Now please explain, if the Boltzmann constant, with a value of 1.380 6488(13)×10⁻²³ J/K, is not the energy in a single degree of freedom, just what is it? Even in its present form the article says :- "The Boltzmann constant (k or kB) is the physical constant relating energy at the individual particle level with temperature". Now if you disagree with this well and good. But then perhaps you do not agree with the revisions to the fundamental physical constants, including the Boltzmann constant, currently being proposed by the CIPM? Wouldn't it still be a valid contribution to Wikipedia to draw the attention of users to the reasons why these changes are being considered? --Damorbel (talk) 15:56, 1 November 2011 (UTC)
- Enough enough enough. This has filled up the talk page to no end. Let me repeat what I said to you on my talk page, talk pages are for discussing improvements to the article, not discussing the subject itself. Any further posts of this nature will be deleted without comment. See WP:TALK. SpinningSpark 16:31, 1 November 2011 (UTC)
Again SpinningSpark you write:- "talk pages are for discussing improvements to the article". Yes they are. And do you not think that new ways of determining the Boltzmann constant would improve the article? Such as [1] and this [2], do pay attention to the title of the article, it is "Boltzmann constant"! --Damorbel (talk) 17:00, 1 November 2011 (UTC)
Here is a good article on just how the measurement of the Boltzmann constant is being improved using Johnson Noise Thermometry (JNT) --Damorbel (talk) 17:06, 1 November 2011 (UTC)
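Johnson noise thermometry rests on the Nyquist relation ⟨V²⟩ = 4 k T R Δf, which ties a directly measurable voltage to kT. A sketch with illustrative (not measured) component values:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K

def johnson_noise_vrms(T, R, bandwidth):
    """RMS Johnson-Nyquist noise voltage across a resistor of R ohms
    at temperature T (kelvin), measured over the given bandwidth (Hz)."""
    return math.sqrt(4.0 * k * T * R * bandwidth)

# Illustrative values: a 1 kOhm resistor at 300 K over a 10 kHz bandwidth.
v = johnson_noise_vrms(300.0, 1.0e3, 1.0e4)
print(v)   # ~4.1e-7 V, i.e. about 0.41 microvolts RMS
```

Measuring such a voltage against a quantum voltage standard is what lets JNT determine k (or, after redefinition, the temperature) without a gas.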
Jheald you write (above) "molecules don't have a temperature -- it is distributions of molecules that have a temperature" How so? Is it not the particles (atoms and molecules) that have the kinetic energy? And does not the formula E = kBT connect that energy and temperature? Are you saying that E = kBT is wrong? --Damorbel (talk) 21:34, 1 November 2011 (UTC)
- This discussion raised at WT:PHYSICS. Jheald (talk) 22:38, 1 November 2011 (UTC)
- Let's consider an object in space. The object may have a temperature, and it may also have some speed. Since it is in space, it is possible to change its speed without changing its temperature, and to change its temperature without changing its speed. It is this independence of temperature and speed that is behind many of the comments above. In the case of a single molecule, is its speed related to its translational kinetic energy or to its temperature? In this case, the various vibrational modes are clearly related to the energy stored in the molecule, but assuming that the translational kinetic energy is somehow related to a temperature just does not make sense. When a "large number" of molecules are considered, temperature refers to the energy contained in the large number of molecules and excludes any energy that causes the entire mass of molecules to move from one place to another.
- Within an object at a given temperature, each atom (or molecule in a gas) can have a different velocity (vector quantity) and speed (scalar quantity). This does not mean that each atom (or molecule) has a different temperature. Temperature is a property of the object as a whole. This is why I think other editors are disagreeing with you. Q Science (talk) 23:22, 1 November 2011 (UTC)
- Q Science, the question to Jheald was "Are you saying that E = kBT is wrong?" Up until now Jheald and others have tried to focus the discussion on the connection between the Boltzmann constant and gases. In gases energy is freely exchanged between particles by collision according to Kinetic theory. But gases are only one state of matter; the other states matter can have are solid, liquid and plasma. But the Boltzmann constant applies to microscopic particles in general, even electrons (conduction of heat in metals is largely by electrons, insulators do not in general conduct heat well); the temperature of electrons is also an important factor in gas discharges. The idea that the concept of temperature is restricted to gas volumes at thermal equilibrium is quite wrong and any article about the Boltzmann constant should record this. --Damorbel (talk) 07:02, 2 November 2011 (UTC)
- Yes. The relation E = kBT is wrong, if by E you mean the energy associated with a particular degree of freedom at a particular time.
- (1) Even when equipartition holds, the relation should be E = ½ kBT
- (2) Even in that case, E is not the energy associated with a particular degree of freedom at a particular time. E is the average energy -- either an average over time, or an average over all similar such degrees of freedom.
- (3) Contrary to your assertion that we have only been considering the translational energy of gas molecules, for which equipartition might hold at room temperature, both SBHarris and I have given you a number of examples where equipartition does not hold at room temperature, ie for which E does not equal ½ kBT. For example, the vibrational bending mode of a CO2 molecule. For that degree of freedom E = ½ kBT, where E is the average energy associated with that mode, only holds at temperatures significantly above 1000 K. As for solids, SB Harris gave you [3] the examples of Beryllium and Diamond, where average E per degree of freedom is very much less than ½ kBT, reflected in their heat capacity being very much less than 3R. You want to think about electrons? Good. That's discussed at Fermi–Dirac statistics in the section Quantum and classical regimes. Again, at room temperature, the average contribution to the heat capacity dE/dT per electron in a metal per d.o.f. is far far less than ½ kB. It is not until you get to a plasma environment like the surface of the sun that electrons start to make this kind of contribution to the heat capacity, i.e. that their average energy per degree of freedom becomes comparable with ½ kBT.
- So: in general it is not true that E = ½ kBT. Jheald (talk) 09:30, 2 November 2011 (UTC)
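The freeze-out described in point (3) can be made quantitative with the quantized harmonic oscillator (Einstein) model. A sketch, assuming the CO2 bending mode near 667 cm⁻¹, which gives a characteristic temperature of roughly 960 K:

```python
import math

k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J s
c = 2.99792458e10   # speed of light, cm/s

wavenumber = 667.0              # CO2 bending mode, cm^-1 (approximate)
theta = h * c * wavenumber / k  # characteristic temperature, ~960 K

def mean_mode_energy_over_kT(T):
    """Average thermal energy of a quantized oscillator mode, in units of kT.

    Classically a full vibrational mode (kinetic + potential) would carry
    exactly 1.0 kT; quantisation suppresses this when T << theta.
    """
    x = theta / T
    return x / math.expm1(x)   # (h*nu/kT) / (exp(h*nu/kT) - 1)

print(round(theta))                       # ~960 K
print(mean_mode_energy_over_kT(300.0))    # ~0.14: mode largely frozen out
print(mean_mode_energy_over_kT(5000.0))   # ~0.91: approaching classical kT
```

At room temperature the mode holds only about 14% of its classical share of energy, which is why a blanket "E per degree of freedom = ½ kT" fails here.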
- I can remark that temperatures of objects behave very much like their invariant masses, and for the same intuitive reasons you give: even if you increase the speed of an object (as would be seen by a moving observer), it keeps the temperature it had in its rest frame. So temperature is also Lorentz invariant. But what about the entropy of a moving object? Consensus seems to be that it increases to keep dE/dS constant, since T stays the same and dE/dS = T. The E here is not the rest energy but the total relativistic energy, so if it changes (which it does, as it's not invariant) then S must change in the same way to keep T constant. For a history of this argument see: [4]. The consensus also seems to be that in the CM frame for systems, entropy is conserved AND invariant, the same as mass, since adding energy in the CM frame also results in adding the same amount of invariant mass and system-rest-energy (in the CM frame they are all the same). SBHarris 00:19, 2 November 2011 (UTC)
Here's a completely brain-dead way to shoot down Damorbel's argument. If you define the energy of the particles of an ensemble to be the kinetic energy in the frame of reference of the center of mass of that ensemble, then for a large ensemble you get nothing out of the ordinary. But for an ensemble consisting of a single particle, you get an energy of zero, since a particle in isolation has zero velocity relative to its center of mass. --Vaughan Pratt (talk) 09:34, 2 November 2011 (UTC)
- Vaughan Pratt, you write:- "consisting of a single particle, you get an energy of zero, since a particle in isolation has zero velocity relative to its center of mass." You do? And the particle's energy from vibrations and rotations wrt its centre of mass is also zero? Brain dead, I suppose!
- Oh btw, the matters I am dealing with are the revisions of the definitions of the Boltzmann constant and the Kelvin that are currently in progress at the CIPM, you can read about it here [5], here [6], here [7], and here [8]. I just happen to consider that these matters are important for a Wiki article entitled Boltzmann constant. Have a nice day. --Damorbel (talk) 10:05, 2 November 2011 (UTC)
- Damorbel, those are very interesting references. I think you should add a section to the article summarizing the main points. Personally, I don't see how substituting one primary constant for another helps anything. But I look forward to reading more about this. Q Science (talk) 07:55, 3 November 2011 (UTC)
A particle "in isolation" (an electron flying free in vacuum) has no possible vibrations about its center of mass that could contribute to temperature. Any rotation (spin) cannot change for a fundamental particle like an electron. The same is true of individual atoms, which may have changable rotations, but nothing that would contribute to room temperatures, since the energy level spacing is so high (helium atoms in a baloon may have a temperature as a collection, but each helium atom in its own rest frame does not).
In solids, all systems have zero-point motion, and zero-point energy for reasons having to do with the Heisenberg uncertainty principle and the wave nature of matter-- but they still have this at absolute zero, so this energy does not contribute to temperature, either. I think the point is made. A crystal at absolute zero still has vibrating atoms, but this fact does not affect its temperature, which remains zero. And if seen from a moving frame, it will have a very high velocity and each atom a high kinetic energy, but its temperature is still zero. SBHarris 16:15, 2 November 2011 (UTC)
- Quite right. Temperature is a statistical property of an ensemble that can be understood in various equivalent ways, such as the location of the peak of the Maxwell-Boltzmann distribution, or the derivative of energy with respect to entropy. It is the thermodynamic counterpart of signal-to-noise ratio in a communication channel, more precisely its reciprocal. There is no such thing as the temperature of one particle, which is a meaningless notion. --Vaughan Pratt (talk) 19:24, 2 November 2011 (UTC)
- Vaughan Pratt, you write "Temperature is a statistical property of an ensemble", which is true only if the ensemble is in equilibrium, meaning that the average temperature of the parts making up the ensemble becomes the assigned temperature; you will understand from this that all the particles making up the ensemble do not individually have the same temperature, it is only the average that counts. Or do you disagree? The basis of this is the Maxwell-Boltzmann distribution which is required for a system of particles, exchanging energy by random collision, to be in equilibrium i.e. to have a (single) defined temperature. The particles in the ensemble all have different temperatures but, in the defined conditions, their average is a single temperature - made up from many different (particle) temperatures. --Damorbel (talk) 10:28, 3 November 2011 (UTC)
- As no references are being offered to support this point of view it cannot possibly go in the article and I propose that this discussion now be ended and archived. SpinningSpark 15:38, 3 November 2011 (UTC)
- +++++++++ What do you mean, Spinningspark, "no references are being offered"? What do you think the Maxwell-Boltzmann distribution is? Further you write "to support this point of view " Excuse me, but it is not a point of view, it is the science of Kinetic Theory which was established many, many years before I was born. The Maxwell-Boltzmann distribution article explains about it here [9] or you can go directly to Kinetic Theory. Sorry, I thought you were familiar with this branch of physics. +++++++--Damorbel (talk) 21:31, 3 November 2011 (UTC)
- By references I mean reliable sources and Wikipedia articles are decidedly not that. Please don't be condescending. In any case, neither of those articles states that a single molecule can have a temperature, but it does say "the temperature of the gas does not vary with the gross speed of the gas." SpinningSpark 22:10, 3 November 2011 (UTC)
- Spinningspark, would you care to explain further? I could understand you having problems with a Wiki ref. if you did not agree with it; but you state that 'neither of those articles states that a single molecule can have a temperature'. Surely you recognise that you are unlikely to find that precise form of words; it is much more likely that the 'temperature' bit is expressed mathematically, and such a relationship is not often written down in the words you appear to want. So perhaps you can explain just what it is about the links that doesn't meet your needs. Below I give an expanded explanation of why each particle (also in a Maxwell-Boltzmann distribution) has its own temperature, using a quotation here [10] from this very article:-
- Kinetic theory gives the average pressure P for an ideal gas as P = (1/3)(N/V)m⟨v²⟩ [This is because the gas molecules have a mean square velocity ⟨v²⟩]
- Substituting that the average translational kinetic energy is ½m⟨v²⟩ = (3/2)kBT gives P = NkBT/V [This is the relationship between velocity and temperature for each particle; just as each particle has an individual velocity, each particle has an individual temperature.]
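The two kinetic-theory relations quoted above combine into the ideal gas law P = NkT/V, which can be sanity-checked against standard conditions. A sketch with CODATA and textbook values:

```python
k = 1.380649e-23      # Boltzmann constant, J/K
N_A = 6.02214076e23   # Avogadro constant, 1/mol

# One mole of ideal gas at 273.15 K in the classic molar volume:
T = 273.15            # K
V = 0.022414          # m^3 (22.414 L, approximate molar volume at STP)

# P = (1/3)(N/V) m <v^2> together with (1/2) m <v^2> = (3/2) k T
# reduces to the ideal gas law P = N k T / V:
P = N_A * k * T / V
print(P)   # ~101325 Pa, i.e. one standard atmosphere
```

Note that the average ⟨v²⟩ enters both relations: the derivation only ever involves the mean square velocity of the whole ensemble.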
- First of all, please actually read the link I gave you to reliable sources. Wikipedia articles, whether I agree with them or not, are NEVER, under ANY circumstances considered reliable sources.
- I am astonished that you can say that I am "unlikely to find that precise form of words". Earlier in this discussion I gave you a link to multiple sources that stated precisely and without ambiguity that single molecules cannot have a meaningful temperature.
- After you seemed incapable of actually following the link yourself, JHeald was kind enough to list many of them out on this page with the exact quote and details of the sources. This must have involved him in some effort. This surreal discussion is beginning to take the form of "nah nah, I'm not listening".
- Your own working through of equations above does not constitute a reliable source, so cannot be used as a basis for a change to the article. But in any case I believe that the symbol ⟨v²⟩ represents the mean square velocity of an assembly of particles, not the actual velocity of an individual particle.
- Do you have any kind of RS that directly states that individual molecules can have a temperature or that contradict the multiple sources I provided above that say it can't? If not, continuing the debate is pointless, since even if you were right, it could not be placed in the article. SpinningSpark 19:13, 4 November 2011 (UTC)
- Just to add to what SpinningSpark has written: Damorbel, note the word "average" in what you've written above. As noted in the book quotes, the random average kinetic energy is something very different from the instantaneous kinetic energy of any given particle. The molecules in a Maxwell-Boltzmann distribution all have the same average kinetic energy, so all have the same temperature. They don't have instantaneous different temperatures corresponding to their different instantaneous kinetic energies, because that is not what temperature is. Temperature is to do with average random thermal motions, something quite distinct from the immediate velocity of an object or particle.
- But more fundamentally, please listen to what SpinningSpark has written above -- and recall this statement of principle, from an ArbCom decision just a few months ago: "Article talk pages should not be used by editors as platforms for their personal views on a subject, nor for proposing unpublished solutions, forwarding original ideas, redefining terms, or so forth... Although more general discussion may be permissible in some circumstances, it will not be tolerated when it becomes tendentious, overwhelms the page, impedes productive work, or is otherwise disruptive."
- In that particular case ArbCom as a result went on to impose several permanent topic bans. So take stock when SpinningSpark cautions you before expending any more words on this. You've had several warnings in the past about this on your talk page; and here you have been indulged at length, probably far more than you ever should have been. So far you haven't produced a ghost of what WP would consider a reliable source (WP:RS) that explicitly supports your concerns. There's a real limit to how much argumentation not directly reflecting reliable sources is tolerable before it becomes disruptive and tendentious. The time has come to either put up an WP:RS, or to drop it. Jheald (talk) 21:26, 4 November 2011 (UTC)
- Jheald The matter I am proposing for this article is the changes being introduced by the International Committee for Weights and Measures to the Boltzmann constant. May we have your opinion on this? The discussion you are advancing here is about macroscopic ensembles of particles and is clearly 'off topic'.
- Similarly, when the first line of an article contains technicalities related to another topic (macroscopic ensembles of particles having a temperature), it can be confusing; I refer to this statement "which must necessarily be observed at the collective or bulk level"; surely it would be far better to leave this to the subsections 1 - 5? The important detail about the Boltzmann constant is that it concerns the energy of an individual particle; ensembles, bulk, macroscopic collections of particles, and thermal voltage are all matters where the Boltzmann constant plays an important role, but none of them is central to the concept of the Boltzmann constant, so reference should not be made to just one of them at such an early stage; it is confusing.
And just to blow your mind, Damorbel, let us note that the thermodynamic temperature only speaks about the relationship of the system's energy to entropy differential: T = dE/dS. For most systems with many unpopulated quantum states, this is positive, as anytime you add energy you add entropy. But for some systems you can't add any more energy since all the high-energy states are populated (as in an excited laser system). In these systems entropy is actually low, since all the particles are in the same (high) state. If you subtract energy (the system "lases") then its entropy actually increases, since now you have more particles in different states and the disorder increases. In such systems dE/dS is negative, and thus they actually have a negative temperature. However, as you see, again these concepts make no sense for individual particles, since dE/dS doesn't really make much sense for single particles. SBHarris 16:13, 3 November 2011 (UTC)
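The negative-temperature point above can be sketched for N two-level systems, where S(E) = k ln C(N, n) for n excited particles of energy ε each, and 1/T = dS/dE changes sign at half-filling. A sketch; ε and N here are arbitrary illustrative values:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
eps = 1.0e-21      # energy of one excitation, J (illustrative)
N = 1000           # number of two-level particles (illustrative)

def entropy(n):
    """S = k ln C(N, n): entropy of n excitations among N two-level systems."""
    return k * (math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1))

def temperature(n):
    """T from 1/T = dS/dE, with E = n*eps, via a central difference."""
    dS_dE = (entropy(n + 1) - entropy(n - 1)) / (2.0 * eps)
    return 1.0 / dS_dE

print(temperature(200))   # positive: below half-filling, adding energy adds entropy
print(temperature(800))   # negative: population inverted, dS/dE < 0
```

The sign flip depends on the whole population of states, which is why no such quantity attaches to one particle.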
- Sbharris, you write "since dE/dS doesn't really make much sense for single particles." Which is quite correct, since entropy is a 'multiparticle' concept. If you read the wiki article on entropy, right at the beginning you will see this "Entropy is a thermodynamic property that can be used to determine the energy available for useful work in a thermodynamic process," 'Useful work' means there is a temperature difference in the system; no heat engine does work without a temperature difference. Such a system has entropy less than the maximum (i.e. at least two parts with different temperatures to get some kind of Carnot efficiency). The point of all this is that a single degree of freedom, of which a particle has at least three, does actually have entropy, because S = Q/T exists for a single degree of freedom; but that entropy will always be a maximum. Temperature T also exists since T = Q/S. I am using Q (thermal energy) instead of E (energy in general), because temperature (T) applies only to thermal energy. Energy in general, E, is of course very important; E can be chemical energy, gravitational energy, electrical energy etc., etc., and must not be forgotten, if there is any around!
- BTW, using T = dE/dS is risky: it absolutely does not apply to systems changing state, as with evaporating water. Sure, the energy is changing because the volume is increasing, but the temperature isn't (changing).
- Finally, all this has got almost nothing to do with revisions to the definitions of the Boltzmann constant and the Kelvin by the CIPM! --Damorbel (talk) 21:31, 3 November 2011 (UTC)
- Note that (1/T) = dS/dE is actually perfectly applicable to phase transitions. You put in some energy, the entropy increases (liquid becomes vapour). The ratio between the energy input and the entropy increase remains constant, and corresponds to the reciprocal of the temperature. No problem.
- As to your final line, it was you that titled this section (and the previous several) "First line of the article". If you now want to shift the focus, are you prepared to accept that the first line of the article is in fact correct, so SpinningSpark can archive all of this irrelevance? Jheald (talk) 22:32, 3 November 2011 (UTC)
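Jheald's claim that 1/T = dS/dE holds through a phase transition can be checked with textbook numbers. The figures below (latent heat and entropy of vaporisation of water) are my own approximate values, not from the thread; the ratio should come out near the normal boiling point:

```python
# Rough numerical check (standard textbook values, not from the thread):
# at a phase transition the ratio of energy input to entropy gain is
# constant and equals the absolute temperature, T = dE/dS.
dH_vap = 40660.0   # J/mol, latent heat of vaporisation of water (approx.)
dS_vap = 109.0     # J/(mol K), entropy of vaporisation of water (approx.)

T = dH_vap / dS_vap
print(round(T, 1), "K")  # close to 373 K, the boiling point at 1 atm
```

In other words, the energy goes into entropy (liquid becoming vapour) at a fixed exchange rate, and that exchange rate *is* the temperature, even though the temperature itself never changes during the transition.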
- I'm afraid I don't agree that the first line is correct. The statement "must necessarily be observed at the collective or bulk level." can only be considered relevant in conditions of complete equilibrium. It is this confusion of the Boltzmann constant with higher levels of abstraction, such as the Maxwell-Boltzmann distribution and kinetic theory, that is not relevant to the article; they should be included with a link, because the Boltzmann constant is important to understanding these higher-level concepts. At present the article has other quite incorrect statements, e.g. "since temperature (T) makes sense only in the macroscopic world", which is quite incorrect.--Damorbel (talk) 06:52, 4 November 2011 (UTC)
- Jheald you write above "(1/T) = dS/dE is actually perfectly applicable to phase transitions." In what sense? You add "...put in some energy, the entropy increases ... corresponds to the reciprocal of the temperature". Which is partially correct but you fail to take into account that the 'added energy' does not change the temperature of the liquid - vapour system, it merely changes the ratio of liquid to vapour. This ratio of 'liquid to vapour' includes a change of volume which is where the energy you have added is to be found. In the thermodynamics of phase transitions the entropy is replaced, to avoid confusion, by the enthalpy; so what you should have written is "adding energy during a phase transition increases the enthalpy, not the temperature", or something like that. --Damorbel (talk) 09:57, 5 November 2011 (UTC)
- "It doesn't change the temperature of the liquid - vapour system." So what? The equation is above changes in energy (dE) and changes in entropy (dS), which both happen, not about changes in temperature, which doesn't.
- The enthalpy isn't what's primarily relevant. It's the change in entropy which we're considering as a function of energy added, because that is what the temperature is defined in terms of. Jheald (talk) 11:32, 5 November 2011 (UTC)
Jheald, you write above "(2) Even in that case, E is not the energy associated with a particular degree of freedom at a particular time. E is the average energy -- either an average over time, or an average over all similar such degrees of freedom." Which is what I have been failing to explain! Yours is a very respectable summary of the ergodic hypothesis. A given particle in an ensemble with a given temperature T will also, over time, have a temperature T - this is what happens to a particle of pollen. A particle of pollen is much more massive than a gas or water molecule, so the buffeting it receives from the individual molecules has little individual effect; it is only in the time average that the temperature of the pollen converges to the temperature of the whole ensemble. However, as you pointed out by your reference to the equipartition of energy, the (average) thermal energy of the pollen particles is the same as that of the individual molecules and atoms. Surely it is only a small step to see that the average of all the energies of an ensemble (volume) of molecules corresponds to a temperature T - also part of ergodic theory? --Damorbel (talk) 09:23, 9 November 2011 (UTC)
Maxwell and Boltzmann and the box with fast atoms all at the same velocity
We seem to be going in circles. Perhaps it would break the cycle if we do this step by step. Damorbel raised an excellent question: "What do you think the Maxwell-Boltzmann distribution is?" Assuming it was intended rhetorically, it leads naturally into another. Damorbel, what would you say the units are for the average of the Maxwell-Boltzmann distribution? --Vaughan Pratt (talk) 22:19, 3 November 2011 (UTC)
- Vaughan Pratt, why do you want to start a discussion on the Maxwell-Boltzmann distribution here? --Damorbel (talk) 06:52, 4 November 2011 (UTC)
- Why? In order to illustrate that "temperature" requires a system of many parts, in equilibration with another system at the same temperature, to be a meaningful concept. What determines such a "temperature" is the amount of entropy generated in a system for each bit of energy added or subtracted. Heat will flow in the direction in which the largest entropy change is generated per unit of transferred heat energy, since that process is the one that causes an increase in entropy in the universe, which means it happens spontaneously. That is all that "temperature" means: it defines the direction of heat flow. Heat flow is a meaningless concept when talking about individual atoms or particles, and one cannot even figure out the direction of heat flow if the parts of two reservoirs or objects that are in contact are not in thermodynamic equilibrium within themselves first. Each part that touches must have a "temperature." The heat flow direction defines these temperatures; temperature is a shorthand for figuring out how much potential systems have to transfer heat to some other object, after their energy contents are separately equilibrated.
For example, consider a 22.4 L box with an interior at absolute zero, into which two 10 gram crystals of solid neon (also at absolute zero, or as close to it as you like) have been fired by cannons in opposite directions, at 1100 m/sec relative to the box. As they head toward impact with each other in the center of the box, it is sealed. Question: what is the temperature of this system, according to you? What is the temperature of each neon atom, as seen in the lab frame? What is the entropy of the system?
Suppose the crystals miss each other and bounce between walls of the box, perfectly elastically. If thermal contact could be made between this box and another box of neon of the same size, filled with gas at STP, which direction would heat flow?
Now, suppose the crystals are allowed to break up and fill the box with neon gas as their kinetic energy is distributed randomly, in a Maxwell-Boltzmann distribution. Has the temperature of the system changed? Has the entropy changed? What is the direction of heat flow now between this box and the box holding the same amount of neon at STP?
Note: I suppose all this amounts to much the same as asking what would happen if Maxwell's demon managed to take the mole of atoms in a 22.4 L box of neon at standard temp and pressure, and get them going all at the same velocity, in 3 orthogonal directions, all while conserving momentum and energy. Clearly the entropy changes. Does the temperature change? That's a question for you. Starting with the crystals, I've merely run that thought experiment in the opposite direction, in the direction of spontaneity. SBHarris 19:16, 4 November 2011 (UTC)
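For what it's worth, the box scenario above can be given a rough number once the crystals have equilibrated into a gas. This is my own back-of-envelope sketch, not SBHarris's: it ignores neon's latent heats and heat-capacity details and treats the end state as an ideal monatomic gas, where T = 2E/(3nR).

```python
# Very rough, illustrative estimate (my arithmetic, not from the thread):
# once the two 10 g neon crystals equilibrate into a gas, the directed
# kinetic energy becomes thermal energy, and only then does a bulk
# temperature T = 2E/(3nR) make sense for the box.
M_neon = 20.18e-3   # kg/mol, molar mass of neon
R = 8.314           # J/(mol K)

m_total = 0.020                      # kg, two 10 g crystals
E = 2 * 0.5 * 0.010 * 1100.0**2      # J, total directed kinetic energy
n = m_total / M_neon                 # mol

T = 2 * E / (3 * n * R)  # ideal monatomic gas; latent heats ignored
print(round(T), "K")     # on the order of 1000 K
```

Before equilibration the same energy is present but no temperature exists; afterwards, the identical energy content corresponds to a definite (and quite high) temperature, which is the whole point of the thought experiment.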
- Sbharris, nothing that you write above concerns the Boltzmann constant or its revision by the International Committee for Weights and Measures. --Damorbel (talk) 09:14, 5 November 2011 (UTC)
- Alas, nor does anything you've written, so long as you continue to insist that "temperature" is anything other than a statistical property of an aggregate system, like entropy is. This affects how constants are measured. The Boltzmann constant is always inseparably connected to temperature, because it connects the temperature and energy scales. However, although there are some physical constants that can be measured from properties of single atoms in traps, the Boltzmann constant, since it requires a connection to "temperature," is not one of them. Never has been, and it never will be. Got that? If you think otherwise, get over it. SBHarris 21:09, 5 November 2011 (UTC)
- How then does your argument square with the definition of the Boltzmann constant via "the kinetic energy of a particle", any particle - it doesn't have to be an atom or a molecule - "at a temperature T"? The particle can easily be as big as a pollen grain (for particles many times larger than an atom or a molecule, see Brownian motion and 'the Feynman Lectures on Physics', v1 p41-1 ff); the only requirement is for the particle, whatever its size, to exchange energy with other particles (also of any size) by means of random collisions. I put my often-repeated question to you: 'what size do these particles have to be for you to recognise them as having a temperature, according to the definition of the Boltzmann constant?'
- PS For simplicity I am not considering 'numbers of degrees of freedom per particle' here, a particle's number of degrees of freedom is a separate matter. --Damorbel (talk) 10:44, 6 November 2011 (UTC)
- If a pollen grain is moving at the speed of sound, what is its temperature? My understanding is that temperature is not defined when only the speed is known. As you implied above, temperature is related to collisions, which itself implies a large number of particles. So, it is not the size of the particles that matters, but the number of particles per cubic volume. This leads immediately to an interesting question - What is the temperature of empty space? Q Science (talk) 20:15, 6 November 2011 (UTC)
- "If a pollen ... what is its temperature". The short answer is that it depends on the mass of the particle because you have defined the velocity, it's like this because the energy of any particle is related to temperature by the product of kB and T. In kinetic theory the particles collide with each other and the walls of the vessel confining them, this means their speed and direction are constantly changing and, since they are confined in some sort of container, their net velocity is zero.
- If you check kinetic theory you will see that the kinetic energy ½mv<sup>2</sup> is equal to (3/2)k<sub>B</sub>T; you can see from this that the temperature is independent of the mass m of the particle, but the velocity v is proportional to the square root of the temperature (and so is the speed of sound). There is a better guide to gases and kinetic theory here [11] --Damorbel (talk) 21:38, 6 November 2011 (UTC)
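The relation quoted here is easy to evaluate numerically. A small sketch of my own (the choice of neon and of 300 K is mine): solving (1/2)m v² = (3/2)k<sub>B</sub>T for the root-mean-square speed gives v_rms = sqrt(3 k<sub>B</sub> T / m).

```python
# Quick check of the kinetic-theory relation above:
# (1/2) m <v^2> = (3/2) k_B T, so v_rms = sqrt(3 k_B T / m).
from math import sqrt

k_B = 1.380649e-23   # J/K, Boltzmann constant
N_A = 6.02214e23     # 1/mol, Avogadro constant

m = 20.18e-3 / N_A   # kg, mass of one neon atom
T = 300.0            # K, assumed room-ish temperature

v_rms = sqrt(3 * k_B * T / m)
print(round(v_rms), "m/s")  # several hundred m/s, comparable to the speed of sound
```

This also illustrates the point about the speed of sound: v_rms scales as the square root of T, and the sound speed tracks it.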
- Which is hotter, a one gram hailstone moving at 1,000 m/s or a one gram red-hot cinder moving at 2 m/s ? SpinningSpark 22:37, 6 November 2011 (UTC)
- Spinningspark, the best picture I can paint for you is the armour-piercing discarding sabot. This weapon is fired from a gun when the propellant (at low temperature) is ignited, which turns the propellant and its stored energy into a hot, high-pressure gas. This gas expands in the gun's barrel, doing work on the projectile by accelerating it out of the barrel. 'Doing work on the projectile' cools the gas, because its thermal (random kinetic) energy is changed into the (forward) kinetic energy of the projectile; the projectile remains fairly cool while in the air, but it is carrying a large amount of the (thermal) energy of the gas with it, because it is perhaps 2 kg of uranium moving at between 1 and 2 km/s, thus several megajoules. When the projectile hits its target, the velocity is reduced to zero by friction, and the energy, which originally came from the propellant, is released as heat in a very small area of the target.
- Thus the chemical energy in a cold compound (the propellant) is transferred to a (hot) gas (kinetic energy of the gas particles), then to a (fairly) cool projectile, giving it a lot of forward kinetic energy, and finally to a very hot (because of a lot of thermal (kinetic) energy) target.--Damorbel (talk) 07:20, 7 November 2011 (UTC)
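The arithmetic behind this example, using the thread's own figures of roughly 2 kg at 1-2 km/s (the particular loop values are my choice), is just E = (1/2)mv²:

```python
# Order-of-magnitude check on the sabot example: kinetic energy
# E = (1/2) m v^2 for a ~2 kg penetrator at 1-2 km/s.
m = 2.0  # kg
for v in (1000.0, 2000.0):  # m/s
    E = 0.5 * m * v**2
    print(v, "m/s ->", E / 1e6, "MJ")
# 1 MJ at 1 km/s, 4 MJ at 2 km/s: a few megajoules of directed
# kinetic energy, all dumped as heat into a small spot on impact
```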
- Damorbel, I agree with your description except that the velocity should be the average velocity and not the instantaneous velocity. The instantaneous velocity can be anything from zero to an extremely high value. However, temperature is defined to be related to only the average velocity assuming that the instantaneous velocity follows a Maxwell distribution. Your reference supports this interpretation. Q Science (talk) 23:08, 6 November 2011 (UTC)
- I don't see how you can say that the temperature of a particle can only be related to the average particle energy. Sure, when you are trying to measure temperature with a thermometer what you measure is the average energy of the particles but that is a limitation of the thermometer, not a requirement of the physics. Other ways of measuring temperature (defined as 'energy per particle') are more sensitive, one example being the photoelectric effect where electrons are ejected from material only if the impacting particle has sufficient energy. --Damorbel (talk) 07:20, 7 November 2011 (UTC)
Energy, not temperature. You can't talk about the temperature of an individual photon, either. Energy, yes. Temperature, no. SBHarris 07:28, 7 November 2011 (UTC)
- According to Feynman ('the Feynman Lectures on Physics', v1 p41-1 ff), temperature is 'energy per particle'. He is not the only one, but I venture to say he is a reliable source. Now perhaps you wish to check whether photons are particles; we can follow that path if you like, please say if you so wish. However, on the basis of energy 'per particle', photons also have a temperature, just as they have momentum even though they have no mass! --Damorbel (talk) 09:38, 7 November 2011 (UTC)
- My copy of Feynman is not searchable and I cannot find your quote. However, the text consistently refers to "mean kinetic energy". Q Science (talk) 14:46, 7 November 2011 (UTC)
- On p41-1, l.15 (of the small text) Feynman writes "the mean kinetic energy of a small particle suspended in a liquid or a gas will be 3/2kT even though...". Now the kinetic energy is ½mv<sup>2</sup>. I don't quite see why the mean energy of a large number of atoms should represent the temperature but the instantaneous energy does not represent the instantaneous temperature. Most thermometers are slow to react and need time to measure a temperature, but slow reaction is not a requirement for a temperature to exist; it is not the thermometer that produces the temperature. The possibility exists (and temperature-sensitive molecular interactions are a good example) of a method of determining the temperature of an individual particle. Further, the fact that a large number of molecules seemingly reach a temperature such as a melting point means that they have individually acquired enough energy to break the crystalline bonds holding the solid together; it does not mean that a mole, or some other such quantity, is needed for a melting point to be established. --Damorbel (talk) 16:00, 7 November 2011 (UTC)
- Because A implies B cannot be used to argue that B implies A. Because the mean energy per particle can be calculated from the temperature doesn't imply the energy of a single particle tells you anything about the temperature. As in my example above, you can calculate GDP per capita (mean income) for a country, but no one person has a "mean income." Individual people have individual incomes. Temperature is already a mean quantity, and no individual quantity tells you anything about it, any more than I can take the income of a single person and try to infer a national mean income from it.
Try going back to your idea that individual photons have a temperature. Certainly a collection of photons at different frequencies in a blackbody distribution have a collective temperature, but they are a collection. What about one photon of given energy? What about a monochromatic beam of photons at a single energy?
This is not an academic exercise. Your microwave oven heats things that are already so hot they are emitting in the infrared (at about 0.001 cm). But the photons from the oven have a wavelength of several cm. So why does the energy go from oven to food, and not the other way around? Could it be because you're not talking about two things of different temperature at all? The food has a temperature, but what about the microwave output? Remember, the photons from the Big Bang have a wavelength of about 0.1 cm, so your microwave oven photons are "colder" than that, if you insist that their temperature is an individual thing, and even when there are great numbers, it doesn't require a distribution. So by your theory, they shouldn't be able to heat anything much above 0.2 Kelvin or so. My oatmeal gets considerably hotter. In fact, you can put a neon glow tube into a microwave (with a cup of water next to it for a load) and see the neon plasma ionize at thousands of degrees K. What's going on, pray tell? Something wrong with your theory?
A beam of monochromatic microwaves is a lot like a beam of atoms all moving in one direction at one velocity. You think it has a "temperature" you can calculate as energy/particle. I'm here to tell you that since it has an entropy/energy as low as you like (depending on how monochromatic it is) it can induce temperatures as high as you like. Think about that next time you see plasma sparks at very high temps, from metal points inside your microwave oven. SBHarris 17:29, 7 November 2011 (UTC)
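The wavelengths traded in this exchange can be turned into nominal temperatures via Wien's displacement law, T = b/λ. This is my own sanity check, and it supports SBHarris's point only in a limited sense: the law gives the temperature of the blackbody whose spectrum *peaks* at each wavelength, whereas a monochromatic beam, as argued above, has no such temperature at all. The 12.2 cm oven wavelength (2.45 GHz) is my assumption.

```python
# Nominal blackbody temperatures from Wien's displacement law,
# T = b / lambda_peak. Only meaningful for a full thermal spectrum;
# a monochromatic microwave beam does NOT actually have this temperature.
b = 2.898e-3  # m K, Wien displacement constant

for label, lam in [("food infrared peak", 1e-5),     # ~0.001 cm, as quoted
                   ("CMB peak",           1e-3),     # ~0.1 cm, as quoted
                   ("oven microwave",     0.122)]:   # ~12 cm at 2.45 GHz (my figure)
    print(label, round(b / lam, 2), "K")
```

The numbers make the paradox vivid: food radiating near 290 K, the CMB near 3 K, and the oven's wavelength corresponding to a fraction of a kelvin if one (wrongly) assigned it a blackbody temperature. Yet the oatmeal gets hot, because the beam's low entropy, not any "temperature", is what matters.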
- Sbharris, when your neon tube is put in the microwave it is the electric field generated by a very large number of electrons oscillating in a magnetron oscillator that ionises the gas, just as it can make sparks (sparks are a high-pressure version of a glow discharge).
- Further you write: "A beam of monochromatic microwaves is a lot like a beam of atoms all moving in one direction at one velocity. You think it has a "temperature" you can calculate as energy/particle. I'm here to tell you that since it has an entropy/energy as low as you like (depending on how monochromatic it is) it can induce temperatures as high as you like." ---- I'm not sure about the microwave beam being equivalent to a beam of particles - particles do not travel at the speed of light - but for the rest you are correct. A large collection of particles that does not have a Maxwell-Boltzmann distribution of energy (it is mono-energetic) is a system in severe disequilibrium. The energy that the particles of your beam have will appear somewhere in a Maxwell-Boltzmann distribution, a distribution that will have a peak at a wavelength corresponding to some temperature; this is the maximum temperature your beam can heat anything to.--Damorbel (talk) 13:22, 8 November 2011 (UTC)
- Damorbel, I found the second quote in my copy of Feynman. Thanks. It is my understanding that temperature is defined as a bulk property of matter and is not defined for individual atoms. Think of this as just one of the conventions of language. As far as I can tell, most of the editors here, as well as Feynman, are using this convention. Q Science (talk) 20:41, 7 November 2011 (UTC)
- Q Science, your argument leaves me baffled! The formula for the energy of a particle with three degrees of freedom is E = (3/2)k<sub>B</sub>T, where T is temperature; by rearranging, T = (2/3)E/k<sub>B</sub>, so the temperature T of the particle is explicit - it isn't the temperature of any other particle, how can it be? The basic principle of kinetic theory is that the particles making up the ensemble are independent except for random collisions. In a given sample the particles have a range of energies; if the energy per particle is expressed as ½mv<sup>2</sup> or (3/2)k<sub>B</sub>T, what is the difference? The collision process is always going to produce a wide range of particle energies; calling it 'kinetic energy' or 'particle temperature' is identical.--Damorbel (talk) 13:01, 8 November 2011 (UTC)
- The difference is that you can't seem to understand the difference between the energy of a single particle, E, and the average (mean) energy per particle of a collection of particles, <E>. You're told about the last, and you assume the first. But they are not the same. Feynman says in vol. 1, chapter 41: "the mean kinetic energy of a small particle suspended in a liquid or a gas will be 3/2kT even though...". Do you see the word mean? He is talking about the average energy per particle, not the energy of each particular particle (since obviously every particle doesn't have the average energy). You write "The formula for the energy of a particle with three degrees of freedom is E = (3/2)k<sub>B</sub>T, where T is temperature," but that is incorrect, and is not the one Feynman gives in chapter 39 on gas kinetic theory. That E is the average energy per particle, not the energy of a particle (any given particle). The formula you give is wrong as you define the variables. E there cannot be the energy of a single particle. Rather, E there in that formula is <E>, the mean/average E. No text you have says anything else, yet you refuse to believe your texts. You refuse to believe Feynman, even though you quote him! Why don't you look at chapter 39 instead of chapter 41, where you will find that every time Feynman talks about the energy of gas molecules in the context of temperature, he uses <E>, the mean energy. And he defines temperature in terms of the mean energy. That's how temperature is defined. Once it is defined, any energy associated with temperature must be a mean energy. Several people have told you that above, and it's not penetrating!
I'm disappointed in you about the microwaves. Yes, photons are particles (see photoelectric effect and Compton effect). You can have one photon. You can't switch to talking about the electric field of an EM wave when it suits you, but talk about individual particles of light (as you do above-- you brought the subject up) when you think that will serve your argument. Nature must do the same thing no matter which way you chose to view it/her.
Yes, you're quite right that the maximum temperature a beam can heat anything to is the slice out of the Stefan-Boltzmann power law it represents for a black body, but that depends on the narrowness of frequency, and the beam's maldistribution of frequencies away from a black-body spectrum (which is the analogue of the Maxwell-Boltzmann distribution for boson photons). But this does not fit your definition of temperature. First, your definition clearly doesn't work for a single photon. Second, it allows me to adjust the "heating power" of a beam of a given energy simply by making it more monochromatic, but keeping its power the same. If "temperature" is simply energy per particle in any circumstance (or an average of each particle energy for any particle collection), I shouldn't be able to do that, since the number of particles and the energy are not changed by this, yet the heating power of the beam (what it can do to the temperatures of objects) is.
Finally, even if you don't accept the quantum theory of EM radiation, you can replace a microwave beam with a beam of atoms all moving at a velocity which has a very narrow spectrum (that is, they are all going at nearly the same velocity, as closely as physically possible). I did that in my previous examples and you missed the point. The smaller you get such a beam's velocity spread, the hotter that beam can heat another collection of gas you aim it at. Do you understand this point? If I shoot a beam of atoms at a bottle of gas, even if the beam atoms have far less kinetic energy than the average energy of the molecules in the gas in the bottle, I will heat the bottle and the gas, so long as the power of the atom beam exceeds the power of that velocity-spread channel in the Maxwell-Boltzmann distribution that I aim it at (and assuming that the incoming gas molecules can bounce off the bottle, so they can transfer some energy without having to stay behind and be equilibrated). If the atoms in an object are constrained (as in solid objects), a bombardment by atoms of any temperature - even close to absolute zero - will heat the object, so long as the density of the beam is high enough. That's more or less what you do when you throw a baseball at a wall! The baseball can be cooler than the wall (in terms of what you measure when you stick a thermometer in both), and can even be colder when you count its kinetic energy, and yet it will still heat the wall when it strikes, and you can keep that up till the wall is very hot. The point is that you can arrange this with a baseball so the energy transfer is only in one direction, because the atoms in the baseball are only moving in one direction.
You could go farther and hit a paddle wheel with slow baseballs outside a container of gas, and this wheel could turn a crank connected to a fan inside the box that heats the gas by friction. This works no matter what the speed of the baseball, since the wheel always turns in only one direction, even with a slow, cold ball. You could do the same with a gas beam, turning a paddle wheel outside the box connected to a fan inside. You begin to see the role entropy plays in energy transfer, since none of this would work if the gas outside didn't have not only energy per particle, but DIRECTED energy per particle (low entropy per unit energy).
The gas certainly has a temperature, but what about the beam? (or the cold baseball?). If you're going to insist the beam does also, and insist that it is defined as the mean kinetic energy of the beam atoms (even though it is maldistributed) then you must say this is a situation in which heat is transferred from cold to hot, spontaneously. It's exactly the same situation as a microwave beam or laser directed into a hole in a black body radiator. Thus, we either need to give up the second law of thermodynamics (heat goes from hot to cold, those being defined in terms of temperature), or else we must give up your (private) definition of "temperature." Which is it to be? SBHarris 16:14, 8 November 2011 (UTC)
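The <E> versus E distinction argued above can be seen in a small simulation of my own construction: in thermal equilibrium each velocity component is Gaussian with variance k<sub>B</sub>T/m (the Maxwell-Boltzmann result), so individual kinetic energies scatter widely while their mean settles on (3/2)k<sub>B</sub>T.

```python
# Small simulation (my own, illustrating the thread's <E> vs E point):
# individual particle energies vary enormously, but their MEAN tends
# to (3/2) k_B T. Each velocity component is drawn as a Gaussian with
# standard deviation sqrt(k_B T / m), per Maxwell-Boltzmann.
import random
from math import sqrt

random.seed(1)
k_B = 1.380649e-23          # J/K
m = 3.35e-26                # kg, roughly one neon atom
T = 300.0                   # K, assumed temperature
sigma = sqrt(k_B * T / m)   # std dev of each velocity component

N = 200_000
energies = []
for _ in range(N):
    vx, vy, vz = (random.gauss(0.0, sigma) for _ in range(3))
    energies.append(0.5 * m * (vx * vx + vy * vy + vz * vz))

mean_E = sum(energies) / N
print(mean_E / (1.5 * k_B * T))  # ratio close to 1: <E> = (3/2) k_B T
```

Any single entry of `energies` can be far from (3/2)k<sub>B</sub>T, which is exactly why reading a "temperature" off one particle's energy fails while the ensemble mean defines it.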
- The difference is that you can't seem to understand the difference between the energy of a single particle E, and the average (mean) energy of particle in of a collection of particles <E>. You're told about the last, and you assume the first. But they are not the same. Feynman says in vol. 1 chapter 41 "the mean kinetic energy of a small particle suspended in a liquid or a gas will be 3/2kT even though...". Do you see the word mean? He is talking about average energy per particle not the energy of each particular particle (since obviously every particle doesn't have the average energy). You write "The formula for the energy of a particle with three degrees of freedom is E =3/2kBT where T is temperature," but that is incorrect, and is not the one Feynman gives in chapter 39 on gas kinetic theory. That E is the average energy per particle, not the energy of a particle (any given particle). The formula you give is wrong as you define the variables. E there canoot be the energy of a single particle. Rather, E there in that formula is <E>, the mean/average E. No text you have says anything else, yet you refuse to believe your texts. You refuse to believe Feynman, even though you quote him! Why don't you look at chapter 39 instead of chapter 41, where you will find that every time Feynman talks about energy of gas molecules in the context of temperature, he uses <E>, the mean energy. And he defines temperature in terms of the mean energy. That's how temperature is defined. Once it is defined, any energy associated with temperature must be a mean energy. Several people have told you that above, and it's not penetrating!
- Q Science your argument leaves me baffled! The formula for the energy of a particle with three degrees of freedom is E = (3/2)kBT where T is temperature; by rearranging, T = (2/3)E/kB, so the temperature T of the particle is explicit, it isn't the temperature of any other particle, how can it be? The basic principle of Kinetic theory is that the particles making up the ensemble are independent except for random collisions. In a given sample the particles have a range of energies; if the energy per particle is expressed as (1/2)mv² or (3/2)kBT, what is the difference? The collision process is always going to produce a wide range of particle energies, calling it 'kinetic energy' or 'particle temperature' is identical.--Damorbel (talk) 13:01, 8 November 2011 (UTC)
- User Sbharris you wrote "Do you see the word mean?" My response [12] to Heald makes it quite clear that, over time, all particles in an ensemble will have the same average temperature; also at any instant the sum of the energies (which energies have the Maxwell-Boltzmann distribution) of all the particles divided by N gives the ensemble temperature. BTW you write "That E is the average energy per particle, not the energy of a particle." Do you mean by this that "the energy of a particle" is not an allowable concept? That a photon cannot be considered as a particle with an energy E = hν? That the mechanical energy of (massive) particles cannot be E = (3/2)kBT?
- Further you write "You can't switch to talking about the electric field of an EM wave when it suits you". Only when it is appropriate. For a microwave oven the photon energy is about 10⁻⁵ that of infrared (IR) photons. So to get any useful transfer of energy at microwave frequencies the Magnetron has perhaps 10²⁰ electrons oscillating in its cavities; counting the photons is possible, but the old-fashioned way of measuring the electric and magnetic fields (or electromagnetic field) due to the electrons and their flow is much more familiar and has few if any disadvantages.
- Finally I have not yet had an answer to the question I posed about ensembles of particles not in equilibrium, thus not having a Maxwell-Boltzmann distribution. What is the temperature of such an ensemble? What is the meaning of the average energy of the particles in such an ensemble? --Damorbel (talk) 14:11, 9 November 2011 (UTC)
- And another final question that has not been answered. If an atom or a molecule is too small to have a temperature according to the definition of the Boltzmann constant, what then is the smallest particle that can have a temperature? --Damorbel (talk) 18:04, 9 November 2011 (UTC)
- It isn't that an atom is too small, the problem is the number of atoms per unit volume. Given a container, if the probability of an atom hitting another atom is "much larger" than the probability of hitting the wall of the container, then there are enough atoms. And no, I don't have a value for "much larger". BTW, this is why I think there is a problem providing a temperature for the thermosphere (more than 1,500 °C), there aren't enough molecules per unit volume. For instance, "above 160 kilometers ... the density is so low that molecular interactions are too infrequent to permit the transmission of sound". Q Science (talk) 07:44, 10 November 2011 (UTC)
- Q Science, you write "the problem is the number of atoms per unit volume" Yes I agree completely. For the Maxwell-Boltzmann distribution to apply the particles must be exchanging energy in a random way, if their density is so low that they are exchanging momentum with the container walls more often than with each other this is not the case. You point out that "above 160 kilometers ... the density is so low that molecular interactions are too infrequent to permit the transmission of sound", too true! The transmission of sound requires the exchange of momentum by the particles of gas, that is why the speed of sound is a function of the square root of the gas temperature T, but only if the particles are able to exchange (most of their) energy (momentum) by collision, no collisions = no sound - at all! --Damorbel (talk) 13:11, 10 November 2011 (UTC)
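The collision-rate argument in the two comments above can be put in numbers with the kinetic-theory mean free path. A Python sketch, with an assumed molecular diameter and rough (assumed) thermosphere conditions:

```python
import math

kB = 1.380649e-23   # Boltzmann constant, J/K
d = 3.7e-10         # effective collision diameter of N2, m (assumed value)

def mean_free_path(T, p):
    """Kinetic-theory mean free path: kB*T / (sqrt(2) * pi * d**2 * p)."""
    return kB * T / (math.sqrt(2) * math.pi * d ** 2 * p)

lam_sea = mean_free_path(300.0, 101325.0)  # sea-level conditions
lam_thermo = mean_free_path(900.0, 3e-4)   # rough thermosphere values (assumed)
print(lam_sea)      # ~7e-8 m: a collision every ~70 nm
print(lam_thermo)   # tens of metres: molecular collisions become rare
```

With collision distances vastly exceeding any container (or sound wavelength), the "many collisions" assumption behind the Maxwell-Boltzmann distribution fails, which is the point being made about the thermosphere.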
Hello everyone. Just to let you know, I have made a minor edit and replaced the reference "Boltzmann's constant" with "the Boltzmann constant", merely to ensure consistency throughout the article. However, I did not touch the quotation from Planck's Nobel Lecture - which proved to be accurate, nor the occurrence in the title of item #7 in the "References" list - which I cannot verify. --NicAdi 14:35, 29 December 2011 (UTC)
Archived
- I have archived the hugely long discussion on this page. Since no one is offering sourced proposals for improving this article it really does not belong here per WP:TALK. Please do not reopen it. SpinningSpark 19:04, 7 November 2011 (UTC)
- Please do not act unilaterally without seeking consensus, as (last I checked) nobody died and made you king. Per WP:BRD, other comments are needed.
Personally, I'm learning a lot from this discussion about how to present this difficult subject to readers who don't understand it well. SBHarris 19:26, 7 November 2011 (UTC)
- Spinningspark, I don't agree with your archiving. You complain about the lack of sources; there are plenty of sources, too many is not necessarily helpful. You don't comment on the sources given, how do you maintain this position? --Damorbel (talk) 19:49, 7 November 2011 (UTC)
- Wikipedia is not a forum. It's not a place to learn "how to present this difficult subject to readers who don't understand it well", valuable though that may be. Damorbel has had long enough on this (policy would say far too long), and so far hasn't put up a single source that explicitly supports his contention.
- Let's give Damorbel a chance to say whether he accepts SBHarris's last point about the microwave, so we can put this whole thing to bed with consensus achieved. That would be the best outcome. But if Damorbel doesn't accept it, I say let's call the discussion dead there, and archive it on that basis. Jheald (talk) 20:18, 7 November 2011 (UTC)
- My problem with the "archive" is that it blanked the history page. On other pages, that does not happen. Please fix it. Q Science (talk) 20:45, 7 November 2011 (UTC)
- Yeah, archiving should be by copy-and-paste, not redirect. We're going to need a history-merging specialist to patch this up. Jheald (talk) 20:56, 7 November 2011 (UTC)
- Worse than redirect-- this was an "archive" by page-move and redirect, which effectively moved the page's revision history [13] to the archive page's history, rather than left it at the main article talk page revision history, where it belongs. So per WP:MOVE it's going to take an admin to fix it straightforwardly. The real problem is that a new page for Talk:Boltzmann constant was created, and thus its page-change history along with it. Now it can't be written-over by an ordinary editor. What a FUBAR. SBHarris 23:35, 7 November 2011 (UTC)
{{Histmerge|Talk:Boltzmann constant/Archive 1}}{{Histmerge|Talk:Boltzmann constant/Archive 2}}- I added the two Histmerge tags and a comment at Wikipedia:Cut_and_paste_move_repair_holding_pen Q Science (talk) 06:52, 8 November 2011 (UTC)
- I have now re-integrated Talk:Boltzmann constant/Archive 2 with Talk:Boltzmann constant, including histmerging. Talk:Boltzmann constant/Archive 2 is now empty. Talk:Boltzmann constant/Archive 1 is all old discussions, so I have left it alone. (I always archive my user talk by moving it to the archive name and then cut-and-pasting the header matter back (plus any current discussions): then the edit history stays with the text that it refers to, and the edit history does not get excessively long.) Anthony Appleyard (talk) 10:22, 8 November 2011 (UTC)
- I have now archived 4 old discussions back to Talk:Boltzmann constant/Archive 2. Anthony Appleyard (talk) 10:35, 8 November 2011 (UTC)
Latest Revision
I have removed the text:-
- , which must necessarily be observed at the collective or bulk level
Which is quite wrong. There is no need to 'observe temperature at the bulk level'.
Temperature is an intensive quantity which means it is independent of the amount ('number of particles) present.
Surely it takes only a little imagination to realise that temperature is the measure of the energy of a particle, and the average energy of 'N' particles when there are a number (N) of particles in the system? (The system must have only one temperature, i.e. it must be in equilibrium, of course.)
As for observe, what is that supposed to mean? In quantum systems particles with insufficient energy are unable to initiate reactions; check Einstein's Nobel-prize-winning photoemission paper of 1905. In this paper he showed how only photons with sufficient energy (= high enough temperature) were able to displace electrons. --Damorbel (talk) 15:48, 5 December 2012 (UTC)
Latest Revision (2)
The argument for the revision above also applies to all relationships in the article between microscopic (particle level) effects and macroscopic (or bulk) systems.
Macroscopic systems have an undefined number of particles; a Mole (unit) is an example with NA (the Avogadro number) of particles, but a number 10⁶ smaller changes nothing.
Having a large number of particles in a system places an additional requirement on the (Boltzmann constant) relationship between particle energy and temperature. To establish a measurable temperature for such a system, the average energy of the particles must be related to the temperature by the Boltzmann constant.
I have removed or modified a number of texts in the article :-
- 1/ The Boltzmann constant, k, is a bridge between macroscopic and microscopic physics, since temperature (T) makes sense only in the macroscopic world, while the quantity kT gives a quantity of energy which is on the order of the average energy of a given atom in a substance with a temperature T.
It is not "a bridge.... etc." There are various requirements for systems with > 1 particle; one is where the particles are able to exchange energy in a random way, this is the basis of thermal equilibrium, and the Maxwell Boltzmann distribution describes the energy distribution of such a system. Another multi-particle system with a defined temperature is one where all the particles have the same energy; this is unusual because it is not an equilibrium condition, but it can arise at low densities where the particles seldom exchange energy through collisions.
This statement:-
- while the quantity kT gives a quantity of energy which is on the order of the average energy of a given atom in a substance with a temperature T.
is wildly untrue!
In particle physics it is the molecule, not the atom, that is the relevant unit. Polyatomic molecules such as carbon dioxide have a much higher heat capacity than monatomics such as helium; diatomics (H2, O2) are in between. The article Degrees of freedom (physics and chemistry) helps explain the reasons why.
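For reference, the classical equipartition counting behind these heat-capacity differences can be tabulated directly. A small sketch (the degree-of-freedom counts are illustrative and ignore the quantum freeze-out discussed elsewhere on this page):

```python
kB = 1.380649e-23  # Boltzmann constant, J/K

# Classical equipartition: each quadratic degree of freedom contributes kB/2
# to the constant-volume heat capacity per molecule.
degrees_of_freedom = {
    "He (monatomic)": 3,    # 3 translations only
    "O2 (diatomic)": 5,     # + 2 rotations active at room temperature
    "CO2 (polyatomic)": 7,  # illustrative count including bending modes
}

for gas, n in degrees_of_freedom.items():
    cv = n * kB / 2         # heat capacity per molecule, J/K
    print(gas, cv / kB)     # in units of kB: 1.5, 2.5, 3.5
```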
Please, gentlemen, argue and discuss before inserting such statements!
Therefore the section "Bridge from macroscopic to microscopic physics" contributes nothing to the Boltzmann constant article and will be deleted. --Damorbel (talk) 07:38, 6 December 2012 (UTC)
"Role in the equipartition of energy" ?
The Boltzmann constant will shortly be used to redefine the Kelvin, one of the seven base units in the International System of Units (SI), because it can be determined more accurately than the triple point of water (see http://iopscience.iop.org/0026-1394/48/3/008/pdf/0026-1394_48_3_008.pdf)
Currently the Boltzmann constant can be determined to 6 places (kB = 1.380 651(17) × 10⁻²³ J K⁻¹) by measuring Johnson noise. There is no way the Boltzmann constant can be regarded as:-
- the thermal energy carried by each microscopic "degree of freedom" in the system is on the order of magnitude of kT/2
There are a number of notable features here: the constant is measured using electron, not gas, temperature, so the article, in limiting itself to atoms of gas, is far too restrictive.
The Boltzmann constant is applicable to all particles, even grains of pollen, as observed by Einstein in his 1905 paper on Brownian motion entitled :-
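The Johnson-noise determination mentioned above rests on the Nyquist relation V_rms = sqrt(4 kB T R Δf). A quick numerical sketch, with assumed resistor and bandwidth values:

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def johnson_noise_vrms(T, R, bandwidth):
    """RMS Johnson-Nyquist noise voltage across a resistor:
    V_rms = sqrt(4 * kB * T * R * delta_f)."""
    return math.sqrt(4 * kB * T * R * bandwidth)

# A 10 kOhm resistor at 300 K measured over a 10 kHz bandwidth (assumed values):
v = johnson_noise_vrms(300.0, 10e3, 10e3)
print(v)   # ~1.3e-6 V, i.e. about a microvolt
```

Measuring such microvolt-level noise voltages precisely, with T and R known independently, is what lets kB be extracted without any gas at all.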
Role in the equipartition of energy (2)
The section Role in the equipartition of energy has no content about the equipartition of energy. I have inserted a link. --Damorbel (talk) 08:58, 6 December 2012 (UTC)
- I have changed the link into a {{main}} template.
- Since you're aware of the article on Equipartition of energy, which is a good thing, can I direct your attention to the section Equipartition of energy#Failure_due_to_quantum_effects.
- It's the section which starts
The law of equipartition breaks down when the thermal energy kBT is significantly smaller than the spacing between energy levels. Equipartition no longer holds because it is a poor approximation to assume that the energy levels form a smooth continuum, which is required in the derivations of the equipartition theorem above.
- This is why it is a mistake to think of temperature as the average energy per degree of freedom. In quantum systems, where the energy levels no longer form a smooth continuum, that ceases to be true, and one needs to move to a more sophisticated idea of temperature.
- In turn, that is why it is important to say that "the thermal energy carried by each microscopic "degree of freedom" in the system is on the order of magnitude of kT/2", rather than saying it is equal to kT/2 -- because
- (1) in quantum systems thermal activity in those degrees of freedom can get "frozen out", depending on the density of states; and
- (2) even in classical systems (like your Brownian motion system), it is only as an average over all the molecules in the system that the thermal energy equals kT/2 -- the thermal energy of a particular individual degree of freedom will have a probability distribution, so at any particular time will only be on the order of kT/2.
- Jheald (talk) 13:57, 6 December 2012 (UTC)
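Point (2) above is easy to check numerically. A Python sketch sampling the thermal energy of a single quadratic degree of freedom at an assumed 300 K:

```python
import numpy as np

kB, T = 1.380649e-23, 300.0   # Boltzmann constant (J/K); assumed temperature
rng = np.random.default_rng(1)

# Thermal energy in ONE quadratic degree of freedom, e.g. (1/2) m vx^2:
# vx is Gaussian, so E1 = (kB*T/2) * Z^2 with Z a standard normal variable.
E1 = (kB * T / 2) * rng.standard_normal(1_000_000) ** 2

ratio = E1.mean() / (kB * T / 2)
frac_low = (E1 < 0.1 * kB * T / 2).mean()
print(ratio)      # ~1: kT/2 holds only as an average
print(frac_low)   # ~0.25: a quarter of samples sit below a tenth of kT/2
```

So at any instant a particular degree of freedom carries an energy that is only on the order of kT/2, exactly as stated above.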
- I have also reverted your substantial deletion to the section "Bridge from macroscopic to microscopic physics"
- The key point the section is making is that PV=nRT is an equation that is entirely about moles of gas -- the quantities involved can all be measured at an entirely macroscopic level. But changing from nR to Nk moves that frame of reference to thinking about individual molecules of gas. It opens a doorway to a microscopic view of thermal physics, where one then looks to relate pressure to the kinetic theory of molecules, and temperature to the properties of ensembles of molecules.
- I think that is a valuable proposition, and one worth keeping in the article. Jheald (talk) 14:07, 6 December 2012 (UTC)
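The nR-to-Nk change of viewpoint described above amounts to one line of algebra; a sketch assuming textbook molar values:

```python
kB = 1.380649e-23    # Boltzmann constant, J/K
NA = 6.02214076e23   # Avogadro constant, 1/mol
R = kB * NA          # molar gas constant, J/(mol K)

# One mole of ideal gas at 273.15 K in 22.414 litres (assumed textbook values):
n, T, V = 1.0, 273.15, 0.022414
N = n * NA                 # number of molecules

p_macro = n * R * T / V    # PV = nRT: purely macroscopic quantities
p_micro = N * kB * T / V   # PV = N kB T: the same law, counted per molecule
print(p_macro, p_micro)    # identical, ~101325 Pa (1 atm)
```

The two pressures agree exactly; only the bookkeeping changes from moles to molecules, which is the "bridge" the article section describes.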
- I'm sorry you chose to undo my contribution before this discussion. You write:-
- This is why it is a mistake to think of temperature as the average energy per degree of freedom
- on the reasoning that, because of "Failure due to quantum effects" :-
- Equipartition no longer holds because it is a poor approximation to assume that the energy levels form a smooth continuum
- In thermal systems where the molecules have sufficient energy to excite quantum transitions, the energy in the quantum states is 'in quanta', i.e. fixed lumps in common parlance. Being fixed, these quanta are not part of the kinetic energy of the particles; for this reason kinetic (or thermal) physics excludes energy stored in quantum levels. The same is seen in other, non-kinetic (i.e. potential) energy forms such as chemical bonds and intermolecular forces such as van der Waals forces. These arguments are fundamental to quantum theory and are the basis of Planck's law of radiation.
- The equipartition of energy article also refers to other cases where it 'breaks down'. It has, for coupled oscillators :-
- equipartition often breaks down for such systems, because there is no exchange of energy between the normal modes. In an extreme situation, the modes are independent and so their energies are independently conserved
- It should, of course, be obvious that systems that cannot exchange energy will not partition energy equally. Again, the definition of thermal systems requires that the system particles freely exchange energy according to the Maxwell Boltzmann distribution; quantum interactions are not, by definition, free exchanges of energy.
- Further, about the "Bridge from macroscopic to microscopic physics":-
- The Boltzmann constant is relevant to all sorts of macroscopic physics and this should be mentioned in the article, but this constant is a property of a particle, any particle, that can exchange mechanical energy (difficult for neutrinos!) But atoms, molecules, electrons, protons, etc. all qualify as particles. Electrons are really interesting in this respect because, in semiconductors at working temperature, the band gap of a doped semiconductor is near the thermal energy of an electron (see carrier generation and recombination), and the working temperature influences the conduction process much more than in (metal) conductors. All of these matters are a consequence of the Boltzmann relation and should be mentioned in the article, but they are of secondary relevance. Matters of primary interest in this article are the value of the constant and how it is measured - such information is entirely relevant yet doesn't appear in the article.
- Now I don't want to indulge in an edit war so I'm not going to restore your 'undo' but I do think you should consider doing it yourself, if you are convinced by the arguments I have presented (for the second time).
- So "quanta are not part of the kinetic energy of the particles" ?
- That's interesting. Have you considered applying that statement to diatomic molecules in a gas at room temperature? Most people would consider the energy of the molecules stretching and unstretching (vibration) or tumbling (rotation perpendicular to their axis) as exactly what they mean as part of the kinetic energy of the molecules.
[[File:DiatomicSpecHeat2.png|thumb|280px]]
[[File:DiatomicSpecHeat1.png|thumb|280px]]
- But if you look at the real data, in terms of the heat capacity per molecule, this is exactly where you don't see equipartition -- you don't see a neat heat capacity of (1/2)k per degree of freedom. Instead you see much messier data, with the molecules only reaching a heat capacity of (7/2)k at high temperatures -- and if you look at the halogens, you can see them even slightly overshoot (7/2)k.
- You write: "Being fixed these quanta are not part of the kinetic energy of the particles; for this reason kinetic (or thermal) physics excludes energy stored in quantum levels."
- But thermal physics precisely does not exclude these rotational and vibrational energies: they are vital to include to accurately account for the heat capacity.
- Again, the definition of thermal systems requires that the system particles freely exchange energy according to the Maxwell Boltzmann distribution; quantum interactions are not, by definition, free exchanges of energy.
- Thermal systems need to be able to exchange energy, but not necessarily according to a Maxwell Boltzmann distribution. The distribution you actually get will be a combination of the Boltzmann distribution and the density of states, which might not be either quadratic or continuous.
- Diatomic gas molecules very much are thermal systems. Their rotation energies and (at higher temperatures) vibration energies are thermalised, and are readily exchanged. The issue is, they have a ladder of possible energies rather than a continuum of possible energies, so exchange of energy between them knocks them up and down this ladder. And this means their heat capacity departs from k/2 per degree of freedom. But the heat capacity is not zero -- there is a distribution of occupation levels on the energy ladders, so you can't say that these degrees of freedom are not thermalised.
- Finally you write: "The Boltzmann constant ... is a property of a particle, any particle, that can exchange mechanical energy"
- It isn't. The Boltzmann constant is a property of the system of units used to measure entropy. It sets the mapping between bits and Joules per Kelvin.
- A discussion of the history of how this constant of proportionality has been determined might be useful, from R/NA, through various quantum-mechanical measurements, to today when it is now proposed to be set by fiat like the speed of light. But the bottom line is that the Boltzmann constant is essentially to do with how we define a Kelvin, rather than being a property of a particle. Jheald (talk) 12:59, 7 December 2012 (UTC)
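The point above, that k is essentially a units conversion for entropy, can be illustrated numerically; a small sketch using the standard values of the constants:

```python
import math

kB = 1.380649e-23   # Boltzmann constant, J/K

# One bit of entropy expressed in conventional thermodynamic units:
S_bit = kB * math.log(2)   # J/K
print(S_bit)               # ~9.57e-24 J/K per bit

# Historically k was obtained as R / NA (molar gas constant over Avogadro):
R, NA = 8.314462618, 6.02214076e23
print(R / NA)              # ~1.380649e-23 J/K, consistent with kB above
```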
Jheald, you comment upon my assertion that:-
- "quanta are not part of the kinetic energy of the particles".
And then you write:-
- But thermal physics precisely does not exclude these rotational and vibrational energies, they are vital to include to accurately account for the heat capacity
What you refer to as these rotational and vibrational energies are not quantum states of the particles, and I certainly do not exclude them from the particle energy contributing to heat capacity etc., etc.; of course they do! All I am saying is that the energy locked in particle quantum states is not part of the kinetic energy (1/2)mv² that is exchanged during thermal collisions.
Again you write:-
- Thermal systems need to be able to exchange energy, but not necessarily according to a Maxwell Boltzmann distribution.
Who ever said they did? The Maxwell Boltzmann distribution only applies in equilibrium conditions, i.e. when the particles exchange energy freely and with equal probability.
You write :-
- Diatomic gas molecules ..... Their rotation energies and (at higher temperatures) vibration energies are thermalised,...
Yes, because at low temperatures the vibration energies are too small to overcome the diatomic binding forces, thus there is little or no energy in the (elastic) binding at low temperatures.
Further, you write:-
- The issue is, they have a ladder of possible energies rather than a continuum of possible energies, so exchange of energy between them knocks them up and down this ladder. And this means their heat capacity departs from k/2 per degree of freedom.
- Not true. When energy is locked in a quantum state it only has quantum degrees of freedom, which will not be activated by quantum events with lower energy; that is why Einstein got the 1922 Nobel prize. The energy in the inner quantum states of atoms is relatively high; for the inner electrons of metals it is extremely high, e.g. X-rays. The quantum states of molecules are in the range of thermal energy, 1.24 meV - 1.7 eV, which corresponds to the kinetic energy of molecules at intermediate temperatures, so there is a great deal of overlap, effectively leading to a continuum, as can be seen from the Planck energy spectrum which never goes to zero.
You write:-
- so you can't say that these degrees of freedom are not thermalised.
Ultimately, at high enough temperatures, all atoms will be stripped of their electrons i.e. 100% ionised and all the electrons and protons will be thermal and none of them will be in particle aggregates such as atoms and molecules. Such conditions are said to exist at the centre of stars.
The reason for my original deletion, which you 'undid', was to eliminate the very real confusion between the Boltzmann constant and the application of it. The Boltzmann constant is a very simple but important ratio between temperature and particle energy; it has the same dimension as entropy, joules/K, but entropy is about a system of particles whereas the Boltzmann constant is about a single degree of freedom. In view of all this I would like you to undo the deletion of my contribution.--Damorbel (talk) 10:18, 8 December 2012 (UTC)
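The energy scales being debated in this exchange are easy to tabulate. A short sketch converting the thermal energy kB·T to electron-volts at a few assumed temperatures:

```python
kB = 1.380649e-23      # Boltzmann constant, J/K
eV = 1.602176634e-19   # one electron-volt in joules

# Thermal energy scale kB*T expressed in electron-volts:
for T in (30.0, 300.0, 3000.0):
    print(T, "K ->", kB * T / eV, "eV")

kT_room = kB * 300.0 / eV
print(kT_room)   # ~0.026 eV at room temperature, well below ~1 eV electronic gaps
```

At room temperature kT is tens of milli-electron-volts, which is why thermal collisions readily excite rotational levels but not typical electronic transitions.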
- Just to be absolutely clear. The "rotational and vibrational energies" are quantised (see eg our articles on Rotational spectroscopy and on Molecular vibration), and this quantisation is important.
- Of course atomic and molecular electron orbitals are quantised too, but it wasn't quantisation of electrons I was discussing above. I was specifically discussing the quantisation of energies of the rotational and vibrational kinetic degrees of freedom of the molecules themselves as whole molecules.
- The rotational and vibrational energies do indeed have a ladder of states, and the occupation of these is thermalised. It is a reasonably correct description to say that two molecules get knocked up and down these ladders, interchanging energy, when they collide.
- At everyday temperatures, as the heat capacity plots make clear, these ladders of energies are not frozen out -- otherwise the heat capacity would be stuck at (3/2) k; nor are they so dense compared to the thermal energy that they can be treated as a continuum -- otherwise the heat capacity would be the full (7/2) k. Instead they have to be treated for what they are -- a thermalised ladder of quantum energy states. As a result, as I wrote above, this means the heat capacity per molecule departs from k/2 per kinetic degree of freedom.
- So it's quite wrong to say things like "kinetic (or thermal) physics excludes energy stored in quantum levels"; and it's better to say "the thermal energy carried by each microscopic "degree of freedom" in the system is on the order of magnitude of kT/2", rather than saying it is equal to kT/2, as further discussed in my first post above.
- And that's why it's not a very good idea to think of the Boltzmann constant primarily as a ratio of a temperature to an equipartition energy. The Boltzmann constant is more universal than that. It fixes the scale for our customary units of entropy -- whether the entropy associated with a single degree of freedom (and yes, you can have entropy associated with a single degree of freedom), or whether associated with a system of particles.
- Equipartition of energy follows as a consequence if there is a smooth quadratic density of states. Non-smooth or non-quadratic densities of states don't produce equipartition. The Boltzmann constant fixes the units of entropy. This is the relationship that is fundamental. That, through the relation (1/T) = dS/dE, is then what fixes the numerical temperature associated with idealised equipartition. Jheald (talk) 12:12, 8 December 2012 (UTC)
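The freeze-out behaviour described in this thread can be made quantitative with the Einstein model of a single quantised harmonic mode. A Python sketch (the O2 vibrational temperature used is an approximate, assumed value):

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def oscillator_heat_capacity(T, theta):
    """Heat capacity of one quantised harmonic mode (Einstein model):
    C = kB * x^2 * e^x / (e^x - 1)^2, with x = theta/T, theta = h*nu/kB.
    Classically this mode (two quadratic dof) would contribute exactly kB."""
    x = theta / T
    return kB * x ** 2 * math.exp(x) / (math.exp(x) - 1.0) ** 2

theta_vib = 2256.0   # K, approximate vibrational temperature of O2 (assumed)
for T in (300.0, 1000.0, 5000.0):
    print(T, oscillator_heat_capacity(T, theta_vib) / kB)
# At 300 K the mode is nearly frozen out (C << kB); only at high T does C -> kB.
```

This is exactly the "thermalised ladder" argument above: the heat capacity is neither zero nor the full classical value, but something in between that depends on the level spacing.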
- You write:-
- Just to be absolutely clear. The "rotational and vibrational energies" are quantised
- This is not an informed argument. Can you name me an interaction that is not quantised?
- I was specifically discussing the quantisation of energies of the rotational and vibrational kinetic degrees of freedom of the molecules themselves as whole molecules.
- And my answer was that the thermal (i.e. translational) energies are of the same order and are thus indistinguishable.
- You write :-
- So it's quite wrong to say things like "kinetic (or thermal) physics excludes energy stored in quantum levels"
- The thermal energies are of the order 0.025 - 1.00 meV; these thermal photons will do nothing to quantum states 1.1 eV and higher, their energy is excluded from the kinetic interactions; thus there is no equipartition with these quantum states.
- You write :-
- The Boltzmann constant fixes the units of entropy
- How can it do this?
- Or, if you want to be slightly more sophisticated about it, S = −kB Σi pi ln pi.
- So if you've got a probability distribution, it's got an entropy. That can be probabilities associated with one degree of freedom; it can be the probability distribution of the combined state of a whole system of particles. It doesn't matter. If it has a probability distribution, then it has an entropy. What k does is just change the units from nats (or bits) to whatever you want to use as your preferred measure of entropy -- kilocalories per Rankine, or kilogram (furlongs per fortnight) squared per Rømer, or whatever.
- That's how it fixes your units of entropy. Then, once you have chosen units for energy and entropy, temperature follows directly, from (1/T) = dS/dE.
- Now, to try just once more to get over my other point to you:
- The energies of molecular rotation and vibration at room temperature are kinetic, are thermal, and are quantised -- and the fact of that quantisation matters, because it directly affects the heat capacity associated with the degree of freedom.
- I'm not talking about electrons here. I'm not talking about photons. I'm talking about the kinetic rotation energy of the molecules themselves, which is interchanged when they bash into each other.
- Talking about temperature as related to the average energy per degree of freedom might be okay when introducing the subject to children; but it's not a good long-term standpoint, because there are too many systems for which it is not true, either because the density of states is not perfectly quadratic, or (as above) because it's quantised, with energy gaps of a size that means the quantisation cannot be ignored. A better long-term strategy is to think of temperature as (1/T) = dS/dE, because that's more fundamental, more general, and encompasses equipartition as a special case when the density of states is smooth and quadratic. Jheald (talk) 14:31, 8 December 2012 (UTC)
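The (1/T) = dS/dE definition advocated above can be demonstrated on the simplest quantum system. A Python sketch for a two-level system with an assumed, purely illustrative energy gap, comparing the numerical derivative against the analytic result:

```python
import math

kB = 1.380649e-23   # Boltzmann constant, J/K
eps = 1.0e-21       # energy gap of a two-level system, J (illustrative)

def entropy_per_particle(f):
    """Gibbs entropy S = -kB * sum(p ln p) for a two-level system
    with excited-state probability f."""
    return -kB * (f * math.log(f) + (1 - f) * math.log(1 - f))

def temperature(f, df=1e-7):
    """Temperature from the definition 1/T = dS/dE, with E = f * eps
    per particle, evaluated by central finite difference."""
    dS = entropy_per_particle(f + df) - entropy_per_particle(f - df)
    return (2 * df * eps) / dS

f = 0.2
T_numeric = temperature(f)
T_exact = eps / (kB * math.log((1 - f) / f))   # analytic two-level result
print(T_numeric, T_exact)   # agree: ~52 K for these assumed numbers
```

No equipartition, no quadratic density of states: the temperature follows purely from how the entropy changes with energy, which is the generality being claimed for (1/T) = dS/dE.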
The article is about the Boltzmann constant, but Jheald writes about :-
- if you've got a probability distribution, it's got an entropy.
The article is about the Boltzmann constant which is the energy of a single particle per Kelvin, but Jheald writes about:-
- That's how it fixes your units of entropy.
As if entropy were a constant like the Boltzmann constant!
The article is about the Boltzmann constant, but Jheald writes about :-
- I'm not talking about electrons here. I'm not talking about photons. I'm talking about the kinetic rotation energy of the molecules themselves, which is interchanged when they bash into each other.
Oh dear! Electrons? Photons? The kinetic rotation energy of the molecules? Yes Jheald, I know what you're talking about and it isn't the Boltzmann constant!
- The energies of molecular rotation and vibration at room temperature are kinetic, are thermal, and are quantised -- and the fact of that quantisation matters, because it directly affects the heat capacity associated with the degree of freedom.
Oh dear! Where is the Boltzmann constant in this?
Now I know that you have nothing to contribute to the article on the Boltzmann constant, what a shame!
And now I feel free to restore my contribution after your deletion. --Damorbel (talk) 18:00, 8 December 2012 (UTC)