And with Lewis "W.C.M Lewis (1924), ''A system of physical chemistry'': "The temperature, in fact, is determined by the average kinetic energy. It is therefore meaningless to speak of the temperature of a single molecule in a gas." Notice he says 'System', thus multiple interacting particles. Did you notice 'average kinetic energy'? And 'in a gas'? It is well known that the particles in a gas (at equilibrium) have the [[Maxwell-Boltzmann distribution]] of velocities; therefore, with their different velocities, they have all got different temperatures; temperatures that are changing when the molecules collide. Temperature is the measure of energy in an atom (molecule or degree of freedom). --[[User:Damorbel|Damorbel]] ([[User talk:Damorbel|talk]]) 18:05, 1 November 2011 (UTC)
:I'm sorry, but you haven't got a clue; it's become clear that it's a waste of time engaging with you, because you appear simply to be incapable of understanding when you haven't got a clue; and evidently you have absolutely no interest in acquiring a clue.
:For the last time, molecules don't have a temperature -- it is ''distributions'' of molecules that have a temperature. A [[Maxwell-Boltzmann distribution]] of molecular velocities is characterised by a particular temperature. Individual molecules are not. Asserting that the individual molecules in a Maxwell-Boltzmann distribution each have different temperatures because they have different velocities shows you simply don't get it (as the above citations make very clear).
:Now, as SpinningSpark noted above, and as I also put to you on the [http://en.wikipedia.org/w/index.php?title=Talk%3AHeat&action=historysubmit&diff=450678146&oldid=450676891 15 September], it is not WP's job to teach you physics, nor -- see [[WP:TALK]] -- are the talk pages here to try to straighten out your personal misconceptions. If you can find what WP would consider a [[WP:RS|reliable source]] that contests this article's proposition that "temperature ... must necessarily be observed at the collective or bulk level", then bring it on. Otherwise this discussion is at an end. [[User:Jheald|Jheald]] ([[User talk:Jheald|talk]]) 19:48, 1 November 2011 (UTC)
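The "distributions, not molecules, have a temperature" point lends itself to a quick numerical sketch. The mass below is roughly a helium atom and k_B is the modern (2019, exact) value, both illustrative assumptions; sampling Maxwell-Boltzmann velocities shows individual kinetic energies scattering widely while only the ensemble average recovers T:

```python
import math
import random

k_B = 1.380649e-23   # J/K (exact since 2019; the discussion quotes older CODATA values)
T = 300.0            # K, the temperature characterising the distribution
m = 6.6e-27          # kg, roughly a helium atom (illustrative assumption)

random.seed(0)
N = 200_000

# Each Cartesian velocity component in a Maxwell-Boltzmann gas is an
# independent Gaussian with variance k_B*T/m.
sigma = math.sqrt(k_B * T / m)
energies = []
for _ in range(N):
    vx, vy, vz = (random.gauss(0.0, sigma) for _ in range(3))
    energies.append(0.5 * m * (vx * vx + vy * vy + vz * vz))

mean_E = sum(energies) / N
print(mean_E / (1.5 * k_B))         # ensemble "temperature": ~300 K
print(min(energies) / (1.5 * k_B))  # naive per-molecule "temperature": far below 300
print(max(energies) / (1.5 * k_B))  # ... and far above it
```

A naive per-molecule "temperature" E/((3/2)k_B) ranges from near zero to several times T; only the distribution as a whole is characterised by 300 K.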
Further you wrote "Unless Damorbel is able to produce more reliable sources to contradict this it is pretty much settled by the sources - it can't go in the article, period." Now please explain: if the [[Boltzmann constant]], with a value of 1.380 6488(13)×10<sup>−23</sup> J/K, is not the energy in a single degree of freedom, just what is it? Even in its present form the article says: "The Boltzmann constant (k or k<sub>B</sub>) is the physical constant relating energy at the individual particle level with temperature". Now if you disagree with this, well and good. But then perhaps you do not agree with the revisions to the fundamental physical constants, including the Boltzmann constant, currently being proposed by the [[CIPM]]? Wouldn't it still be a valid contribution to Wikipedia to draw the attention of users to the reasons why these changes are being considered? --[[User:Damorbel|Damorbel]] ([[User talk:Damorbel|talk]]) 15:56, 1 November 2011 (UTC)
Revision as of 19:48, 1 November 2011
Physics B‑class Top‑importance
The Boltzmann constant in hartree doesn't seem to be right!
Values of k | Units | Comments
1.380 6504(24) × 10−23 | J/K | SI units, 2006 CODATA value
8.617 343(15) × 10−5 | eV/K | 1 electronvolt = 1.602 176 53(14) × 10−19 J; 1/kB = 11 604.51(2) K/eV
2.083 6644(36) × 1010 | Hz/K |
6.336 281(73) × 10−6 | EH/K | 1 hartree = 27.211 383 86(68) eV = 4.359 74394(22) × 10−18 J
that number, 6.336281, seems to be twice what it should be; the correct value is 3.16682!
--Yingchiehsun (talk) 02:59, 12 May 2009 (UTC)
- Well spotted! You can check the conversion at the NIST site. Strangely, the conversion factors from hartrees to electronvolts and joules were correct, but whoever had added the value of the Boltzmann constant had forgotten that EH = 2Ry, i.e. that the hartree is twice the Rydberg energy. Physchim62 (talk) 09:26, 12 May 2009 (UTC)
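The factor-of-two slip diagnosed above is easy to reproduce; a sketch dividing the two 2006-era CODATA numbers quoted in the table:

```python
# Check the hartree row of the table above: divide k in joules per kelvin
# by the hartree in joules (both values as quoted in the table).
k_joule = 1.3806504e-23    # J/K, 2006 CODATA
E_h = 4.35974394e-18       # J per hartree, as quoted above

k_hartree = k_joule / E_h
print(k_hartree)           # ~3.1668e-06 E_h/K, half of the 6.336e-06 originally listed
```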
Conversion factor
Why does this table list 4.1868 as the conversion factor for joules and calories? Surely the appropriate conversion is 4.184, since we're talking thermodynamics? —Preceding unsigned comment added by 206.248.179.192 (talk • contribs) 04:30, 11 August 2010
- More to the point, the article should be specifying which of the many flavours of calorie is intended. SpinningSpark 17:24, 11 August 2010 (UTC)
Thermal voltage section has Boltzmann constant in the wrong units
The Boltzmann constant should be 1.380 6504(24) × 10−23 J/K instead of 8.617 343(15) × 10−5 eV/K if you are using Coulomb for the charge of an electron. —Preceding unsigned comment added by 167.225.107.17 (talk) 15:46, 20 September 2010 (UTC)
- What is your problem? Both systems of units are given. SpinningSpark 18:23, 20 September 2010 (UTC)
If we choose to measure temperature in units of energy....
This statement appears at the end of the article: 'If we choose to measure temperature in units of energy then Boltzmann's constant would not be needed at all.[7]'. It is nonsense; if temperature were measured in units of energy then a statement such as 'the temperature is 10 Joules' would be meaningful; it isn't. --Damorbel (talk) 07:27, 26 January 2011 (UTC)
- It is meaningful to say "the temperature is such that the amount of energy needed to raise the entropy by one nat is 10 Joules". The point being made is that that establishes a particularly natural scale for temperature, similar to choosing a scale for force such that the constant of proportionality in F = ma comes out to be unity. But perhaps the article should say "in units of energy per nat". Jheald (talk) 09:16, 26 January 2011 (UTC)
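Jheald's "energy per nat" reading can be made concrete with one line of arithmetic (using the modern exact value of k_B, an assumption of this sketch):

```python
k_B = 1.380649e-23   # J/K (exact since 2019)

# "Measuring temperature in units of energy": at T = 300 K, the energy
# needed to raise a system's entropy by one nat is k_B * T.
T = 300.0
E_per_nat = k_B * T
print(E_per_nat)     # ~4.14e-21 J per nat at room temperature
```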
- Naturally entropy is the conjugate of temperature, so in systems of units with k=1, the relation to the nat is completely implicit, so that this language is really not necessary and complicates the presentation needlessly. Kbrose (talk) 21:50, 26 January 2011 (UTC)
Temperature and entropy, although connected in some respects, must be distinguished. Check this - Intensive and extensive properties. Temperature is an intensive property; a thermodynamic system almost certainly has different temperatures at different locations, while the same system will have only one entropy, because entropy is an extensive property. Unless the temperature is uniform throughout, the entropy will not be a maximum. It would be possible to divide the system into subsystems, each with its own entropy contributing to the system entropy, but that remains the same as describing temperature as an intensive property. Another fact - a system must be in equilibrium, i.e. at maximum entropy, to have a definite temperature. --Damorbel (talk) 14:02, 26 January 2011 (UTC)
- What is it you don't understand about E = 1/2 kT? Isn't that clear enough that temperature is just another expression of energy, of a certain kind of energy? k is simply a conversion factor of units; it does not change any kind of underlying physics. The most elegant, pure description of physics results when k=1, as is often practiced. You have been told this over and over again in various articles, yet you keep belaboring this topic for months without showing any kind of progress in understanding. Please stop engaging in these endless, non-productive cyclical arguments, inquiries, and debates. You are wasting people's time. Kbrose (talk) 21:50, 26 January 2011 (UTC)
If as you say "Isn't that clear enough that temperature is just another expression of energy, of a certain kind of energy?" then it should be possible to express temperature in terms of energy, Joules or ergs. An analogous comparison is Volt (potential), Coulomb (charge) and Farad (capacitance). Let Volt correspond to Kelvin, Coulomb to Joule and Farad to either kB (Boltzmann constant), R (gas constant) or C (heat capacity). The Volt is not a measure of charge; it is the potential measured across the terminals of a one Farad capacitor containing a charge of one Joule; with a capacitance of 1/2 Farad, 1 Joule would give 2 Volts; thus the potential E relates to charge Q on capacitor C like this: E = Q/C.
Now kB (Boltzmann constant), R (gas constant) and C (heat capacity) are related by numbers that only differ in the amount of material present; kB = R/NA, where NA is Avogadro's number - the number of molecules in a mole of an ideal gas - and C (heat capacity) is the heat capacity of a mole of any substance. So, as it says in the first line of the article, "the Boltzmann constant (k or kB) is the physical constant relating energy at the individual particle level with temperature observed at the collective or bulk level".--Damorbel (talk) 08:41, 27 January 2011 (UTC)
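The identity kB = R/NA quoted here checks out numerically; a sketch with 2006 CODATA values (the era of this thread):

```python
R = 8.314472          # J/(mol K), 2006 CODATA gas constant
N_A = 6.02214179e23   # 1/mol, 2006 CODATA Avogadro constant

k_B = R / N_A         # energy-temperature conversion factor per molecule
print(k_B)            # ~1.3806504e-23 J/K, matching the CODATA Boltzmann constant
```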
- The gas law says that the number pV/T = nR = Nk is a measure of the amount of gas; n is the number of moles, and N is the number of molecules. So the gas constant R has the unit of joule per kelvin per mole, and the Boltzmann constant k has the unit of joule per kelvin per molecule. So k is the amount of gas pV/T for one molecule. A lot of confusion arises from the unnecessary use of the unit mole. I wonder how it was ever standardized. The joule per kelvin is an SI unit for amount of matter. Bo Jacoby (talk) 10:22, 27 January 2011 (UTC).
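Bo Jacoby's reading of pV/T = Nk as a molecule count can be sketched; the molar volume of ~22.414 L at 0 °C and 1 atm is an assumed textbook figure:

```python
k_B = 1.380649e-23    # J/K

# Ideal gas law in per-molecule form: pV = N * k_B * T, so N = pV/(k_B*T).
p = 101325.0          # Pa (1 atm)
V = 22.414e-3         # m^3, approximate molar volume at 0 degrees C and 1 atm
T = 273.15            # K

N = p * V / (k_B * T)
print(N)              # ~6.02e23 molecules: Avogadro's number, as expected for one mole
```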
Bo Jacoby, I have indented your paragraph for clarity. I have edited my previous contribution to change Avogadro's number from N to NA. I agree about the mole, I think it is a (mistaken) attempt to 'simplify' the concept of molecular energy. Molecules are a big hazard when studying thermodynamics because there is only a loose connection between molecular structure and heat capacity.--Damorbel (talk) 11:39, 27 January 2011 (UTC)
- Thank you. I am not sure I understand your comment on molecular energy and heat capacity, but I notice the simplification of thermodynamical equations by omitting R and using the joule per kelvin unit for amount of matter. For example, the specific heat of a substance becomes dimensionless (J/K per J/K). Bo Jacoby (talk) 15:57, 27 January 2011 (UTC).
The First Line in the Article
The first line of the article introduces bulk (macroscopic) concepts not immediately related to the Boltzmann constant; I suggest that this is not appropriate in an encyclopedia article. Currently the article (first line) reads "The Boltzmann constant (k or kB) is the physical constant relating energy at the individual particle level with temperature observed at the collective or bulk level. It is the gas constant R divided by the Avogadro constant NA."
Since the Boltzmann constant will shortly become the recognised basic physical constant, replacing the Kelvin, this should, at the very least, be recognised in a competent encyclopedia.
The errors propagate through the article. Section 1 states "Boltzmann's constant, k, is a bridge between macroscopic and microscopic physics." Its least mistake is "Boltzmann's constant"; it should be "the Boltzmann constant", thus no apostrophe-s. But the major error is the 'bridge between macroscopic and microscopic physics', which is not correct. The proper 'bridge between macroscopic and microscopic physics' is Maxwell-Boltzmann statistics, which takes into account that, in a perfect gas with random exchange of momentum, the particles will have a distribution of energies with an average energy equal to the total energy divided by the number of particles.
It is an important concept of physics that particle interactions take place at particle level and the nature of the interaction is strongly related to the energy of the individual particle, i.e. the particle temperature. It is of course not easy, perhaps impossible, to measure the temperature of an individual particle directly, but it may well be inferred from the intensity with which, let us say, a chemical reaction takes place. --Damorbel (talk) 08:42, 6 October 2011 (UTC)
- Don't be pedantic; from your comments above you are obviously a single-issue wonk. 86.135.142.245 (talk) 02:47, 15 October 2011 (UTC)
User:86.135.142.245: Care to explain more clearly why objecting to "Boltzmann's constant, k, is a bridge between macroscopic and microscopic physics." is being pedantic? Just saying so helps nobody and doesn't help the article either. --Damorbel (talk) 19:39, 15 October 2011 (UTC)
- Temperature is necessarily a "macroscopic quantity", which is what "temperature observed at the collective or bulk level" means to say. Temperature must NECESSARILY be observed at the collective or bulk level, so I hope the sentence isn't taken to apply otherwise. Perhaps it should be rewritten to say that. I'll put it in and see if anybody objects. The Boltzmann constant k is microscopic merely because it's R per atom. It's really Avogadro's constant NA which is the bridge between microscopic and macroscopic physics! SBHarris 22:12, 15 October 2011 (UTC)
Sbharris: You write "Temperature is necessarily a "macroscopic quantity"". But the Boltzmann constant is 'energy per degree of freedom' - of which a single atom has three; that is why a single atom has 3 × kT/2, i.e. 3/2 × 1.380 6488(13)×10−23 J/K × T. I can see nothing in this that links these figures to a 'macroscopic level'. To have any meaning at the macroscopic level the entropy of the (bulk, macroscopic) system would need to be known or defined; if the entropy was not at a maximum, i.e. when the particles making up the system did not have a Maxwell-Boltzmann distribution, then it is not possible to define a temperature at the macroscopic level, because there would not even be a statistical distribution supporting a macroscopic temperature, whereas it is quite reasonable to assign temperatures at the particle level.
Further, why should Avogadro's Number play a role in the definition of temperature? Avogadro's Number merely defines the size of the macroscopic system. It is painfully simple to have systems with 50% of AN, 25% of AN or even 0.000001% of AN, all with exactly the same temperature; there is no reason why a temperature cannot be defined for a single particle, or if you try really hard for a single degree of freedom, and, wonder of wonders, you have the Boltzmann constant. --Damorbel (talk) 09:59, 17 October 2011 (UTC)
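For reference, the number under discussion: (3/2)kBT is the average translational kinetic energy per particle of an equilibrium gas, an ensemble average rather than a per-molecule temperature (a sketch, assuming the modern exact kB):

```python
k_B = 1.380649e-23   # J/K

# Average translational kinetic energy per particle at 300 K:
# three translational degrees of freedom, (1/2) k_B T each, on average.
T = 300.0
E_avg = 1.5 * k_B * T
print(E_avg)         # ~6.21e-21 J; the same whether the sample is a mole or a microgram
```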
- No, you don't. You can't talk about the temperature of a single atom, any more than you can talk about wetness or viscosity of a single water molecule. Wetness and viscosity are bulk properties. They have nothing to do with Avogadro's number, but both of them assume enough atoms to get good statistical properties from the collection. This is indeed a number much smaller than Avogadro's number N, but it must be larger than 1.
You can talk about the amount of heat in a bulk which can be mentally "assigned" to each atom, or "thermal energy per atom," but this is a bit like saying the average American family has 2.3 children. The actual energy per atom is quantized in whole numbers like the number of children in a family. You got a fraction only by dividing two totals, one of which has a distribution. Heat is like that. Temperature is something like "national wealth." It can be expressed in terms of "GDP" or "GDP per capita." But a single wage earner gets a salary-- not a GDP per capita. Don't be confused that both are measured in dollars. GDP per capita never does apply to a single wage earner, and temperature never applies to single particles.
Both R and k have units of specific entropy or specific heat capacity, which is (thermal) energy per degree kelvin. The first constant is appropriate to use per mole, the second is appropriately used per atom, since R = kNA. Thus, neither unit can be used to "replace" temperature, since both units assume that a temperature already exists. Neither R nor k gives you a measure of "thermal energy" unless first multiplied by a temperature. Both R and k are merely mediator scaling-constants between thermal energy and temperature. The various degrees of freedom are small numbers with no units, and since they have no units, they are thus not part of either k or R. Of course k or R must be multiplied by effective degrees of freedom before you get an answer to a heat capacity, but these degrees of freedom don't have to be whole numbers for the bulk, only for the individual particles. At low temperatures, heat capacities for a mole of substance can fall to a small fraction of RT, which means a tiny fraction of kT per atom. It could be 0.01 kT per atom, but that doesn't mean any atom has 0.01 degrees of freedom any more than a family has 2.3 kids. It means that 1% of atoms are excited into a state where one of their degrees of freedom contains a quantum of energy, from heat. What fraction of degrees of freedom participate, and what fraction of atoms are excited in any way, and how many in more than one way, is a quantum discussion. See the Einstein and Debye sections of the heat capacity article, at the end. As well as their own articles.SBHarris 22:41, 27 October 2011 (UTC)
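The "freeze-out" of degrees of freedom that SBHarris describes is captured by the Einstein model; a sketch of the Einstein correction factor CV/3R, using the θE ≈ 690 K figure for beryllium quoted later in this thread:

```python
import math

def einstein_factor(T, theta_E):
    """Einstein-model ratio C_V / (3R) for a solid at temperature T,
    given its Einstein temperature theta_E (both in kelvin)."""
    x = theta_E / T
    return x ** 2 * math.exp(x) / (math.exp(x) - 1.0) ** 2

# High T: factor -> 1, the classical Dulong-Petit limit.
# Low T: factor -> 0, the quantum "freeze-out" of vibrational modes.
print(einstein_factor(3000.0, 690.0))   # close to 1
print(einstein_factor(300.0, 690.0))    # roughly 0.65 for beryllium at room temp
print(einstein_factor(30.0, 690.0))     # essentially zero
```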
The First Line in the Article (2)
User:Sbharris: This is in danger of becoming a (bad) example of a Wiki War if you keep changing my contributions; please check your user page.
You write "You can't talk about the temperature of a single atom, any more than you can talk about wetness or viscosity of a single water molecule. Wetness and viscosity are bulk properties." Do you mean that the temperature of a collection of atoms (molecules) depends on the number of atoms (molecules) it contains? I ask again: at what number of atoms (molecules) does their temperature begin to deviate from that of a sample containing one mole, i.e. Avogadro's number (AN) of particles?
You write "The actual energy per atom is quantized". Quantization effects are seen at energy levels related to the Planck constant, 6.62606957(29)×10−34 J s, about 10−11 times the size of the Boltzmann constant; this is not really significant at thermal energies and way below that of any significant fraction of AN.
When you write about "Both R and k have units of specific entropy" and "Thus, neither unit can be used to "replace" temperature", you must understand that the Boltzmann constant will not 'replace' the Kelvin in the new definition of fundamental constants; rather, the Kelvin will be defined in terms of the Boltzmann constant, because it is now possible to determine the Boltzmann constant much more accurately than the Kelvin. The Kelvin is defined by the triple point of water, which is of limited accuracy since it is a function of the isotopic balance of the water (D2O freezes at a different temperature than H2O). --Damorbel (talk) 11:45, 30 October 2011 (UTC)
- "Quantization effects... [are] not really significant at thermal energies." Can I suggest you read Equipartition theorem#History, and then perhaps revise your statement. Heat_capacity#Theory_of_heat_capacity treats the material at greater length, though it is perhaps over-wordy, and could use some pictures drawn from real data. Fermi–Dirac statistics also discusses quantum effects at very everyday temperatures. [[User:Jheald|Jheald]] ([[User talk:Jheald|talk]]) 12:03, 31 October 2011 (UTC)
- Jheald, if you have something to say about quantum effects on thermal processes, then perhaps it would be useful if you could say just what it is that you are thinking of, merely giving a few links will only waste our time as we try to identify what is raising your concerns.
- Perhaps you are thinking of statements like this: "In general, for this reason, specific heat capacities tend to fall at lower temperatures where the average thermal energy available to each particle degree of freedom is smaller, and thermal energy storage begins to be limited by these quantum effects. Due to this process, as temperature falls toward absolute zero, so also does heat capacity." from your link http://en.wikipedia.org/wiki/Heat_capacity#Theory_of_heat_capacity The effect referred to is one that is seen with very large temperature excursions and is often described as certain modes of vibration being 'locked out' because they appear to have a threshold below which they neither absorb nor release energy. Thanking you in advance. --Damorbel (talk) 17:52, 31 October 2011 (UTC)
- One important point is that the magnitude of the Planck and Boltzmann constants cannot be compared since they are measured using different units. Dauto (talk) 19:28, 31 October 2011 (UTC)
- Joules per Kelvin and Joules per Hertz - both are measures of energy density, both are a measure of particle energy, the Boltzmann constant is a measure of mechanical particle energy and the Planck constant the measure of photon (electromagnetic) energy - both are treated as particles because that is what experimental science shows them to be. --Damorbel (talk) 21:20, 31 October 2011 (UTC)
- That misses the point. In order to get units of energy, you must multiply h by frequency f (often a very large number for atoms and atom vibrations); to get energy, k need only be multiplied by T (a very much smaller number).
In fact, if you want to know where a solid departs from the classical Dulong-Petit law limit of heat capacity of 3R/mole you can examine a characteristic temperature value at which quantum effects become very important. In the Einstein solid theory that is the so-called Einstein temperature: T_E = hf/k. Notice that the frequency f offsets the very large ratio of h/k. The factor that determines excursion from Dulong-Petit at room temp is a function of the dimensionless ratio T/T_E = s (the "reduced temperature"). Einstein temps vary from 85 K or so (lead) to thousands of degrees K for tightly-bonded solids like carbon. For beryllium, T_E is ~ 690. The factor this corrects Dulong-Petit by is x^2 * e^x/(e^x - 1)^2, where x = 1/s = T_E/T. For beryllium at 300 K this comes out to about 0.65, which predicts Be heat capacity at room temp will be about 65% of 3R (measured is about 60%-66% of 3R, depending on source). For diamond, with T_E on the order of 1300 K, you get roughly 24% of 3R by calculation (measured is about 24% of 3R). So quantum effects here are extreme, cutting solid heat capacities at room temp to a fraction of most other solids. Room temperature is not a "very large temperature excursion."
Similarly, for diatomic gases at room temp the vibrational reduced temp is so high (thousands of degrees) that nearly all the vibrational heat capacity is not seen. Oxygen shows only 13% of its theoretical vibrational heat capacity at room temp, so has a constant volume heat capacity much closer to 5/2 R/mole than 7/2 R/mole. Damorbel is simply wrong -- quantization effects ARE important at room temp -- in fact for gases they are usually extreme, and for some solids composed of light well-bonded atoms, too.
As for the issue that it's easier to determine the "Boltzmann constant" than the Kelvin, that depends on how you do it. The Kelvin in the future may well be rescaled using energy plus k_B, rather than scaling Kelvin using gas pressures and the triple point of water. However, I don't see the point. You have to pick an energy scale and a temperature scale, and there will always be a constant that relates the two, if only to change the units: E = constant × T. That constant is some simple number times R or k_B. If you pick any two values, the third is determined, so it makes sense to pick the two things you can most easily measure, and let them determine the third. You "measure" k_B only if you have scaled Kelvin and energy already, but that's overdetermined. If you haven't scaled Kelvin, then you can measure the value of the Kelvin with regard to the joule, by fixing k_B at some arbitrary value (so it will no longer have a measurement variation). We've done exactly that with our standard for length, which is now determined by frequency (our time standard) and the speed of light (now fixed at an arbitrary exact value and no longer measured), since time is measurable to higher precision than the distance between marks on a bar of metal, and the speed of light is just a scaling constant between length and time. So? When this happens with temperature, the value of k_B (the Boltzmann constant) will be exact and fixed, as the speed of light is now. We will no longer "measure" it -- rather we will "measure" the Kelvin using our energy standard. SBHarris 21:32, 31 October 2011 (UTC)
The First Line in the Article (2)(Sbharris)
This new section is needed because the last contribution to the section "The First Line in the Article (2)" adds a lot of material that is really a long way from the fundamentals of the Boltzmann constant.
User:Sbharris; in the first line of your last contribution you write "[in] order to get units of energy, you must multiply h by frequency f (often a very large number for atoms and atom vibrations)". The Planck constant is about the vibration of electric charge, not atoms. Photons interact minimally with atoms having little or no dipole moment (a dipole moment is a manifestation of electric charge). Charge (or dipole moment) is how the energy of photons is converted back and forth to/from mechanical energy, i.e. heat. These concepts do not require the Dulong-Petit law to understand them; the Dulong-Petit law is relevant at the bulk (macroscopic) level but not at the microscopic or particle level, the Boltzmann constant is not affected by the considerations of heat capacity that the Dulong-Petit law deals with, and the Dulong-Petit law is quite irrelevant to the Boltzmann constant.
User:Sbharris, you write "Damorbel is simply wrong -- quantization effects ARE important at room temp". But do they impact on the Boltzmann constant? Further you write "Oxygen only has 13% of its theoretical heat capacity at room temp, so has a constant volume heat capacity much closer to 5/2 R per mole". Which may well be true but, once more, it is quite irrelevant to the Boltzmann constant and has no place in an article or even a discussion about the Boltzmann constant.
Yet further you write "As for the issue that it's easier to determine the "Boltzmann constant" than the Kelvin, that depends on how you do it." You may well have a good point, but are you not aware of the current progress in determining the Boltzmann constant? Check this link: Big Step Towards Redefining the Kelvin: Scientists Find New Way to Determine Boltzmann Constant. Wikipedia can surely record the changes that are taking place? The old method of standardising temperature with the triple point of water is known to have limitations; the better standard is now generally accepted to be the value of the Boltzmann constant.
The fact is that the International Committee for Weights and Measures (CIPM) is adopting the Boltzmann constant as the most accurate unit which will make the Kelvin a derived unit, see here Preparative Steps Towards the New Definition of the Kelvin in Terms of the Boltzmann Constant, all of this should be in Wikipedia, don't you think?
I suggest that if you can discover a more accurate method of defining the Kelvin then the situation will be reversed by CIPM. But until that happens I suggest the new situation should be reflected in the Wiki article on the Boltzmann constant. --Damorbel (talk) 22:34, 31 October 2011 (UTC)
- I am well aware of people attempting to use energy-dependent methods to redefine the temperature scale, for example this one. Yes, they should be mentioned in this article, since if any one of them is adopted to standardize the Kelvin, then the Boltzmann constant will be fixed and no longer "measured", like the speed of light (which is no longer measured, but has an exact value). However, that hasn't happened yet.
As to your bizarre ideas that Planck's constant only has to do with charge, you need to read some physics books. Here on WP, there is matter wave, which discusses interference of neutral atoms and even molecules, each of which has a wavelength set by h/momentum. As for the relevance of k_B and h to the heat capacity of solids, you're the one making the bizarre claims, not me. The Dulong-Petit law gives heat capacities in terms of R, which is N_A*k_B. The importance of k_B is not in the Dulong-Petit law, but in the way that heat capacities depart from it at low temperatures, and in different ways for different substances. Why don't you actually read Einstein solid and Debye model? See if you can derive either of them without using h or k_B. They have nothing to do with photons or with dipole moments of atoms, and would work equally well if atoms were little inert vibrating spheres with not a trace of charge.
Does the heat capacity of gases have any relevance to k_B? Of course. The entire kinetic theory of gases is based upon k_B (you do understand where this constant came from historically?), and gas heat capacity is only part of that larger theory. One cannot explain why the heat capacity of gases behaves as it does without invoking k_B, and this theory is a quantum theory at low temperatures (where you must invoke h), and yet it still has nothing at all to do with photons or dipole moments. In particular, the "freeze out" of heat capacity at lower temperatures (well on the way by room temperature) cannot be calculated without a theory that uses both h and k_B. Again, the quantum kinetic theory of gases would work just as well if the gases were composed of little ideal atoms with no charge, rotating around bonds, vibrating, and bouncing off each other and the container elastically, with each atom behaving like a tiny Superball on a spring (if bonded), with no charge at all. SBHarris 23:49, 31 October 2011 (UTC)
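A concrete case of h and k_B working together with no charge in sight is the thermal de Broglie wavelength; a sketch for a helium atom (the constants are standard values; the choice of helium is illustrative):

```python
import math

h = 6.62607015e-34     # J s (exact since 2019)
k_B = 1.380649e-23     # J/K (exact since 2019)

def thermal_wavelength(m, T):
    """Thermal de Broglie wavelength h / sqrt(2*pi*m*k_B*T), in metres."""
    return h / math.sqrt(2.0 * math.pi * m * k_B * T)

# A neutral helium atom (~4 u) at room temperature.
m_He = 4.0 * 1.66053906660e-27   # kg
lam = thermal_wavelength(m_He, 300.0)
print(lam)                       # ~5e-11 m, comparable to atomic dimensions
```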
- The vibration modes of molecules actually give quite a good example for getting a feel for what kB is all about.
- For example, let's work through the numbers for the fundamental CO2 bending mode. It's got a characteristic frequency of 667 cm⁻¹ (in spectroscopists' units).
- That implies a corresponding energy gap of E = hcν̃,
- i.e. 6.6 × 10⁻³⁴ J s × 3.0 × 10⁸ m s⁻¹ × 66 700 m⁻¹ = 1.3 × 10⁻²⁰ J
- This is the characteristic microscopic energy kT corresponding to a temperature that we can find by dividing through by k, i.e. 1.3 × 10⁻²⁰ J / 1.3 × 10⁻²³ J K⁻¹, so about 1000 K.
- That tells us that at around 1000 K this vibration mode will be really starting to kick in, as shown in the schematic curve at Equipartition theorem#History. Much below this temperature most of the CO2 molecules will be in the ground state, so none of the thermal energy of the system will be in this vibration mode. At a much higher temperature, the various CO2 molecules will be spread in a distribution right up the ladder of vibrational states. This temperature corresponds to the cross-over between those two regimes, where a plurality of molecules in that CO2 gas are starting to get knocked into those first vibrational levels.
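- The arithmetic above is easy to verify in a few lines of Python, using the 667 cm⁻¹ wavenumber from the text and constants rounded to the same precision as the worked example:

```python
# Characteristic temperature of the CO2 bending mode (667 cm^-1),
# as worked through in the discussion above.
h = 6.626e-34      # Planck constant, J s
c = 2.998e8        # speed of light, m/s
kB = 1.381e-23     # Boltzmann constant, J/K

nu_tilde = 667 * 100        # wavenumber: 667 cm^-1 -> m^-1
E = h * c * nu_tilde        # energy quantum of the mode, J
T_char = E / kB             # characteristic temperature, K

print(f"E = {E:.2e} J")            # ~1.3e-20 J
print(f"T_char = {T_char:.0f} K")  # ~960 K, i.e. "about 1000 K"
```

The result, roughly 960 K, is the "about 1000 K" quoted above.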
- This is the right way to think about how a particular microscopic energy can be related to a particular macroscopic temperature.
- But let's turn it around for a moment. Suppose you find a CO2 molecule in that first vibrational level, corresponding to an energy of 1.3 × 10⁻²⁰ J. Can you tell the temperature of the gas it came from? The answer is no, you can't, not with any great precision. The temperature of the gas might be quite low, but that molecule happens to be one of the few that has got the energy to get out of the ground state. Or the temperature of the gas might be quite high, but that molecule happens to be one of the ones with comparatively little energy compared to the average. This, I think, is the key point that SB Harris has been trying to make to you: for a single molecule, the temperature simply isn't well defined.
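- This point can be made numerically. Using the 1.3 × 10⁻²⁰ J excitation energy from the example, the Boltzmann factor exp(−E/kT) is appreciable over a wide range of gas temperatures, so observing one molecule at that energy cannot pin the temperature down (the two temperatures below are arbitrary illustrative choices):

```python
import math

kB = 1.381e-23   # Boltzmann constant, J/K
E1 = 1.3e-20     # energy of the excited molecule from the example above, J

# Unnormalised Boltzmann factor exp(-E/kT): the relative likelihood of
# finding a molecule at energy E1 in gases at two very different temperatures.
p_cool = math.exp(-E1 / (kB * 600))    # a fairly cool gas
p_hot = math.exp(-E1 / (kB * 2000))    # a much hotter gas

print(f"600 K: {p_cool:.3f}, 2000 K: {p_hot:.3f}")
# Both factors are appreciable, so finding one molecule at E1
# cannot tell you which gas it came from.
```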
- We can take this further. In terms of classical thermodynamics, the zeroth law of thermodynamics says that two systems are at the same temperature if they would be in thermal equilibrium when put in thermal contact with each other -- i.e. if they were free to exchange energy with each other. This is why it makes no sense to say that a single molecule with a particular energy has a particular temperature -- because to have a temperature, the molecule must be free to exchange energy. You can (sometimes) talk about a molecule that has a particular distribution of energy over a period of time as having a particular temperature -- because it is constantly exchanging energy over that time, sometimes having more energy, sometimes having less energy. Or you can talk of a set of molecules that has a range of energies having a particular temperature -- if the set is sufficiently large that the amount of energy being exchanged hardly affects the overall distribution of molecular energies. But you can't talk about a single molecule with a particular energy having a particular temperature -- it doesn't; its temperature simply isn't well defined.
- We can also think in terms of statistical thermodynamics, where temperature is defined as 1/(dS/dE). But the statistical thermodynamic entropy S = - k Σ p ln p: it is a property of a distribution over which probabilities can be defined. A single molecule in an explicitly given state in fact has zero entropy: its state is exactly defined. It is only when we allow there to be a probability distribution for the molecule's state; or, alternatively, a large number of molecules with energy that can be spread between them in so many different ways, that it becomes possible to have a statistical entropy, and so any chance of a well-defined temperature.
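- A toy calculation makes this concrete. For a hypothetical two-level system of N particles (all numbers below are illustrative), the entropy S = kB ln W is a combinatorial property of the whole collection, and only then does T = 1/(dS/dE) exist; the numerical derivative agrees with the analytic two-level result:

```python
import math

kB = 1.381e-23
eps = 1.0e-21      # level spacing of a hypothetical two-level system, J
N = 10000          # number of particles -- entropy needs many of them

def S(n):
    """Boltzmann entropy S = kB ln W, with W = C(N, n) ways of
    choosing which n of the N particles are excited."""
    return kB * (math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1))

n = 2000                                     # macrostate: 2000 particles excited
dS_dE = (S(n + 1) - S(n - 1)) / (2 * eps)    # numerical dS/dE
T = 1.0 / dS_dE                              # temperature from 1/T = dS/dE
T_exact = eps / (kB * math.log((N - n) / n)) # analytic two-level result

print(f"T (numerical) = {T:.1f} K, T (analytic) = {T_exact:.1f} K")
```

A single particle in a definite state has W = 1 and S = 0, so the derivative, and hence the temperature, is simply not defined for it.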
- The right way to think about the connection between T and kT is therefore to think of kT as an energy that is in some way characteristic of a temperature T. The wrong way to think about the connection is to think of a particle having an energy E having an associated temperature E/k -- temperature is a property of distributions, of macroscopic (or at least mesoscopic) systems; not of single particles.
- Finally, you keep returning to the CIPM proposing to use k to formally define the Kelvin. That is certainly something we probably should mention in the article. But probably fairly well down the piece, because it is something that really only starts to be meaningful to somebody once they already have a pretty good grasp of what k is -- and what a Kelvin is. The CIPM is also currently considering formally defining the kilogram in terms of Planck's constant, the Josephson constant and the von Klitzing constant from the quantum Hall effect. But that is something we (rightly) only touch on at the very end of our article on Planck's constant. Jheald (talk) 01:41, 1 November 2011 (UTC)
Sbharris, you write: "I am well aware of people...". Do you consider the CIPM as just 'people'? Then these are the 'people' who chose the Kelvin defined by the triple point of water, and now they are the 'people' who will make the Kelvin a unit derived from the Boltzmann constant; the Boltzmann constant will then become the fundamental constant, not the Kelvin -- note the Wiki article is entitled 'Boltzmann constant'. Surely the fact that it is internationally accepted (or may well become accepted) as a fundamental physical constant should be in the introduction.
You write further: "The right way to think about the connection between T and kT is therefore to think of kT as an energy that is in some way characteristic of a temperature T". Do you mean by this 'some way' that E = kBT is somehow imprecise, and that the various parts of the equation, E, kB and T, are somehow incomplete or imprecise? E = kBT is complete; it applies to particles in every kind of system, not just to the single kind of system defined by the Maxwell-Boltzmann distribution that you keep introducing.
Again you write: "We can take this further. In terms of classical thermodynamics, the zeroth law of thermodynamics says that two systems are at the same temperature". Why are you introducing 'systems'? Not just 'systems' but 'systems' in equilibrium, i.e. 'systems' to which a single temperature can be assigned. None of this has anything to do with the Boltzmann constant, which is 'energy per degree of freedom'; what you are introducing is the behaviour of systems of very many particles that are interacting in a random way that gives the Maxwell-Boltzmann distribution of particle energy. But these conditions exist only in a gas defined as being in thermal equilibrium, a very rare condition indeed. There are many particles in the universe that are not part of a gas and certainly do not have a single assignable temperature at the macroscopic level; the Boltzmann constant, properly defined, is equally relevant to these systems. The Boltzmann constant is also used in calculating the forward voltage and the reverse current of p-n junctions and their thermal voltage.
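The p-n junction example is easy to make concrete: the thermal voltage is kT/q, which at room temperature comes out near 26 mV (a standard result; the 300 K figure below is just an illustrative choice):

```python
kB = 1.381e-23    # Boltzmann constant, J/K
q = 1.602e-19     # elementary charge, C
T = 300           # an assumed room temperature, K

VT = kB * T / q   # thermal voltage kT/q of a p-n junction
print(f"Thermal voltage at {T} K: {VT * 1000:.1f} mV")   # ~25.9 mV
```

This quantity appears directly in the diode equation I = I_s(exp(V/VT) − 1).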
You further emphasise your macroscopic definition of temperature (which is correct at equilibrium) by stating "think in terms of statistical thermodynamics, where temperature is defined as 1/(dS/dE)." This is only true when the entropy S is at a maximum, another way of saying the system is in equilibrium. If the system (of particles) is not in equilibrium, what then does your definition of temperature T = 1/(dS/dE) give as 'the temperature T'? It doesn't exist, does it? This is the reason why macroscopic matters have only a minimal place in an article entitled 'Boltzmann constant'.
Introducing "The CIPM is also currently considering formally defining the kilogram in terms of Planck's constant, the Josephson constant and the von Klitzing constant from the quantum Hall effect" is quite irrelevant. Physical constants are just that, constant. Fundamental physical constants are called 'fundamental' because they can be independently defined, i.e. their size is unrelated to other constants, so why cite the CIPM activity on these matters as relevant to the Boltzmann constant? --Damorbel (talk) 10:16, 1 November 2011 (UTC)
- Actually, if you read what I was talking about above, I was talking about the thermal excitation of a vibration mode. That is something different to the distribution of molecular speeds which is given by the Maxwell-Boltzmann distribution.
- To pick up some of your points:
- The Boltzmann constant is "energy per degree of freedom". This (E = ½ kT) is only true when the equipartition of energy holds. As has been repeatedly pointed out to you, equipartition breaks down when modes get frozen out if the quantum energy level spacing is too big. So any general statement like this needs to be taken with considerable caution. Furthermore, equipartition is about average energy per degree of freedom. It applies when there is a probability distribution of energies -- either for a single particle over time, or for a collection of particles.
- Similarly, as I have explained above, temperature is a property of things in thermal equilibrium -- things which can exchange energy with their surroundings. A single molecule in a particular state does not have a temperature. Temperature is not a property of states of single molecules. You cannot just calculate the energy and divide by k. This is not temperature.
- Do you understand this? Jheald (talk) 11:41, 1 November 2011 (UTC)
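- The freeze-out of equipartition mentioned above can be sketched numerically. The mean energy of a quantised oscillator (the Einstein-model expression) only approaches the classical kT when kT is large compared with the level spacing; the 2 × 10¹³ Hz frequency below is a hypothetical value chosen for illustration:

```python
import math

h = 6.626e-34     # Planck constant, J s
kB = 1.381e-23    # Boltzmann constant, J/K
eps = h * 2.0e13  # energy quantum of a hypothetical oscillator at 2e13 Hz, J

def mean_energy(T):
    """Planck/Einstein mean energy of a quantised oscillator at temperature T."""
    return eps / math.expm1(eps / (kB * T))

# Ratio of the quantum mean energy to the classical equipartition value kT:
# close to 1 at high temperature, collapsing towards 0 as the mode freezes out.
for T in (50, 300, 2000, 10000):
    print(f"T={T:>5} K: <E>/kT = {mean_energy(T) / (kB * T):.3f}")
```

At 10 000 K the ratio is near 1 (equipartition holds); at 50 K it is essentially zero, which is exactly the departure from Dulong-Petit behaviour discussed earlier in this thread.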
- Further you wrote "The Boltzmann constant is "energy per degree of freedom". This (E = ½ kT) is only true when the equipartition of energy holds". By this do you mean that 'equipartition ... etc.' is relevant to the Boltzmann constant? That the calculation of the Boltzmann constant is meaningless without equipartition? That the energy in atoms and molecules has nothing to do with the Boltzmann constant unless there is equipartition of energy? If you believe this, how can you do gas dynamics when the gas is accelerating and thus not in equilibrium? --Damorbel (talk) 15:09, 1 November 2011 (UTC) — Preceding unsigned comment added by Damorbel (talk • contribs)
This debate could go on forever, I don't think Damorbel will ever be convinced and it is pointless to continue trying to do so. It is not Wikipedia's job to teach Damorbel. Wikipedia is based on sources, not written by working through theory ourselves. There is no shortage of quality physics textbooks which specifically and directly state that the temperature of a single molecule is meaningless. Unless Damorbel is able to produce more reliable sources to contradict this it is pretty much settled by the sources - it can't go in the article, period. SpinningSpark 13:02, 1 November 2011 (UTC)
The First Line in the Article (3)
Spinningspark, you wrote (citing your Google search for 'temperature of a single molecule'): "the temperature of a single molecule is meaningless". But you don't actually cite a single document of any kind!
- Just click on the first three or four results, they are all relevant. SpinningSpark 16:31, 1 November 2011 (UTC)
"Just click on the first three or four results, they are all relevant." But Spinningspark, just giving the links is no contribution; you must also explain. Up until now I have not read anything relevant in what you write. Why are these links of yours relevant? I have looked at them and they do not have anything of particular interest to say. --Damorbel (talk) 17:00, 1 November 2011 (UTC)
- (ec) FFS. Did you notice at all the page after page of hits that SpinningSpark's search brings back?
- James Jeans (1921), The Dynamical Theory of Gases, p. 125. "It is of the utmost importance to notice that, for the kinetic theory, temperature is a statistical conception ; it is meaningless to talk of the temperature of a single molecule."
- W.C.M Lewis (1924), A system of physical chemistry, "The temperature, in fact, is determined by the average kinetic energy. It is therefore meaningless to speak of the temperature of a single molecule in a gas."
- Zhu (2003), Large-scale inhomogeneous thermodynamics, p. 88 "Since... the temperature of a single molecule is meaningless..."
- Gemmer et al (2004), Quantum thermodynamics: emergence of thermodynamic behavior within composite systems, p. 209. "It thus appears meaningless to talk about the temperature of an individual particle..."
- William A. Blanpied (1969), Physics: its structure and evolution, "It is quite meaningless to speak of the temperature of a single molecule..."
- Henry O. Hooper, Peter Gwynne (1977), Physics and the physical perspective, "The concept of temperature is plainly a statistical one that depends on large numbers of molecules. The idea of the temperature of a single molecule is meaningless;"
- Hugo O. Villar (1994), Advances in Computational Biology, "The temperature of two particles, for instance, is physically meaningless."
- A. D'Abro (1939), The decline of mechanism: (in modern physics), p. 42 "But when we pass to the level of molecular dimensions, the entire concept of temperature becomes meaningless. There is no sense in speaking of the temperature of a single molecule, for a large number of molecules in motion is necessary to give meaning to temperature."
- Henry Margenau (1982) "Physics and Reductionism", in Joseph Agassi, Robert Sonné Cohen (eds), Scientific philosophy today: essays in honor of Mario Bunge, p. 190. "the 'higher level' observables are meaningless at the lower level; entropy and temperature mean no more for a single molecule than volume does with respect to a plane figure".
- Robert L. Sells (1965) Elementary classical physics, "It is meaningless to speak of the "temperature" of a single molecule or of a small number of molecules"
- Charles Hirsch (2007) Numerical computation of internal and external flows Vol 1, p. 22. "... the temperature, or pressure, or entropy of an individual atom or molecule is not defined and generally meaningless".
- R.J.P. Williams, quoted in Gregory Bock, Jamie Goode (1998) The limits of reductionism in biology, p. 129 "Entropy is not reducible to a property of a molecule. Thus random kinetic energy of a large number of molecules is described by temperature, which is very different from kinetic energy of a single molecule."
- Meghnad Saha, B. N. Srivistava (1931, 1958) A text book of heat "It is meaningless to talk of the pressure exerted by a single molecule, or of the temperature of a single molecule."
- Paul Karlson, A. E. Fisher (1936), The world around us: a modern guide to physics, "Just as it was meaningless to apply the notion of temperature to a single molecule... "
- Alistair I.M. Rae (1994, 2004), Quantum physics, illusion or reality?, "just as meaningless... as it is to talk about the temperature of a single isolated particle."
- Michel A. Saad (1966), Thermodynamics for engineers, "It should be remarked that reference to properties, such as temperature or pressure, apply to a large number of molecules and it is meaningless to refer to such properties for a single molecule."
- etc, etc.
- Your original complaint was with the phrase "temperature, which must necessarily be observed at the collective or bulk level." These quotations more than justify that phrase. Jheald (talk) 17:08, 1 November 2011 (UTC)
Jheald, your citations are about entropy or pressure, or they do not refer to the Boltzmann constant. For example, "Entropy is not reducible to a property of a molecule". Of course it isn't. Of necessity, when referring to entropy, you are talking about a large number of molecules interacting in a random way, which, at equilibrium, have a Maxwell-Boltzmann distribution; this is not a requirement for the Boltzmann constant, it has no role in the definition of the Boltzmann constant, yet you do not appear to recognise the fact. Neither do you recognise the role of the Boltzmann constant in the redefinition of the Kelvin. You cite James Jeans: "for the kinetic theory, temperature is a statistical conception". Yes, of course it is, because kinetic theory is about large numbers of particles interacting by random elastic collisions; the actual energy of the particles is not the concern of kinetic theory. Kinetic theory relies on the particles having a Maxwell-Boltzmann distribution, which by definition assumes that the particles do not have the same temperature; this assumption does not mean that they can't have the same temperature, just as electrons in a beam do.
And with Lewis: W.C.M Lewis (1924), A system of physical chemistry, "The temperature, in fact, is determined by the average kinetic energy. It is therefore meaningless to speak of the temperature of a single molecule in a gas". Notice he says 'system', thus multiple interacting particles. Did you notice 'average kinetic energy'? And 'in a gas'? It is well known that the particles in a gas (at equilibrium) have the Maxwell-Boltzmann distribution of velocities; therefore, with their different velocities, they have all got different temperatures; temperatures that are changing when the molecules collide. Temperature is the measure of energy in an atom (molecule or degree of freedom). --Damorbel (talk) 18:05, 1 November 2011 (UTC)
- I'm sorry, but you haven't got a clue; it's become clear that it's a waste of time engaging with you, because you appear simply to be incapable of understanding when you haven't got a clue; and evidently you have absolutely no interest in acquiring a clue.
- For the last time, molecules don't have a temperature -- it is distributions of molecules that have a temperature. A Maxwell-Boltzmann distribution of molecular velocities is characterised by a particular temperature. Individual molecules are not. Asserting that the individual molecules in a Maxwell-Boltzmann distribution each have different temperatures because they have different velocities shows you simply don't get it (as the above citations make very clear).
- Now, as SpinningSpark noted above, and as I also put to you on the 15 September, it is not WP's job to teach you physics, nor -- see WP:TALK -- are the talk pages here to try to straighten out your personal misconceptions. If you can find what WP would consider a reliable source that contests this article's proposition that "temperature ... must necessarily be observed at the collective or bulk level", then bring it on. Otherwise this discussion is at an end. Jheald (talk) 19:48, 1 November 2011 (UTC)
Further you wrote "Unless Damorbel is able to produce more reliable sources to contradict this it is pretty much settled by the sources - it can't go in the article, period." Now please explain: if the Boltzmann constant, with a value of 1.380 6488(13) × 10⁻²³ J/K, is not the energy in a single degree of freedom, just what is it? Even in its present form the article says: "The Boltzmann constant (k or kB) is the physical constant relating energy at the individual particle level with temperature". Now if you disagree with this, well and good. But then perhaps you do not agree with the revisions to the fundamental physical constants, including the Boltzmann constant, currently being proposed by the CIPM? Wouldn't it still be a valid contribution to Wikipedia to draw the attention of users to the reasons why these changes are being considered? --Damorbel (talk) 15:56, 1 November 2011 (UTC)
- Enough enough enough. This has filled up the talk page to no end. Let me repeat what I said to you on my talk page, talk pages are for discussing improvements to the article, not discussing the subject itself. Any further posts of this nature will be deleted without comment. See WP:TALK. SpinningSpark 16:31, 1 November 2011 (UTC)
Again, SpinningSpark, you write: "talk pages are for discussing improvements to the article". Yes they are. And do you not think that new ways of determining the Boltzmann constant would improve the article? Such as [1] and this [2] -- do pay attention to the title of the article, it is "Boltzmann constant"! --Damorbel (talk) 17:00, 1 November 2011 (UTC)
Here is a good article on just how the measurement of the Boltzmann constant is being improved using Johnson Noise Thermometry (JNT). --Damorbel (talk) 17:06, 1 November 2011 (UTC)
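For readers following the JNT reference, the principle is the Johnson-Nyquist formula ⟨V²⟩ = 4kBTRΔf: measuring the mean-square noise voltage of a known resistance over a known bandwidth yields kB·T directly. A minimal sketch with illustrative component values (the resistance and bandwidth below are arbitrary choices, not those of any actual JNT experiment):

```python
import math

kB = 1.381e-23   # Boltzmann constant, J/K
T = 300          # temperature, K
R = 1.0e3        # resistance, ohms (illustrative value)
df = 1.0e4       # measurement bandwidth, Hz (illustrative value)

# Johnson-Nyquist noise: <V^2> = 4 kB T R df.  In a JNT experiment the
# measured noise power of a calibrated resistor is inverted to obtain kB*T.
V_rms = math.sqrt(4 * kB * T * R * df)
print(f"RMS noise voltage: {V_rms * 1e9:.0f} nV")
```

The tiny size of the result (a few hundred nanovolts here) is why JNT measurements demand such careful low-noise instrumentation.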