Talk:Introduction to entropy: Difference between revisions

Collecting opinions: pages from Cercignani
Line 336:
There is at least one objection above to leading off with the statistical-mechanical description. There is also at least one insistence that motion be mentioned in the first paragraph. If this article is to be about entropy in general — including the popular concept and information entropy — then it's inappropriate to lead off with a purely thermodynamic account. The statistical-mechanical account applies to both thermodynamics and information, and it explains how the physical concept led to the popular concept. That it "explains but does not define" thermodynamic entropy is a subtlety that seriously gums up the mission of getting entropy across to the English major or the financial analyst. I urge certain editors, for the nth time, to get over it in deference to that mission. If they absolutely cannot, perhaps the distinction can be addressed in the body of the article. [[User:Jordgette|'''<span style="color:black">-Jord</span><span style="color:darkred">gette</span>''']] [[User talk:Jordgette|<small>[talk]</small>]] 15:22, 5 December 2020 (UTC)
 
:The words “subtlety that seriously gums up the mission” and “in deference to that mission” remind me that some time ago, Editor Jordgette put his cards on the table thus: “Ironically your argument supports starting this article by describing entropy as a measure of disorder, which is far and away the most intuitive way to describe what it is.” I don't know if he still thinks or intuits so. Such a respectable source as [[Edwin Thompson Jaynes]] had an alternative view when he wrote “Glib, unqualified statements to the effect that "entropy measures randomness" are in my opinion totally meaningless, and present a serious barrier to any real understanding of these problems.” I guess intuition and understanding are different.
 
:Assuming that the article is to be re-named to become about entropy in general, it isn't evident to me precisely whither Editor Jordgette's collection of opinions just above leads. In particular, it doesn't mention the most general conception of entropy, that of the mathematical theory of dynamical systems, which was relegated to the archive just a little while ago. The comments, such as [https://en.wikipedia.org/w/index.php?title=Talk%3AIntroduction_to_thermodynamic_entropy&type=revision&diff=988127796&oldid=988127466 this one], of our IP-only mathematician friend, [https://en.wikipedia.org/wiki/Special:Contributions/67.198.37.16 67.198.37.16], would return to relevance if the article were re-named to become about entropy in general. [[User:Chjoaygame|Chjoaygame]] ([[User talk:Chjoaygame|talk]]) 00:35, 6 December 2020 (UTC)
Line 560:
 
::::though that fact is not widely celebrated. (Very tied up right now, please give me till tomorrow to find page number.)[[User:Chjoaygame|Chjoaygame]] ([[User talk:Chjoaygame|talk]]) 19:37, 17 December 2020 (UTC)
 
::::Quoting from Cercignani, p. 8, quoting Boltzmann: "... erroneous to believe that the mechanical theory of heat is therefore afflicted with some uncertainty because the principles of probability theory are used. ... It is only doubly imperative to handle the conclusions with the greatest strictness."
 
::::Quoting from Cercignani, p. 8, commenting on Boltzmann: "But he also seemed to think that he had obtained a result which, except for these fluctuations, followed from the equations of mechanics without exception."
 
::::I don't recall us mentioning the origin of the 'disorder' doctrine. Perhaps this quote from Cercignani, p. 18, may help, though I think it isn't enough to settle the matter: "In 1877 he published his paper “Probabilistic foundations of heat theory”, in which he formulated what Einstein later called the ''Boltzmann principle''; the interpretation of the concept of entropy as a mathematically well-defined measure of what one can call the "disorder" of atoms, which had already appeared in his work of 1872, is here extended and becomes a general statement."
 
::::Coming to the present point. On page 18, Cercignani writes "In the same year [1877] he also wrote a fundamental paper, generally unknown to the majority of physicists, who by reading only second-hand reports are led to the erroneous belief that Boltzmann dealt only with ideal gases; this paper clearly indicates that he considered mutually interacting molecules as well, with non-negligible potential energy, and thus, as we shall see in Chapter 7, it is he and not Josiah Willard Gibbs (1839-1903) who should be considered as the founder of equilibrium statistical mechanics and of the method of ensembles."
 
::::On page 55, Cercignani quotes Boltzmann: "The assumption that the gas-molecules are aggregates of material points, in the sense of Boscovich, does not agree with facts." Boltzmann knew, from spectroscopy, that atoms must be intricately complex objects.
 
::::On page 64, Cercignani writes: "Thermodynamics ... can be regarded as a limitation of our ability to act on the mechanics of the minutest particles of a body ..." I think this is a wise statement. My reason is that it does not appeal to probability. Maxwell's demon has the ability that we lack.
 
::::On page 82, Cercignani writes about Carnot: "Essentially he saw that there was something that was conserved in reversible processes; this was not heat or caloric, however, but what was later called entropy." This might help us talk about the relation between (ir)reversibility and entropy, a matter that Editor Chetvorno has raised. The characteristic of entropy is that it increases as a result of an irreversible thermodynamic process. The pure mode of entropy generation is heat transfer. In contrast, ideally pure work transfer generates no entropy. (At the risk of being accused of some crime, I may say that work is like heat transfer from a body at infinite temperature: <math>\delta Q/T \to 0</math> as <math>T \to \infty</math>; such a transfer can heat any body in the surroundings. To extract all the internal energy of a body as work, we would need a heat reservoir at zero temperature.)
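
::::To spell out that parenthetical remark with the standard Carnot bound (nothing beyond textbook thermodynamics, and not a quotation from Cercignani): a reversible engine running between reservoirs at <math>T_h</math> and <math>T_c</math> has efficiency
::::<math>\eta = \frac{W}{Q_h} = 1 - \frac{T_c}{T_h},</math>
::::so as <math>T_h \to \infty</math> the entropy carried by the heat drawn, <math>Q_h/T_h</math>, tends to zero and the transfer behaves like pure work; while complete conversion of the drawn heat into work, <math>\eta = 1</math>, requires <math>T_c = 0</math>.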
 
::::On page 83, Cercignani writes: "The first attempts at explaining the Second Law on the grounds of kinetic theory are due to Rankine [22, 23]." Rankine (I forget the exact date, but about 1849 or 1850) used a quantity that he called "the thermodynamic function", later called 'entropy' by Clausius.
 
::::On pages 83–84, Cercignani writes: "Boltzmann himself makes his first appearance in the field with a paper [25] [1866] in which he tries to prove the Second Law starting from purely mechanical theorems, under the rather restrictive assumption that the molecular motions are periodic, with period <math>\tau</math>, and the awkward remark, which might perhaps be justified, that “if the orbits do not close after a finite time, one may think that they do in an infinite one”. Essentially, Boltzmann remarks that temperature may be thought of as the time average of kinetic energy, while heat can be equated to the average increase in kinetic energy; if we compute the unspecified period from one of the relations and substitute the result into the other, it turns out that the heat divided by the temperature is an exact differential. This part of the paper appears to be a rather primitive justification of the first part of the Second Law; as for the second part, Boltzmann's argument belongs more to pure thermodynamics than to statistical mechanics and leads to the conclusion that entropy must increase in an irreversible process." I guess that <math>\tau</math> is the Poincaré recurrence time.
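
::::As I read that passage, the calculation Cercignani describes runs roughly as follows (my own rough reconstruction, not his text): for motion periodic with period <math>\tau</math>, take temperature proportional to the time-averaged kinetic energy and heat as the increase of that average,
::::<math>T \propto \overline{E}_{\mathrm{kin}} = \frac{1}{\tau}\int_0^\tau E_{\mathrm{kin}}\,dt, \qquad \delta Q \propto d\overline{E}_{\mathrm{kin}},</math>
::::whence <math>\delta Q/T \propto d\overline{E}_{\mathrm{kin}}/\overline{E}_{\mathrm{kin}} = d\ln\overline{E}_{\mathrm{kin}}</math> is an exact differential, as the first part of the Second Law requires.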
 
::::My automatic tldr alarm is ringing loudly and I will stop for now.[[User:Chjoaygame|Chjoaygame]] ([[User talk:Chjoaygame|talk]]) 16:04, 18 December 2020 (UTC)
 
== Problems with the "Heat and entropy" section ==