In [[thermodynamics]], the '''H-theorem''', introduced by [[Boltzmann]] in [[1872]], describes the increase in the [[entropy]] of an [[ideal gas]] in an irreversible process, by considering the [[Boltzmann equation]]. It appears to predict an irreversible increase in entropy despite microscopically reversible dynamics; this apparent contradiction has led to much discussion.

== Boltzmann's H-theorem ==

The quantity ''H'' is defined as the integral over velocity space:

:{| style="width:100%" border="0"
|-
| style="width:95%" | <math> \displaystyle H \ \stackrel{\mathrm{def}}{=}\ \int { P ({\ln P}) d^3 v} = \left\langle { \ln P } \right\rangle </math>
| (1)
|}

where ''P''(''v'') is the probability density of molecular velocities. ''H'' is a forerunner of Shannon's [[information entropy]]. The article on Shannon's [[information entropy]] contains a [[Shannon_entropy#Information_entropy_explained|good explanation]] of the discrete counterpart of the quantity <math>\displaystyle H</math>, known as the information entropy or information uncertainty (with a minus sign). By [[Shannon_entropy#Extending_discrete_entropy_to_the_continuous_case:_differential_entropy|extending the discrete information entropy to the continuous information entropy]], also called [[differential entropy]], one obtains the expression in Eq. (1), and thus a better feel for the meaning of <math>\displaystyle H</math>.

Using the Boltzmann equation one can prove that ''H'' can only decrease. For a system of ''N'' statistically independent particles, ''H'' is related to the thermodynamic entropy ''S'' through:

:<math>S \ \stackrel{\mathrm{def}}{=}\ - N k H</math>

so, according to the H-theorem, ''S'' can only increase.

However, [[Loschmidt]] objected that it should not be possible to deduce an irreversible process from time-symmetric dynamics and a time-symmetric formalism: something must be wrong ([[Loschmidt's paradox]]). The answer is that the theorem rests on Boltzmann's assumption of "[[molecular chaos]]", i.e., that it is acceptable to treat all the particles as independent and uncorrelated. This assumption itself breaks time reversal symmetry and therefore [[begs the question]].

== Quantum mechanical H-theorem ==

The following quantum-mechanical analogue of Boltzmann's H-theorem is sometimes given (e.g., Waldram (1985), p. 39). Starting from the Gibbs definition of thermodynamic entropy,

:<math>S = - k \sum_i p_i \ln p_i \,</math>

differentiating gives

:<math>\frac{dS}{dt} = - k \sum_i \ln p_i \frac{dp_i}{dt}</math>

(using the fact that ''∑ dp<sub>i</sub>/dt = 0'', since ''∑ p<sub>i</sub> = 1''). Now [[Fermi's golden rule]] gives a [[master equation]] for the probability of quantum jumps from state α to β, and from state β to α. For an isolated system the jumps make a contribution ''ν<sub>αβ</sub>(p<sub>β</sub>-p<sub>α</sub>)'' to ''dp<sub>α</sub>/dt'' and a contribution ''ν<sub>αβ</sub>(p<sub>α</sub>-p<sub>β</sub>)'' to ''dp<sub>β</sub>/dt''; the micro-reversibility of the dynamics ensures that the same transition constant ν<sub>αβ</sub> appears in both expressions. Thus

:<math>\frac{dS}{dt} = \frac{1}{2} k \sum_{\alpha\beta} \nu_{\alpha\beta}(\ln p_{\beta}-\ln p_{\alpha})(p_{\beta}- p_{\alpha}).</math>

Since the logarithm is monotonically increasing, the two bracketed factors always have the same sign, so no term in the sum can be negative. Therefore

:<math>\Delta S \geq 0</math>

for an isolated system.
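The argument above can be checked numerically. The following is a minimal sketch, not taken from Waldram or the other references cited here: the number of states, the symmetric rate matrix ν<sub>αβ</sub>, the time step and the initial distribution are all arbitrary illustrative choices. It integrates the golden-rule master equation by a simple Euler step and verifies that the Gibbs entropy never decreases.

<source lang="python">
"""Sketch of the master-equation argument: symmetric transition rates
imply a non-decreasing Gibbs entropy (k is set to 1 throughout)."""
import numpy as np

rng = np.random.default_rng(0)

n = 6                                  # number of microstates (arbitrary)
nu = rng.random((n, n))
nu = 0.5 * (nu + nu.T)                 # micro-reversibility: nu_ab = nu_ba
np.fill_diagonal(nu, 0.0)

p = rng.random(n)
p /= p.sum()                           # normalised initial probabilities

def entropy(p):
    """Gibbs entropy S = -sum_i p_i ln p_i (k = 1)."""
    return -np.sum(p * np.log(p))

dt = 1e-3
S_prev = entropy(p)
for _ in range(20000):
    # master equation: dp_a/dt = sum_b nu_ab (p_b - p_a)
    dp = nu @ p - nu.sum(axis=1) * p
    p = p + dt * dp
    S = entropy(p)
    assert S >= S_prev - 1e-12         # entropy never decreases
    S_prev = S

print(p)                               # tends towards the uniform distribution
print(entropy(p), np.log(n))           # S approaches its maximum, ln n
</source>

In the long-time limit the distribution tends to the uniform one, for which ''S'' takes its maximum value ''k'' ln ''n'', in line with the argument above.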
The same mathematics is sometimes also presented for classical systems, considering probability flows between [[coarse-grained]] cells in the [[phase space]] (e.g., [[Tolman]] (1938)).

=== Critique ===

Several criticisms can be made of the above "proof", for example by Gull (1989):
# It relies on the use of approximate quantum mechanics (Fermi's golden rule), which is not necessarily valid for large perturbations.
# Are the probabilities to be considered as representing ''N'' independent systems of 1 particle, or as applying to 1 system of ''N'' particles? If the former, the treatment ignores the inter-particle correlations that build up between the systems after collisions, and it is this neglect that accounts for the information loss. The 1-particle entropy also ignores many-body effects in the potential energy, so it bears little relation to the entropy of any real gas.
# On the other hand, treated properly, an ''N''-particle system has ''N''-particle states. An isolated system will presumably sit in one of its ''N''-particle microstates and make no transitions at all.

== Analysis ==
{{Original research|section|date=June 2008}}
At the heart of the H-theorem is the replacement of ''1-state to 1-state'' deterministic dynamics by ''many-state to many-state'' [[Markov process|Markovian]] mixing, with information lost at each Markovian transition (a numerical illustration is sketched at the end of this section). Gull is correct that, with the powers of [[Laplace's demon]], one could in principle map the ensemble of the original possible states of the ''N''-particle system forward exactly, and lose no information. But this would not be very interesting. Part of the program of statistical mechanics, not least the [[Maximum entropy thermodynamics|MaxEnt school]] of which Gull is an enthusiastic proponent, is to see just how much of the detailed information in the system one can ignore and yet still correctly predict experimentally reproducible results.

The H-theorem's program of regularly throwing information away, whether by systematically ignoring detailed correlations between particles or between particular sub-systems, or through systematic regular coarse-graining, leads to predictions such as those from the [[Boltzmann equation]] for dilute ideal gases or from the recent entropy-production [[fluctuation theorem]], which are useful and reproducibly observable. They also mean that we have ''learnt'' something qualitative about the system: which parts of its information are useful for which purposes, knowledge that goes beyond even the full specification of the microscopic dynamical particle trajectories. (It may be interesting that, having rounded on the H-theorem for not considering the detail of the microscopic dynamics, Gull then chooses to demonstrate the power of the extended-time MaxEnt/Gibbsian method by applying it to a Brownian motion example: a not so dissimilar replacement of detailed deterministic dynamical information by a simplified stochastic/probabilistic summary!)

However, it is an ''assumption'' that the H-theorem's coarse-graining is not getting rid of any 'interesting' information. With such an assumption, one moves firmly into the domain of ''predictive'' physics: if the assumption goes wrong, it may produce predictions which are systematically and reproducibly wrong.
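The information loss under coarse-graining discussed above can likewise be illustrated numerically. The following minimal sketch is not taken from Tolman or Gull; the fine-grained distribution and the cell size are arbitrary illustrative choices. It replaces a fine-grained probability distribution by its average within each coarse cell and checks that the Gibbs entropy does not decrease, which follows from the concavity of −''x'' ln ''x''.

<source lang="python">
"""Sketch: coarse-graining (averaging within cells) never lowers the
Gibbs entropy of a probability distribution (k = 1 throughout)."""
import numpy as np

rng = np.random.default_rng(1)

def gibbs_entropy(p):
    """S = -sum_i p_i ln p_i, ignoring empty cells."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

fine = rng.random(1200)
fine /= fine.sum()                     # fine-grained distribution over 1200 cells

cell = 10                              # merge every 10 fine cells into one coarse cell
coarse = fine.reshape(-1, cell).mean(axis=1)
coarse_spread = np.repeat(coarse, cell)  # uniform within each coarse cell, same total probability

# Smoothing within cells cannot lower the entropy:
assert gibbs_entropy(coarse_spread) >= gibbs_entropy(fine) - 1e-12
print(gibbs_entropy(fine), gibbs_entropy(coarse_spread))
</source>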
== See also ==
* [[Loschmidt's paradox]]
* [[Arrow of time]]
* [[Second Law of Thermodynamics]]
* [[Fluctuation theorem]]

== References ==
{{Nofootnotes|date=February 2008}}
* {{cite book | author=Lifshitz, E. M.; and Pitaevskii, L. P. | title=Physical kinetics | year=1981 | location=London | publisher=Pergamon | id=ISBN 0-08-026480-8, ISBN 0-7506-2635-6}} Vol. 10 of the Course of Theoretical Physics (3rd ed.).
* {{cite book | author=Waldram, J. R. | title=The theory of thermodynamics | location=Cambridge | publisher=University Press | year=1985 | id=ISBN 0-521-28796-0}}
* {{cite book | author=Tolman, Richard C. | title=The principles of statistical mechanics | location=Oxford | publisher=Clarendon Press | year=1938}}; (1979) New York: Dover, ISBN 0-486-63896-0.
* Gull, S. F. (1989) [http://www.ucl.ac.uk/~ucesjph/reality/entropy/text.html Some misconceptions about entropy], in {{cite book | editor=B. Buck, V. A. Macaulay (Eds.) | title=Maximum Entropy in Action | publisher=Oxford University Press | date=1991 | id=ISBN 0-19-853963-0}}.

[[Category:Non-equilibrium thermodynamics]]
[[Category:Thermodynamic entropy]]
[[Category:Philosophy of thermal and statistical physics]]
[[Category:Physics theorems]]
[[Category:Fundamental physics concepts]]
[[Category:Statistical theorems]]

[[ja:H定理]]
[[fr:Théorème_H]]