
H-theorem

In thermodynamics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the increase in the entropy of an ideal gas in an irreversible process, obtained by considering the Boltzmann equation.

It appears to predict an irreversible increase in entropy, despite microscopically reversible dynamics. This has led to much discussion.

Boltzmann's H-theorem

The quantity H is defined as the integral over velocity space:


   H \ \stackrel{\mathrm{def}}{=}\  \int P \, (\ln P) \, d^3 v = \left\langle \ln P \right\rangle \qquad (1)

where P(v) is the probability density over molecular velocities. H is a forerunner of Shannon's information entropy.

The article on Shannon's information entropy contains a good explanation of the discrete counterpart of the quantity H, known as the information entropy or information uncertainty (with a minus sign). Extending the discrete information entropy to the continuous information entropy, also called differential entropy, yields the expression in Eq.(1), and thus a better feel for the meaning of H.
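For a concrete feel for Eq.(1), H can be evaluated numerically and checked against a known closed form. The following sketch is a minimal illustration in Python with NumPy; the Maxwell-Boltzmann (Gaussian) velocity distribution, the value of sigma and the integration grid are illustrative choices, not part of the theorem. For an isotropic Gaussian, H is minus the differential entropy, -(3/2) ln(2 pi e sigma^2):

    import numpy as np

    sigma = 1.0  # thermal speed per axis, sqrt(kT/m); illustrative value
    v = np.linspace(-10 * sigma, 10 * sigma, 20001)
    p = np.exp(-v**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

    # P factorizes over the three velocity components, so the 3D integral
    # of P ln P reduces to three copies of the 1D integral of p ln p.
    H_numeric = 3 * np.trapz(p * np.log(p), v)

    # Closed form: minus the differential entropy of a 3D isotropic Gaussian.
    H_exact = -1.5 * np.log(2 * np.pi * np.e * sigma**2)

    print(H_numeric, H_exact)  # both approximately -4.2568 for sigma = 1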

Using the Boltzmann equation one can prove that H can only decrease.

For a system of N statistically independent particles, H is related to the thermodynamic entropy S through:

S \ \stackrel{\mathrm{def}}{=}\  - N k H

so, according to the H-theorem, S can only increase.
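The monotone behaviour can be watched directly in a toy model. In the sketch below, Boltzmann's full collision integral is replaced by the much simpler BGK relaxation ansatz dP/dt = (P_eq - P)/tau (an assumed stand-in chosen for brevity, not Boltzmann's actual collision term); because the equilibrium Maxwellian is chosen with the same energy as the initial state, as collisions would enforce, H falls monotonically:

    import numpy as np

    v = np.linspace(-12.0, 12.0, 6001)
    dv = v[1] - v[0]

    def H_of(P):
        return np.sum(P * np.log(P)) * dv  # discretized integral of P ln P

    # Initial state: two counter-streaming beams (unit-variance peaks at +/-2).
    P = 0.5 * (np.exp(-(v - 2)**2 / 2) + np.exp(-(v + 2)**2 / 2)) / np.sqrt(2 * np.pi)

    # Maxwellian with the same energy (variance) as the initial state.
    var = np.sum(P * v**2) * dv            # = 1 + 2^2 = 5 here
    P_eq = np.exp(-v**2 / (2 * var)) / np.sqrt(2 * np.pi * var)

    tau, dt = 1.0, 0.01                    # relaxation time and step, illustrative
    for step in range(601):
        if step % 100 == 0:
            print(f"t = {step * dt:4.2f}   H = {H_of(P):.4f}")
        P += dt * (P_eq - P) / tau         # BGK: dP/dt = (P_eq - P)/tau

Each printed value of H is smaller than the last, approaching the Maxwellian value from above.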

However, Loschmidt objected that it should not be possible to deduce an irreversible process from time-symmetric dynamics and a time-symmetric formalism: something must be wrong (Loschmidt's paradox). The resolution is that the theorem rests on Boltzmann's assumption of "molecular chaos", i.e., that it is acceptable to treat all the particles as independent and uncorrelated. This assumption itself breaks time reversal symmetry, and therefore begs the question.

Quantum mechanical H-theorem

The following quantum-mechanical analogue of Boltzmann's H-theorem is sometimes given (e.g., Waldram (1985), p. 39).

Starting from the Gibbs definition of thermodynamic entropy,

S = - k \sum_i p_i \ln p_i

differentiating gives

\frac{dS}{dt} =  - k \sum_i \ln p_i \frac{dp_i}{dt}

(using the fact that ∑_i dp_i/dt = 0, since ∑_i p_i = 1).

Now Fermi's golden rule gives a master equation for the probability of quantum jumps from state α to β and from state β to α. For an isolated system the jumps make a contribution ν_{αβ}(p_β - p_α) to dp_α/dt and a contribution ν_{αβ}(p_α - p_β) to dp_β/dt, the micro-reversibility of the dynamics ensuring that the same transition constant ν_{αβ} appears in both expressions.
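Substituting these contributions into the expression for dS/dt above gives

\frac{dS}{dt} = - k \sum_{\alpha\beta} \nu_{\alpha\beta} \, (\ln p_{\alpha})(p_{\beta} - p_{\alpha})

and, since ν_{αβ} = ν_{βα}, the double sum can be symmetrized by averaging it with the same expression with α and β interchanged, which introduces a factor of 1/2.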

Thus

\frac{dS}{dt} =  \frac{1}{2} k \sum_{\alpha\beta} \nu_{\alpha\beta}(\ln p_{\beta}-\ln p_{\alpha})(p_{\beta}- p_{\alpha}).

But ln p is an increasing function of p, so the two brackets always have the same sign, and each contribution to dS/dt cannot be negative.

Therefore

\Delta S \geq 0

for an isolated system.
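The inequality can be checked by integrating the master equation directly for a small system. The sketch below is a minimal illustration, assuming Python with NumPy; the six states, the random (but symmetric) transition constants and the time step are arbitrary choices, and k is set to 1:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 6                                # number of states, illustrative
    nu = rng.uniform(0.0, 1.0, (n, n))
    nu = (nu + nu.T) / 2                 # micro-reversibility: nu_ab = nu_ba
    np.fill_diagonal(nu, 0.0)

    p = rng.uniform(0.0, 1.0, n)
    p /= p.sum()                         # normalized initial probabilities

    def S_of(p):
        return -np.sum(p * np.log(p))    # Gibbs entropy with k = 1

    dt = 0.001
    for step in range(5001):
        if step % 1000 == 0:
            print(f"t = {step * dt:3.1f}   S = {S_of(p):.6f}")
        p = p + dt * (nu @ p - nu.sum(axis=1) * p)  # dp_a/dt = sum_b nu_ab (p_b - p_a)

S rises monotonically toward its maximum ln 6, reached when all six states are equally probable.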

The same mathematics is sometimes also presented for classical systems, considering probability flows between coarse-grained cells in the phase space (e.g., Tolman (1938)).

Critique

Several criticisms can be made of the above "proof", for example by Gull (1989):

  1. It relies on the use of approximate quantum mechanics (Fermi's golden rule), not necessarily valid for large perturbations.
  2. Are the probabilities to be considered as representing N independent systems of 1 particle, or as applying to 1 system of N particles? If the former, the treatment ignores the correlations built up between the particles by collisions, which is precisely where the information loss occurs. The 1-particle entropy also ignores many-body effects in the potential energy, so bears little relation to the entropy of any real gas.
  3. On the other hand, treated properly, an N-particle system has N-particle states. An isolated system will presumably sit in one of its N-particle microstates and make no transitions at all.

Analysis

At the heart of the H-theorem is the replacement of 1-state to 1-state deterministic dynamics by many-state to many-state Markovian mixing, with information lost at each Markovian transition.
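The contrast can be made concrete in a few lines. The sketch below (a hypothetical 4-state system in Python, chosen purely for illustration) evolves the same initial distribution once under a deterministic permutation, which maps each state to exactly one state, and once under a doubly-stochastic Markov step, which mixes several states at each transition:

    import numpy as np

    def S_of(p):
        q = p[p > 0]                     # convention: 0 ln 0 = 0
        return -np.sum(q * np.log(q))

    p0 = np.array([0.7, 0.2, 0.1, 0.0])  # illustrative 4-state distribution

    # Deterministic dynamics: a permutation matrix, one state to one state.
    perm = np.eye(4)[[2, 0, 3, 1]]

    # Markovian mixing: a doubly-stochastic blend of the identity with a
    # cyclic shift, spreading probability over several states per step.
    eps = 0.1
    mix = (1 - eps) * np.eye(4) + eps * np.roll(np.eye(4), 1, axis=1)

    p_det, p_mix = p0.copy(), p0.copy()
    for t in range(20):
        p_det = perm @ p_det             # entropy constant: states merely relabelled
        p_mix = mix @ p_mix              # entropy non-decreasing: information lost

    print(S_of(p0), S_of(p_det), S_of(p_mix))

After twenty steps the permuted distribution has exactly its original entropy, while the mixed distribution's entropy has crept most of the way up to ln 4.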

Gull is correct that, with the powers of Laplace's demon, one could in principle map the ensemble of the original possible states of the N-particle system forward exactly, and lose no information. But this would not be very interesting. Part of the program of statistical mechanics, not least the MaxEnt school of which Gull is an enthusiastic proponent, is to see just how much of the detailed information in the system one can ignore and still correctly predict experimentally reproducible results.

The H-theorem's program of regularly throwing information away, whether by systematically ignoring detailed correlations between particles or between particular sub-systems, or through systematic regular coarse-graining, leads to predictions such as those from the Boltzmann equation for dilute ideal gases or from the recent entropy-production fluctuation theorem, which are useful and reproducibly observable. Such predictions also mean we have learnt something qualitative about the system: which parts of its information are useful for which purposes. That knowledge goes beyond even a full specification of the microscopic dynamical particle trajectories.

(It may be interesting that, having rounded on the H-theorem for not considering the detail of the microscopic dynamics, Gull then chooses to demonstrate the power of the extended-time MaxEnt/Gibbsian method by applying it to a Brownian motion example: a not so dissimilar replacement of detailed deterministic dynamical information by a simplified stochastic/probabilistic summary!)

However, it is an assumption that the H-theorem's coarse-graining is not getting rid of any 'interesting' information. With such an assumption, one moves firmly into the domain of predictive physics: if the assumption goes wrong, it may produce predictions which are systematically and reproducibly wrong.
