Maximum Entropy Principle, Equal Probability a Priori and Gibbs Paradox

Published 20 May 2011 in cond-mat.stat-mech (arXiv:1105.4118v2)

Abstract: We prove that the information-theoretic maximum entropy (MaxEnt) approach to the canonical ensemble is mathematically equivalent to the classic approach of Boltzmann, Gibbs, and Darwin-Fowler. The two approaches, however, "interpret" the same mathematical theorem differently; most notably, the former constrains the observed mean energy while the latter invokes energy conservation. Applying the same MaxEnt method to the grand canonical ensemble fails, whereas carefully following the classic approach based on Boltzmann's microcanonical {\em equal probability a priori} produces the correct statistics: one does not need to invoke quantum mechanics, and there is no Gibbs paradox. MaxEnt and the related minimum relative entropy principle rest on a mathematical theorem concerning large deviations of rare fluctuations. As a scientific method, MaxEnt requires classical mechanics, or some other assumptions, to provide meaningful {\em prior distributions} for expected-value based statistical inference. A naive assumption of a uniform prior is not valid in statistical mechanics.

Authors (2)