On Entropy Minimization and Convergence
Abstract: We examine the minimization of information entropy for measures on the phase space of bounded domains, subject to constraints that are averages of grand canonical distributions. We describe the set of all such constraints and show that it equals the set of averages of all probability measures absolutely continuous with respect to the standard measure on the phase space (with the exception of the measure concentrated on the empty configuration). We also investigate how the set of constraints relates to the domain of the microcanonical thermodynamic limit entropy. We then show that, for fixed constraints, the parameters of the corresponding grand canonical distribution converge, as the volume increases, to the corresponding parameters (derivatives, when they exist) of the thermodynamic limit entropy. The results hold when the energy is the sum of the kinetic energy and any stable, tempered interaction potential that satisfies the Gibbs variational principle (e.g., Lennard-Jones). The same tools, together with the strict convexity of the thermodynamic limit pressure for continuous systems (valid whenever the Gibbs variational principle holds), give a solid foundation to the folklore local homeomorphism between thermodynamic and macroscopic quantities.
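The mechanism behind the abstract is the classical variational characterization of the grand canonical distribution: minimizing the information entropy \(\int f \log f\) under average-energy and average-particle-number constraints. The following display is a standard sketch of that problem under this sign convention; the notation (\(f\), \(H\), \(N\), \(\beta\), \(\mu\), \(\Xi\), \(\nu\)) is illustrative and not taken from the paper.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Standard entropy-minimization characterization of the grand canonical
% distribution (illustrative notation, not the paper's): among probability
% densities $f$ on phase space with prescribed mean energy $e$ and mean
% particle number $\rho$, minimize the information entropy $\int f\log f$.
\[
  \min_{f \ge 0,\ \int f\,d\nu = 1}\ \int f \log f \, d\nu
  \quad \text{subject to} \quad
  \int H f \, d\nu = e, \qquad \int N f \, d\nu = \rho .
\]
% The minimizer is the grand canonical distribution, with Lagrange
% multipliers $\beta$ (inverse temperature) and $\mu$ (chemical potential)
% chosen so that the constraints $(e,\rho)$ are satisfied:
\[
  f_{\beta,\mu} = \frac{e^{-\beta\,(H - \mu N)}}{\Xi(\beta,\mu)},
  \qquad
  \Xi(\beta,\mu) = \int e^{-\beta\,(H - \mu N)} \, d\nu .
\]
\end{document}
```

In these terms, the paper's convergence result reads: for fixed constraints \((e, \rho)\), the finite-volume multipliers \((\beta, \mu)\) converge, as the volume increases, to the corresponding derivatives of the thermodynamic limit entropy, when those derivatives exist.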