Thermodynamic Entropy as Information: A compression-based demonstration of the Shannon-Boltzmann equivalence in condensed matter
Abstract: We demonstrate that Shannon's information entropy and the thermodynamic entropy of Boltzmann and Gibbs are quantitatively equivalent for real condensed-matter systems. By interpreting atomic configurations as information sources, we compute entropy directly from the compressibility of molecular-dynamics trajectories, without physical partitioning or empirical modeling. A custom lossy-compression algorithm measures the minimum number of bits required to describe a microstate at finite precision, and this bit count maps exactly to thermodynamic entropy through the Shannon-Boltzmann relation. The method reproduces benchmark entropies for metals, semiconductors, oxides, and refractory ceramics in both solid and liquid phases, establishing information as the fundamental quantity underlying thermodynamic disorder. This equivalence unifies information theory and statistical mechanics, providing a general and computationally efficient framework for determining entropies and free energies directly from atomic data.
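The abstract's pipeline (quantize a microstate to finite precision, compress it, convert the bit count to entropy via the Shannon-Boltzmann relation S = k_B ln 2 × bits) can be sketched as follows. This is a hedged illustration, not the paper's method: the authors use a custom lossy-compression algorithm, whereas this stand-in combines uniform coordinate quantization with the generic lossless compressor zlib, which only upper-bounds the minimum description length. The function name `entropy_estimate` and the precision parameter are illustrative assumptions.

```python
# Illustrative sketch only: the paper uses a custom lossy compressor;
# here, uniform quantization + zlib stands in to upper-bound the
# minimum number of bits needed to describe one microstate.
import zlib
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_estimate(positions, precision=1e-3):
    """Rough configurational-entropy estimate (J/K) of one microstate.

    positions : (N, 3) array of atomic coordinates
    precision : lossy quantization step, in the same units as positions
    """
    # Lossy step: snap coordinates to a grid of spacing `precision`.
    quantized = np.round(positions / precision).astype(np.int32)
    # Lossless step: the compressed size approximates the bit count.
    n_bits = 8 * len(zlib.compress(quantized.tobytes(), level=9))
    # Shannon-Boltzmann relation: S = k_B * ln(2) * (number of bits).
    return K_B * np.log(2) * n_bits

rng = np.random.default_rng(0)
liquid_like = rng.uniform(0.0, 10.0, size=(256, 3))        # disordered
solid_like = np.indices((8, 8, 4)).reshape(3, -1).T * 1.0  # ordered lattice

# A disordered configuration compresses poorly, so it needs more bits
# and therefore maps to a higher entropy than the ordered lattice.
print(entropy_estimate(liquid_like) > entropy_estimate(solid_like))
```

The qualitative ordering (liquid entropy above solid entropy) is all this sketch captures; quantitative agreement with benchmark entropies, as reported in the abstract, depends on the paper's purpose-built compressor.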