Gap to the Optimal n-State Engine in Asymptotic Work Extraction

Determine how close the information engines learned via thermodynamic machine learning—specifically, maximum-work training constrained to n predictive states—are to the optimal n-state information engines for the same input processes, measured by the asymptotic average work rate on test data.

Background

The paper derives an exact expression for the asymptotic work rate of an information engine and demonstrates training and testing performance under memory constraints. In discussing empirical results (e.g., Fig. 4) the authors note that larger memories can approach thermodynamic limits on long training sequences, but they explicitly acknowledge uncertainty about how close these learned engines are to the best possible n-state engines.

This uncertainty concerns the optimality gap between the learned maximum-work model (within the class of n-state predictive models) and the true optimum n-state engine with respect to asymptotic work extraction on data from the true process.
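One way to state this gap precisely is the following; the notation here is illustrative and not taken from the paper:

```latex
% Optimality gap for n-state information engines (illustrative notation).
% \Pi_n      : the class of information engines with n predictive states
% \hat{\pi}_n: the engine found by maximum-work (thermodynamic) training
% \langle W \rangle_\pi : asymptotic average work rate of engine \pi
%                          on data from the true input process
\Delta_n \;=\; \max_{\pi \in \Pi_n} \langle W \rangle_{\pi}
          \;-\; \langle W \rangle_{\hat{\pi}_n}
```

The open question asks how small $\Delta_n$ is in practice, i.e., how much of the work rate achievable by the best $n$-state engine the learned engine actually recovers on test data.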

References

While it is unclear how close to the optimal $n$-state engine these results are, we see that thermodynamic learning discovers enough of the hidden temporal structure to harvest much of the available free energy.

Thermodynamic Overfitting and Generalization: Energetic Limits on Predictive Complexity (2402.16995 - Boyd et al., 26 Feb 2024) in Section: Asymptotic Work Harvesting and Overfitting (discussion following Fig. 4)