
How machine learning relates to scientific understanding

Determine how machine learning models trained to predict physical quantities can relate and contribute to scientific understanding, and distinguish such contributions from explanations of the models' internal behavior.


Background

The review distinguishes between explaining the behavior of machine learning systems (a focus of XAI) and achieving scientific understanding of physical phenomena through the concepts learned by those systems. It emphasizes that current explainability techniques may not align with the forms of explanation relevant for science and points to a conceptual gap about the role ML can play in scientific understanding.

This open question frames a foundational issue for interpretable ML in physics: under what conditions, and in what ways, can learned representations or concepts be considered scientifically explanatory or understanding-enhancing, beyond conferring mere predictive capability?

References

The distinction lies in how scientific understanding is approached and (the open question of) how ML can ultimately relate to it (Sec.~\ref{subsec:understanding_and_ML}).

Interpretable Machine Learning in Physics: A Review (arXiv:2503.23616, Wetzel et al., 30 Mar 2025), Section 3, Philosophical perspectives