PyLIT: Reformulation and implementation of the analytic continuation problem using kernel representation methods (2505.10211v1)

Published 15 May 2025 in physics.comp-ph and physics.plasm-ph

Abstract: Path integral Monte Carlo (PIMC) simulations are a cornerstone for studying quantum many-body systems. The analytic continuation (AC) needed to estimate dynamic quantities from these simulations is an inverse Laplace transform, which is ill-conditioned. If this inversion were surmounted, dynamical observables (e.g., the dynamic structure factor (DSF) $S(q,\omega)$) could be extracted from the imaginary-time correlation function estimates. Despite its importance, the AC problem remains challenging due to its ill-posedness. To address this challenge, we express the DSF as a linear combination of kernel functions with known Laplace transforms that have been tailored to satisfy its physical constraints. We use least-squares optimization regularized with a Bayesian prior to determine the coefficients of this linear combination. We explore various regularization terms, including the commonly used entropic regularizer, the Wasserstein distance, and the $L_2$ distance, as well as techniques for setting the regularization weight. A key outcome is the open-source package PyLIT (\textbf{Py}thon \textbf{L}aplace \textbf{I}nverse \textbf{T}ransform), which leverages Numba and unifies the presented formulations. PyLIT's core functionality is kernel construction and optimization. In our applications, we find PyLIT's DSF estimates share qualitative features with other, more established methods. We identify three key findings. Firstly, independent of the regularization choice, using non-uniform grid-point distributions reduced the number of unknowns and thus the space of possible solutions. Secondly, the Wasserstein distance, a previously unexplored regularizer, performs as well as the entropic regularizer while benefiting from its linear gradient. Thirdly, future work can meaningfully combine regularized and stochastic optimization. (text truncated for character limit)
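The workflow described in the abstract (expand the DSF in kernels with known Laplace transforms, build the corresponding design matrix on the imaginary-time grid, and fit the coefficients by regularized least squares) can be sketched in a few lines of NumPy/SciPy. The sketch below is not PyLIT's API: the Gaussian kernel shapes, grids, synthetic data, and regularization weight are illustrative assumptions, and only a simple $L_2$ (ridge) regularizer with a non-negativity constraint is shown, not the entropic or Wasserstein variants discussed in the paper.

```python
# Minimal sketch (not PyLIT's API): recover S(q, w) from an imaginary-time
# correlation function F(q, tau) by expanding S in Gaussian kernels whose
# Laplace transforms form the design matrix, then solving an L2-regularized
# non-negative least-squares problem. All numbers below are assumptions.

import numpy as np
from scipy.optimize import nnls

beta = 1.0                                  # inverse temperature (assumed)
tau = np.linspace(0.0, beta, 51)            # imaginary-time grid
omega = np.linspace(0.0, 15.0, 400)         # frequency grid for the result

# Non-uniform kernel centers, denser at low frequency, in the spirit of the
# paper's finding that non-uniform grids reduce the number of unknowns.
centers = np.geomspace(0.1, 15.0, 40)
sigma = 0.35 * centers                      # kernel widths (assumption)

def gaussian(w, mu, s):
    return np.exp(-0.5 * ((w - mu) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

# Laplace transform of each kernel, K_j(tau) = int dw phi_j(w) exp(-tau w),
# evaluated by simple quadrature here (PyLIT uses tailored kernels).
w_quad = np.linspace(0.0, 30.0, 3000)
dw = w_quad[1] - w_quad[0]
K = np.array([
    [np.sum(gaussian(w_quad, mu, s) * np.exp(-t * w_quad)) * dw
     for mu, s in zip(centers, sigma)]
    for t in tau
])                                          # shape: (len(tau), n_kernels)

# Synthetic stand-in for PIMC data: one sharp feature at w0, plus noise.
w0 = 3.0
F = np.exp(-tau * w0) + 1e-4 * np.random.default_rng(0).normal(size=tau.size)

# Ridge regularization via an augmented non-negative least-squares solve:
# minimize ||K c - F||^2 + lam * ||c||^2   subject to   c >= 0.
lam = 1e-3
A = np.vstack([K, np.sqrt(lam) * np.eye(len(centers))])
b = np.concatenate([F, np.zeros(len(centers))])
coeffs, _ = nnls(A, b)

# Reconstructed DSF on the frequency grid.
S = sum(c * gaussian(omega, mu, s) for c, mu, s in zip(coeffs, centers, sigma))
print("peak of reconstructed S(w) near:", omega[np.argmax(S)])
```

Stacking $\sqrt{\lambda}\,I$ beneath the design matrix is a standard trick that turns the ridge-regularized problem into a plain non-negative least-squares solve; the entropic and Wasserstein regularizers explored in the paper require iterative optimizers instead and are outside this sketch.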
