Shannon entropy in quasiparticle states of quantum chains (2303.14132v2)

Published 24 Mar 2023 in quant-ph, cond-mat.stat-mech, and hep-th

Abstract: We investigate the Shannon entropy of the total system and its subsystems, as well as the subsystem Shannon mutual information, in quasiparticle excited states of free bosonic and fermionic chains and the ferromagnetic phase of the spin-1/2 XXX chain. For single-particle and double-particle states, we derive various analytical formulas for free bosonic and fermionic chains in the scaling limit. These formulas are also applicable to certain magnon excited states in the XXX chain in the scaling limit. We also calculate numerically the Shannon entropy and mutual information for triple-particle and quadruple-particle states in bosonic, fermionic, and XXX chains. We discover that Shannon entropy, unlike entanglement entropy, typically does not separate for quasiparticles with large momentum differences. Moreover, in the limit of large momentum difference, we obtain universal quantum bosonic and fermionic results that are generally distinct and cannot be explained by a semiclassical picture.
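The quantities discussed in the abstract can be illustrated with a small numerical sketch (this is an illustrative assumption, not the paper's derivation): for a single plane-wave magnon on an N-site chain, the configuration-basis probabilities are uniform, so the total Shannon entropy is log N, and the subsystem Shannon entropies and mutual information follow from marginal distributions. The chain length `N`, subsystem size `ell`, and helper `shannon` below are hypothetical choices made for the sketch.

```python
import numpy as np

def shannon(p):
    """Shannon entropy -sum p*log(p), skipping zero-probability entries."""
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))

# Illustrative setup (not taken from the paper): a single plane-wave
# magnon |k> = (1/sqrt(N)) * sum_j e^{ikj} |j> on an N-site chain.
N = 8
k = 2 * np.pi / N
psi = np.exp(1j * k * np.arange(N)) / np.sqrt(N)

# Total Shannon entropy in the configuration (sigma^z) basis:
# each one-flip configuration has probability |psi_j|^2 = 1/N.
S_total = shannon(np.abs(psi) ** 2)  # equals log(N) for a plane wave

# Subsystem A = first ell sites.  Its marginal distribution has
# ell outcomes "magnon at site j in A" plus one outcome "A all up".
ell = 3
p_A = np.append(np.abs(psi[:ell]) ** 2, np.sum(np.abs(psi[ell:]) ** 2))
p_B = np.append(np.abs(psi[ell:]) ** 2, np.sum(np.abs(psi[:ell]) ** 2))
S_A, S_B = shannon(p_A), shannon(p_B)

# Subsystem Shannon mutual information between A and its complement B.
I_AB = S_A + S_B - S_total
print(S_total, S_A, S_B, I_AB)
```

For this uniform single-magnon state the total entropy comes out exactly log N; multi-particle states require summing over multi-flip configurations, which the paper treats analytically and numerically.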

Citations (1)

Authors (2)
