
Efficiency of fermionic quantum distillation (1707.01792v3)

Published 6 Jul 2017 in cond-mat.str-el and quant-ph

Abstract: We present a time-dependent density-matrix renormalization group investigation of the quantum distillation process within the Fermi--Hubbard model on a quasi-1D ladder geometry. The term distillation refers to the dynamical, spatial separation of singlons and doublons during the sudden expansion of interacting particles in an optical lattice, i.e., the release of a cloud of atoms from a trapping potential. Remarkably, quantum distillation can lead to a contraction of the doublon cloud, resulting in an increased doublon density in the core region compared to the initial state. As a main result, we show that this phenomenon is not limited to the chains studied previously. Interestingly, additional dynamical processes on the two-leg ladder, such as density oscillations and self-trapping of defects, make the distillation less efficient. An investigation of the time evolution starting from product states, which are often used as the initial setup in optical-lattice experiments, provides an explanation for this behaviour. We propose configurations that lead to fast and efficient quantum distillation.
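The abstract names the Fermi--Hubbard model on a two-leg ladder but does not spell out the Hamiltonian. For orientation, a standard textbook form on a ladder reads as follows; the hopping amplitudes t_parallel (along the legs), t_perp (along the rungs), and the on-site interaction U are the conventional symbols, not parameter choices taken from the paper:

```latex
% Standard Fermi--Hubbard Hamiltonian on a two-leg ladder (textbook form;
% the paper's exact conventions are not given in the abstract).
% Legs are labeled \ell = 1, 2; i runs over rungs; \sigma is the spin index.
H = -t_\parallel \sum_{\ell=1,2}\,\sum_{i,\sigma}
      \left( c^\dagger_{i,\ell,\sigma}\, c^{\phantom{\dagger}}_{i+1,\ell,\sigma} + \mathrm{h.c.} \right)
    - t_\perp \sum_{i,\sigma}
      \left( c^\dagger_{i,1,\sigma}\, c^{\phantom{\dagger}}_{i,2,\sigma} + \mathrm{h.c.} \right)
    + U \sum_{i,\ell} n_{i,\ell,\uparrow}\, n_{i,\ell,\downarrow}
```

In this language, a "doublon" is a doubly occupied site (n_up = n_down = 1) and a "singlon" a singly occupied one; for large U, doublons are long-lived and can spatially separate from singlons during the expansion, which is the distillation effect the abstract describes.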

