
Projective cluster-additive transformation for quantum lattice models (2303.04774v1)

Published 8 Mar 2023 in cond-mat.str-el

Abstract: We construct a projection-based cluster-additive transformation that block-diagonalizes wide classes of lattice Hamiltonians $\mathcal{H}=\mathcal{H}_0 + V$. Its cluster additivity is an essential ingredient for setting up perturbative or non-perturbative linked-cluster expansions for degenerate excitation subspaces of $\mathcal{H}_0$. Our transformation generalizes the minimal transformation known, among others, under the names Takahashi's transformation, Schrieffer-Wolff transformation, des Cloizeaux effective Hamiltonian, canonical van Vleck effective Hamiltonian, and two-block orthogonalization method. The effective cluster-additive Hamiltonian and the transformation for a given subspace of $\mathcal{H}$ that is adiabatically connected to the eigenspace of $\mathcal{H}_0$ with eigenvalue $e_0^n$ depend solely on the eigenspaces of $\mathcal{H}$ connected to $e_0^m$ with $e_0^m \leq e_0^n$. In contrast, other cluster-additive transformations, such as the multi-block orthogonalization method or perturbative continuous unitary transformations, require a larger basis. This can be exploited to implement the transformation efficiently, both perturbatively and non-perturbatively. As a benchmark, we perform perturbative and non-perturbative linked-cluster expansions in the low-field ordered phase of the transverse-field Ising model on the square lattice for single spin flips and two-spin-flip bound states.
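To make the "minimal transformation" that the paper generalizes concrete, the sketch below implements its standard des Cloizeaux-style form numerically for a small transverse-field Ising chain. This is an illustrative sketch, not the authors' cluster-additive construction: the 4-site open-chain geometry, the function names (`tfim`, `effective_hamiltonian`), and the overlap-based selection of the adiabatically connected exact states are assumptions made here for demonstration. It builds $\mathcal{H} = \mathcal{H}_0 + V$, identifies the exact eigenstates connected to the degenerate ground space of $\mathcal{H}_0$, and maps them back onto that space via Löwdin orthonormalization of the overlap matrix.

```python
# Minimal numerical sketch (assumed, not the paper's code) of the
# des Cloizeaux / Takahashi effective Hamiltonian for H = H0 + V.
import numpy as np
from functools import reduce

# Pauli matrices
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.array([[1.0, 0.0], [0.0, -1.0]])
id2 = np.eye(2)

def op_on_site(op, site, n):
    """Embed a single-site operator at `site` in an n-site chain."""
    ops = [id2] * n
    ops[site] = op
    return reduce(np.kron, ops)

def tfim(n, h):
    """Open transverse-field Ising chain:
    H0 = -sum_i sz_i sz_{i+1} (Ising part),  V = -h sum_i sx_i (field)."""
    H0 = sum(-op_on_site(sz, i, n) @ op_on_site(sz, i + 1, n)
             for i in range(n - 1))
    V = sum(-h * op_on_site(sx, i, n) for i in range(n))
    return H0, V

def effective_hamiltonian(H0, V, tol=1e-9):
    """Effective Hamiltonian for the block of H = H0 + V adiabatically
    connected to the (degenerate) ground space of H0."""
    e0, U0 = np.linalg.eigh(H0)
    n_block = int(np.sum(np.abs(e0 - e0[0]) < tol))  # ground-space degeneracy
    P0 = U0[:, :n_block]                             # unperturbed basis

    E, U = np.linalg.eigh(H0 + V)
    # Select the n_block exact states with the largest weight in the P0 space
    # (a proxy for adiabatic connection at small h).
    weights = np.sum(np.abs(P0.conj().T @ U) ** 2, axis=0)
    sel = np.sort(np.argsort(weights)[-n_block:])
    Usel, Esel = U[:, sel], E[sel]

    S = P0.conj().T @ Usel                           # overlap matrix
    # Loewdin orthonormalization: B = S (S^dag S)^{-1/2}, so B is unitary.
    w, v = np.linalg.eigh(S.conj().T @ S)
    B = S @ (v @ np.diag(w ** -0.5) @ v.conj().T)
    # Hermitian effective Hamiltonian with the exact eigenvalues Esel.
    return B @ np.diag(Esel) @ B.conj().T

H0, V = tfim(n=4, h=0.1)
Heff = effective_hamiltonian(H0, V)
print(np.round(Heff, 6))          # 2x2 block for the two ordered ground states
print(np.linalg.eigvalsh(Heff))   # matches the two lowest exact eigenvalues
```

By construction, `Heff` is Hermitian, acts only on the two-fold degenerate ordered ground space of $\mathcal{H}_0$, and reproduces the corresponding exact eigenvalues of $\mathcal{H}$. The paper's contribution is a projection-based variant of this construction that is additionally cluster-additive, which is what makes linked-cluster expansions for degenerate excitation subspaces possible.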
