
Bond-weighting method for the Grassmann tensor renormalization group (2208.03227v2)

Published 5 Aug 2022 in hep-lat

Abstract: Recently, a tensor network description with bond weights on its edges has been proposed as a novel improvement to the tensor renormalization group (TRG) algorithm. The bond weight is controlled by a single hyperparameter, whose optimal value was estimated in the original work through numerical computation of the two-dimensional critical Ising model. We extend this bond-weighted TRG algorithm to make it applicable to fermionic systems, benchmarking it with the two-dimensional massless Wilson fermion. We show that the accuracy at fixed bond dimension is improved in the fermionic system as well, and we provide numerical evidence that the optimal choice of the hyperparameter is unaffected by whether the system is bosonic or fermionic. In addition, by monitoring the singular value spectrum, we find that the scale-invariant structure of the renormalized Grassmann tensor is successfully preserved by the bond-weighting technique.
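The central idea described in the abstract — splitting the singular values of a truncated SVD between a bond weight and the adjacent tensors, controlled by a single hyperparameter — can be sketched as follows. This is a minimal illustration of the bond-weighting step, not the paper's implementation; the function name, the exponent convention (S^k on the bond, S^{(1-k)/2} on each isometry), and the example value of k are assumptions for demonstration.

```python
import numpy as np

def bond_weighted_split(A, k, chi):
    """Truncated SVD of matrix A that distributes the singular values S
    as S**k onto a bond weight W and S**((1-k)/2) onto each isometry.
    With k = 0 this reduces to the ordinary TRG splitting. (Illustrative
    sketch; conventions assumed, not taken from the paper.)"""
    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    U, S, Vt = U[:, :chi], S[:chi], Vt[:chi, :]  # keep chi singular values
    half = S ** ((1.0 - k) / 2.0)  # factor absorbed into both isometries
    W = S ** k                     # bond weight placed on the new edge
    return U * half, W, half[:, None] * Vt

# Without truncation, contracting the pieces recovers A, because
# S**((1-k)/2) * S**k * S**((1-k)/2) = S for any k.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 8))
U, W, Vt = bond_weighted_split(A, k=-0.5, chi=8)
print(np.allclose(U @ np.diag(W) @ Vt, A))  # True
```

The single hyperparameter k thus only redistributes spectral weight between the edges and the tensors; the exact (untruncated) contraction is unchanged, while the truncation error at fixed bond dimension depends on the choice of k.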

Authors (1)