
A new class of variance reduction techniques using lattice symmetries (1208.4349v1)

Published 21 Aug 2012 in hep-lat

Abstract: We present a general class of unbiased improved estimators for physical observables in lattice gauge theory computations which significantly reduces statistical errors at modest computational cost. The error reduction techniques, referred to as covariant approximation averaging, utilize approximations which are covariant under lattice symmetry transformations. Compared to the traditional method at fixed statistical error, we observe cost reductions of 16 times for the nucleon mass at $M_\pi\sim 330$ MeV (domain-wall quarks) and 2.6-20 times for the hadronic vacuum polarization at $M_\pi\sim 480$ MeV (Asqtad quarks). These cost reductions should improve with decreasing quark mass and increasing lattice size.
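The structure of the estimator described in the abstract can be illustrated with a toy sketch: a cheap approximation of the observable is averaged over lattice symmetry transformations (here, translations of the source point), and unbiasedness is restored by adding the difference between the exact and approximate measurement at a single source. All names, the toy observable, and the ensemble below are illustrative assumptions, not the paper's actual lattice setup.

```python
import math
import random

random.seed(1)

L = 32       # toy lattice size; translations play the role of the symmetry group
N_CFG = 400  # number of configurations in the toy ensemble

def exact_obs(cfg, t):
    # stand-in for the "expensive" exact observable at source point t
    return math.tanh(cfg[t])

def approx_obs(cfg, t):
    # cheap approximation (truncated Taylor series of tanh); it is
    # covariant: evaluating on a translated cfg equals shifting the source t
    x = cfg[t]
    return x - x**3 / 3.0

def caa_estimate(cfg):
    """Covariant approximation averaging (toy sketch): average the cheap
    approximation over all translations, then add an unbiased correction
    from one exact measurement. The correction has zero mean shift because
    the ensemble is translation invariant."""
    appx_avg = sum(approx_obs(cfg, t) for t in range(L)) / L
    rest = exact_obs(cfg, 0) - approx_obs(cfg, 0)
    return appx_avg + rest

# toy "gauge configurations": independent Gaussian site values
cfgs = [[random.gauss(0.0, 0.5) for _ in range(L)] for _ in range(N_CFG)]

naive = [exact_obs(c, 0) for c in cfgs]  # traditional: one exact solve per config
caa = [caa_estimate(c) for c in cfgs]    # improved estimator, same exact-solve count
```

Because the approximation is strongly correlated with the exact observable, the per-configuration variance of `caa` is far below that of `naive` while the expectation value is unchanged, which is the mechanism behind the quoted cost reductions.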

