Jet Tagging with More-Interaction Particle Transformer (2407.08682v3)

Published 11 Jul 2024 in hep-ph, hep-ex, and physics.data-an

Abstract: In this study, we introduce the More-Interaction Particle Transformer (MIParT), a novel deep learning network designed for jet tagging. The framework incorporates our More-Interaction Attention (MIA) mechanism, which increases the dimensionality of the particle interaction embeddings. We tested MIParT on the top tagging and quark-gluon datasets. Our results show that MIParT not only matches the accuracy and AUC of LorentzNet and a series of Lorentz-equivariant methods, but also significantly outperforms the ParT model in background rejection: by approximately 25% at 30% signal efficiency on the top tagging dataset and by 3% on the quark-gluon dataset. Moreover, MIParT requires only 30% of the parameters and 53% of the computational cost of ParT, demonstrating that high performance can be achieved with reduced model complexity. For very large datasets we double the particle embedding dimension, referring to this variant as MIParT-Large (MIParT-L), which can further capitalize on the knowledge in such datasets. After pre-training on the 100M-jet JetClass dataset and fine-tuning, MIParT-L improves background rejection by 39% on the top tagging dataset and by 6% on the quark-gluon dataset, exceeding the fine-tuned ParT by an additional 2%. These results suggest that MIParT has the potential to advance efficiency benchmarks for jet tagging and event identification in particle physics. The code is available at the following GitHub repository: https://github.com/USST-HEP/MIParT
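
The abstract describes MIA only at a high level: ParT-style attention is biased by embedded pairwise particle-interaction features, and MIA widens that interaction embedding. The sketch below is a rough illustration of this idea, not the authors' implementation (see the GitHub repository for that). The class name `InteractionBiasedAttention`, the layer shapes, the choice of four raw pairwise features, and the use of `torch.nn.MultiheadAttention` are all assumptions made for the example.

```python
# Minimal sketch of interaction-biased attention in the spirit of MIA.
# All names and dimensions are illustrative assumptions, not MIParT's code.
import torch
import torch.nn as nn

class InteractionBiasedAttention(nn.Module):
    def __init__(self, dim=128, num_heads=8, interaction_dim=64):
        super().__init__()
        self.num_heads = num_heads
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Embed raw pairwise features (e.g. ParT's ln(Delta), ln(kT), ...)
        # into a wider interaction space, then project to one bias per head.
        # "More interaction" here means a larger interaction_dim than usual.
        self.interaction_embed = nn.Sequential(
            nn.Conv2d(4, interaction_dim, kernel_size=1),  # 4 assumed raw features
            nn.GELU(),
            nn.Conv2d(interaction_dim, num_heads, kernel_size=1),
        )

    def forward(self, x, pairwise):
        # x:        (batch, n_particles, dim) particle embeddings
        # pairwise: (batch, 4, n_particles, n_particles) raw pairwise features
        b, n, _ = x.shape
        bias = self.interaction_embed(pairwise)        # (b, heads, n, n)
        bias = bias.reshape(b * self.num_heads, n, n)  # per-head additive mask
        out, _ = self.attn(x, x, x, attn_mask=bias)    # bias added to logits
        return out

# Usage: 2 jets of 50 particles with 4 pairwise features per particle pair.
x = torch.randn(2, 50, 128)
pairwise = torch.randn(2, 4, 50, 50)
print(InteractionBiasedAttention()(x, pairwise).shape)  # torch.Size([2, 50, 128])
```

Passing the embedded interactions as a float `attn_mask` adds them directly to the attention logits of every head, which is how pairwise physics features steer the attention pattern in ParT-style models.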

Citations (3)
