
Jet Flavour Tagging at FCC-ee with a Transformer-based Neural Network: DeepJetTransformer (2406.08590v3)

Published 12 Jun 2024 in hep-ex and hep-ph

Abstract: Jet flavour tagging is crucial in experimental high-energy physics. A tagging algorithm, DeepJetTransformer, is presented, which exploits a transformer-based neural network that is substantially faster to train. The DeepJetTransformer network uses information from particle-flow-style objects and secondary vertex reconstruction, as is standard for $b$- and $c$-jet identification, supplemented by additional information, such as reconstructed $V^0$s and $K^{\pm}/\pi^{\pm}$ discrimination, typically not included in tagging algorithms at the LHC. The model is trained as a multiclassifier to identify all quark flavours separately and performs excellently in identifying $b$- and $c$-jets. An $s$-tagging efficiency of $40\%$ can be achieved with a $10\%$ $ud$-jet background efficiency. The impact of including $V^0$s and $K^{\pm}/\pi^{\pm}$ discrimination is presented. The network is applied to exclusive $Z \to q\bar{q}$ samples to examine the physics potential and is shown to isolate $Z \to s\bar{s}$ events. Assuming all other backgrounds can be efficiently rejected, a $5\sigma$ discovery significance for $Z \to s\bar{s}$ can be achieved with an integrated luminosity of $60~\text{nb}^{-1}$, corresponding to less than a second of the FCC-ee run plan at the $Z$ resonance.
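The abstract describes a transformer encoder operating on per-particle (particle-flow-style) inputs, augmented with secondary-vertex and particle-identification information, and trained as a multiclassifier over quark flavours. As a rough illustration of that general architecture, and not the authors' actual DeepJetTransformer implementation, the PyTorch sketch below embeds per-constituent features, passes them through a transformer encoder, pools over constituents, and classifies into flavour categories. All feature counts, layer sizes, and the class labels (b, c, s, ud, g) are assumptions made for the example.

```python
# Minimal sketch of a transformer-based jet-flavour multiclassifier.
# NOT the DeepJetTransformer of the paper; dimensions and classes are assumed.
import torch
import torch.nn as nn

class JetFlavourTransformer(nn.Module):
    def __init__(self, n_features=16, d_model=64, n_heads=8, n_layers=4, n_classes=5):
        super().__init__()
        # Embed per-constituent features (e.g. kinematics, impact parameters,
        # particle-ID flags such as K/pi discrimination) into the model dimension.
        self.embed = nn.Linear(n_features, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Classify the pooled jet representation into flavour classes
        # (assumed here: b, c, s, ud, g).
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, constituents, padding_mask):
        # constituents: (batch, n_constituents, n_features)
        # padding_mask: (batch, n_constituents), True where the entry is padding
        x = self.embed(constituents)
        x = self.encoder(x, src_key_padding_mask=padding_mask)
        # Mean-pool over real (non-padded) constituents.
        mask = (~padding_mask).unsqueeze(-1).float()
        pooled = (x * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1.0)
        return self.classifier(pooled)  # logits over flavour classes

if __name__ == "__main__":
    model = JetFlavourTransformer()
    jets = torch.randn(8, 30, 16)               # 8 jets, up to 30 constituents
    pad = torch.zeros(8, 30, dtype=torch.bool)  # no padding in this toy batch
    print(model(jets, pad).shape)               # torch.Size([8, 5])
```

Mean pooling over non-padded constituents is used here only for simplicity; a class-token or attention-pooling scheme would be an equally plausible choice for aggregating per-constituent embeddings into a jet-level representation.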

Citations (1)
