Resolving Extreme Jet Substructure (2202.00723v2)

Published 1 Feb 2022 in hep-ex and hep-ph

Abstract: We study the effectiveness of theoretically-motivated high-level jet observables in the extreme context of jets with a large number of hard sub-jets (up to $N=8$). Previous studies indicate that high-level observables are powerful, interpretable tools to probe jet substructure for $N\le 3$ hard sub-jets, but that deep neural networks trained on low-level jet constituents match or slightly exceed their performance. We extend this work to up to $N=8$ hard sub-jets, using deep particle-flow networks (PFNs) and Transformer-based networks to estimate a loose upper bound on the classification performance. A fully-connected neural network operating on a standard set of high-level jet observables, 135 $N$-subjettiness observables and the jet mass, reaches a classification accuracy of 86.90\%, but falls short of the PFN and Transformer models, which reach classification accuracies of 89.19\% and 91.27\% respectively, suggesting that the constituent-based networks utilize information not captured by the set of high-level observables. We then identify additional high-level observables that are able to narrow this gap, and use LASSO regularization for feature selection to identify and rank the most relevant observables, providing further insight into the learning strategies used by the constituent-based neural networks. The final model contains only 31 high-level observables and is able to match the performance of the PFN and approximate the performance of the Transformer model to within 2\%.
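The high-level feature set is built from $N$-subjettiness observables $\tau_N^{(\beta)}$, which measure how compatible a jet is with having $N$ hard sub-jets. As a point of reference, the sketch below implements the standard $\tau_N^{(\beta)}$ definition, $\tau_N^{(\beta)} = \frac{1}{d_0}\sum_k p_{T,k}\,\min_j(\Delta R_{j,k})^{\beta}$ with $d_0 = \sum_k p_{T,k}\,R_0^{\beta}$. The constituent layout, the externally supplied sub-jet axes, and the jet radius $R_0 = 0.8$ are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def delta_r(eta1, phi1, eta2, phi2):
    """Angular distance in the (eta, phi) plane, wrapping phi into [-pi, pi]."""
    dphi = np.mod(phi1 - phi2 + np.pi, 2 * np.pi) - np.pi
    return np.sqrt((eta1 - eta2) ** 2 + dphi ** 2)

def n_subjettiness(constituents, axes, beta=1.0, r0=0.8):
    """Compute tau_N^(beta) for a single jet.

    constituents: array of shape (n, 3) with columns (pt, eta, phi)  [assumed layout]
    axes:         array of shape (N, 2) with columns (eta, phi) of the N sub-jet axes,
                  obtained from an external axis-finding step (not shown here)
    beta:         angular exponent
    r0:           jet radius used in the normalisation d0 (illustrative value)
    """
    pt, eta, phi = constituents[:, 0], constituents[:, 1], constituents[:, 2]
    # Distance of every constituent to every candidate axis, shape (n, N)
    dr = delta_r(eta[:, None], phi[:, None], axes[None, :, 0], axes[None, :, 1])
    # Each constituent contributes its pT weighted by the distance to its nearest axis
    tau = np.sum(pt * np.min(dr, axis=1) ** beta)
    d0 = np.sum(pt) * r0 ** beta
    return tau / d0
```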

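For the feature-selection step, the abstract mentions LASSO regularization to identify and rank the most relevant observables. A common way to realise this for a classification task is an L1-penalised logistic regression over standardised features: the L1 penalty drives the coefficients of uninformative observables exactly to zero, and the surviving coefficients give a natural ranking. The sketch below uses scikit-learn with hypothetical array names (`X_obs`, `y`) and an arbitrary regularisation strength; it illustrates the general technique, not the authors' exact procedure.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

def lasso_rank_features(X_obs, y, feature_names, C=0.05):
    """Fit an L1-penalised linear classifier and rank features by |coefficient|.

    X_obs:         (n_jets, n_observables) matrix of high-level observables
                   (e.g. N-subjettiness values and jet mass) -- hypothetical placeholder
    y:             integer class labels (e.g. number of hard sub-jets)
    feature_names: list of observable names, one per column of X_obs
    C:             inverse regularisation strength; smaller C selects fewer features
    """
    model = make_pipeline(
        StandardScaler(),
        LogisticRegression(penalty="l1", solver="saga", C=C, max_iter=5000),
    )
    model.fit(X_obs, y)
    # For a multi-class problem, take the largest coefficient magnitude per feature
    coefs = np.abs(model[-1].coef_).max(axis=0)
    order = np.argsort(coefs)[::-1]
    # Return only the features the LASSO penalty did not zero out, ranked by weight
    return [(feature_names[i], coefs[i]) for i in order if coefs[i] > 0]
```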