Point Cloud Transformers applied to Collider Physics
Published 9 Feb 2021 in physics.data-an, cs.LG, and hep-ex | arXiv:2102.05073v2
Abstract: Methods for processing point cloud information have seen great success in collider physics applications. One recent breakthrough in machine learning is the use of Transformer networks to learn semantic relationships between sequences in language processing. In this work, we apply a modified Transformer network called the Point Cloud Transformer to bring the advantages of the Transformer architecture to the unordered set of particles produced in collision events. To compare its performance with other strategies, we study jet-tagging applications for highly boosted particles.
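
The central idea is to treat each jet as an unordered set of particle feature vectors and apply self-attention without assuming any particle ordering. The following PyTorch sketch is not the authors' implementation; it illustrates an offset-attention-style block and a simple set-pooled classifier, where the per-particle feature count, layer sizes, depth, and pooling choice are illustrative assumptions.

```python
# Minimal sketch (assumptions noted above), not the paper's reference code.
import torch
import torch.nn as nn


class OffsetAttention(nn.Module):
    """Offset-attention-style block, simplified from the Point Cloud
    Transformer idea: the output is built from the difference between the
    input features and their attention-weighted aggregation, keeping the
    operation permutation-equivariant over the particle set."""

    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim // 4, bias=False)
        self.k = nn.Linear(dim, dim // 4, bias=False)
        self.v = nn.Linear(dim, dim, bias=False)
        self.post = nn.Sequential(nn.Linear(dim, dim), nn.BatchNorm1d(dim), nn.ReLU())

    def forward(self, x):                       # x: (batch, n_particles, dim)
        attn = torch.softmax(self.q(x) @ self.k(x).transpose(1, 2), dim=-1)
        agg = attn @ self.v(x)                  # attention-weighted aggregation
        offset = x - agg                        # "offset" between input and aggregation
        b, n, d = offset.shape
        out = self.post(offset.reshape(b * n, d)).reshape(b, n, d)
        return x + out                          # residual connection


class JetTagger(nn.Module):
    """Per-particle embedding -> stacked attention blocks -> set pooling -> logits."""

    def __init__(self, in_feats: int = 7, dim: int = 128, n_classes: int = 2):
        super().__init__()
        self.embed = nn.Sequential(nn.Linear(in_feats, dim), nn.ReLU())
        self.blocks = nn.ModuleList([OffsetAttention(dim) for _ in range(3)])
        self.head = nn.Linear(dim, n_classes)

    def forward(self, particles):               # particles: (batch, n_particles, in_feats)
        h = self.embed(particles)
        for blk in self.blocks:
            h = blk(h)
        return self.head(h.mean(dim=1))         # mean-pool over the unordered set


# Example: a batch of 4 jets, each with 100 particles carrying 7 features
# (e.g. kinematic variables such as pT, eta, phi -- an assumed layout).
logits = JetTagger()(torch.randn(4, 100, 7))
```

Because attention and mean pooling are both symmetric under permutations of the particles, the tagger's output does not depend on the order in which the constituents are listed, which is the property that makes this architecture a natural fit for collision-event point clouds.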