
Induced Generative Adversarial Particle Transformers (2312.04757v1)

Published 8 Dec 2023 in hep-ex, cs.LG, and physics.data-an

Abstract: In high energy physics (HEP), machine learning methods have emerged as an effective way to accurately simulate particle collisions at the Large Hadron Collider (LHC). The message-passing generative adversarial network (MPGAN) was the first model to simulate collisions as point, or "particle", clouds, with state-of-the-art results, but suffered from quadratic time complexity. Recently, generative adversarial particle transformers (GAPTs) were introduced to address this drawback; however, results did not surpass MPGAN. We introduce induced GAPT (iGAPT) which, by integrating "induced particle-attention blocks" and conditioning on global jet attributes, not only offers linear time complexity but is also able to capture intricate jet substructure, surpassing MPGAN in many metrics. Our experiments demonstrate the potential of iGAPT to simulate complex HEP data accurately and efficiently.
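The key efficiency claim in the abstract is that induced attention replaces the O(n²) pairwise interactions of full self-attention over n particles with two passes through a small, fixed set of m inducing points, giving O(n·m) cost. As a rough illustration of that idea (a minimal NumPy sketch in the spirit of the Set Transformer's induced set attention, not the authors' actual iGAPT architecture; the function names and shapes here are illustrative assumptions):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Scaled dot-product attention: (nq, d), (nk, d), (nk, d) -> (nq, d)
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

def induced_attention(particles, inducing):
    """One induced-attention pass over a particle cloud (illustrative).

    A small, fixed set of m inducing points first attends to all n
    particles (cost O(n*m)), then each particle attends back to the
    m-point summary (again O(n*m)) -- linear in n, unlike the O(n^2)
    cost of full particle-to-particle self-attention.
    """
    summary = attention(inducing, particles, particles)  # (m, d)
    return attention(particles, summary, summary)        # (n, d)

rng = np.random.default_rng(0)
n, m, d = 30, 4, 8                   # n particles, m inducing points
cloud = rng.normal(size=(n, d))      # toy particle features
anchors = rng.normal(size=(m, d))    # inducing points (learned in practice)
out = induced_attention(cloud, anchors)
print(out.shape)  # (30, 8): one updated feature vector per particle
```

Doubling n here doubles the work, whereas full self-attention would quadruple it; the real model would additionally use learned projections, multiple heads, and conditioning on global jet attributes.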
