DeepTreeGANv2: Iterative Pooling of Point Clouds (2312.00042v2)
Abstract: In High Energy Physics, detailed and time-consuming simulations are used to model the interactions of particles with detectors. To bypass these simulations with a generative model, large point clouds must be generated in a short time while the complex dependencies between the particles are correctly modelled. Particle showers are inherently tree-based processes, as each particle is produced by the decay or detector interaction of a particle of the previous generation. In this work, we present a significant extension of DeepTreeGAN, featuring a critic that is able to aggregate such point clouds iteratively in a tree-based manner. We show that this model can reproduce complex distributions, and we evaluate its performance on the public JetNet 150 dataset.
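To illustrate the idea of a critic that aggregates a point cloud iteratively in a tree-based manner, the following is a minimal PyTorch sketch, not the actual DeepTreeGANv2 implementation: the class name `TreePoolingCritic`, the fixed branching factor, the mean aggregation, and all layer sizes are illustrative assumptions. At each level, groups of sibling points are pooled into a single parent point until one global vector remains, from which a scalar critic score is computed.

```python
# Minimal sketch (assumptions, not the published DeepTreeGANv2 code) of
# iterative, tree-based pooling of a point cloud in a critic network.
import torch
import torch.nn as nn


class TreePoolingCritic(nn.Module):
    def __init__(self, n_features: int = 3, hidden: int = 32, branching: int = 2):
        super().__init__()
        self.branching = branching
        self.embed = nn.Linear(n_features, hidden)
        # One small MLP applied after every pooling step (shared here for brevity).
        self.update = nn.Sequential(nn.Linear(hidden, hidden), nn.LeakyReLU(0.2))
        self.score = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_points, n_features); for this toy example n_points is
        # assumed to be a power of the branching factor.
        h = self.embed(x)
        while h.shape[1] > 1:
            b, n, d = h.shape
            # Aggregate each group of `branching` sibling points into one parent point.
            h = h.view(b, n // self.branching, self.branching, d).mean(dim=2)
            h = self.update(h)
        return self.score(h.squeeze(1))  # one critic score per point cloud


if __name__ == "__main__":
    critic = TreePoolingCritic()
    clouds = torch.randn(4, 8, 3)  # four toy "jets" with 8 particles each
    print(critic(clouds).shape)    # torch.Size([4, 1])
```

In this sketch the aggregation is a plain mean over a fixed branching structure; a learned, attention-based aggregation over variable-size groups would be a natural refinement, but the iterative reduction of the cloud down the tree is the point being illustrated.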