
Exploring higher-order neural network node interactions with total correlation (2402.04440v1)

Published 6 Feb 2024 in cs.LG and stat.ML

Abstract: In domains such as ecological systems, collaborations, and the human brain, variables interact in complex ways. Yet accurately characterizing higher-order variable interactions (HOIs) is a difficult problem that is further exacerbated when the HOIs change across the data. To solve this problem we propose a new method, Local Correlation Explanation (Local CorEx), which captures HOIs at a local scale by first clustering data points based on their proximity on the data manifold. Within each cluster, we then use a multivariate generalization of mutual information, the total correlation, to construct a latent factor representation of the data and learn the local HOIs. We use Local CorEx to explore HOIs in synthetic and real-world data and extract hidden insights about the data's structure. Lastly, we demonstrate Local CorEx's suitability for exploring and interpreting the inner workings of trained neural networks.
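
The total correlation mentioned in the abstract is a standard information-theoretic quantity: TC(X) = Σᵢ H(Xᵢ) − H(X₁, …, Xₙ), which is nonnegative and zero exactly when the variables are jointly independent. The sketch below is not the paper's implementation; it is a minimal plug-in estimator for discrete data, using an XOR triple as a toy example of a strictly higher-order interaction (each pair of variables is independent, yet the triple is not).

```python
import numpy as np
from collections import Counter

def entropy(labels):
    """Shannon entropy (in nats) of an empirical discrete distribution."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

def total_correlation(X):
    """Plug-in estimate of TC(X) = sum_i H(X_i) - H(X_1, ..., X_n)
    over the columns of a discrete data matrix X."""
    joint = [tuple(row) for row in X]  # joint outcome per sample
    marginals = sum(entropy(X[:, i].tolist()) for i in range(X.shape[1]))
    return marginals - entropy(joint)

rng = np.random.default_rng(0)
a = rng.integers(0, 2, size=4000)
b = rng.integers(0, 2, size=4000)
X = np.column_stack([a, b, a ^ b])  # third column is XOR of the first two

print(total_correlation(X[:, :2]))  # near 0: a and b are pairwise independent
print(total_correlation(X))         # near ln 2: the XOR dependence is third-order
```

The XOR example is why pairwise measures (Pearson or Spearman correlation, pairwise mutual information) can miss structure that a total-correlation-based method like Local CorEx is designed to detect.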

