Semantic Compression with Information Lattice Learning (2404.03131v1)

Published 4 Apr 2024 in cs.IT and math.IT

Abstract: Data-driven AI techniques are becoming prominent for learning in support of data compression, but have focused on standard problems such as text compression. To instead address the emerging problem of semantic compression, we argue that the lattice theory of information is particularly expressive and mathematically precise in capturing notions of abstraction as a form of lossy semantic compression. As such, we demonstrate that a novel AI technique called information lattice learning, originally developed for knowledge discovery and creativity, is powerful for learning to compress in a semantically meaningful way. The lattice structure further implies the optimality of group codes and the successive refinement property for progressive transmission.
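
To make the abstraction-as-compression idea concrete, here is a minimal Python sketch (our illustration, not code from the paper). In Shannon's lattice theory of information, partitions of a source alphabet form a lattice ordered by refinement; moving to a coarser partition discards semantic detail and lowers the entropy rate needed to describe the source. The toy pitch alphabet and the pitch-class coarsening rule below are hypothetical examples, loosely inspired by the music domain where information lattice learning was first applied.

```python
# Minimal sketch: abstraction as lossy semantic compression via
# partition coarsening on the lattice of partitions (assumption-laden
# illustration, not the paper's implementation).
from collections import Counter
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

def partition_entropy(samples, cells):
    """Entropy of the abstraction induced by a partition.

    `cells` maps each symbol to its partition cell; a coarser partition
    merges symbols that are treated as semantically equivalent.
    """
    counts = Counter(cells[s] for s in samples)
    n = len(samples)
    return entropy([c / n for c in counts.values()])

# Hypothetical toy source: musical pitches.
samples = ["C4", "E4", "G4", "C5", "E5", "G4", "C4", "E4"]

# Finest partition: every symbol is its own cell (lossless description).
fine = {s: s for s in set(samples)}

# Coarser abstraction (hypothetical rule): keep only the pitch class,
# dropping the octave digit, so C4 and C5 map to the same concept "C".
coarse = {s: s[:-1] for s in set(samples)}

print(f"fine rate:   {partition_entropy(samples, fine):.3f} bits/symbol")
print(f"coarse rate: {partition_entropy(samples, coarse):.3f} bits/symbol")
```

Running this prints roughly 2.250 bits/symbol for the fine partition and 1.561 for the coarse one: the abstraction compresses by discarding semantic detail rather than bits of a fixed representation, and because the coarse partition refines to the fine one on the lattice, a fine-level description can be layered on top of the coarse one, mirroring successive refinement for progressive transmission.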
