LightCode: Light Analytical and Neural Codes for Channels with Feedback (2403.10751v3)

Published 16 Mar 2024 in cs.IT, cs.AI, and math.IT

Abstract: The design of reliable and efficient codes for channels with feedback remains a longstanding challenge in communication theory. While significant improvements have been achieved by leveraging deep learning techniques, neural codes often suffer from high computational costs, a lack of interpretability, and limited practicality in resource-constrained settings. We focus on designing low-complexity coding schemes that are interpretable and better suited to practical communication systems. We advance both analytical and neural codes. First, we demonstrate that PowerBlast, an analytical coding scheme inspired by the Schalkwijk-Kailath (SK) and Gallager-Nakiboğlu (GN) schemes, achieves notable reliability improvements over both SK and GN, outperforming neural codes in high signal-to-noise ratio (SNR) regions. Next, to enhance reliability in low-SNR regions, we propose LightCode, a lightweight neural code that achieves state-of-the-art reliability while using a fraction of the memory and compute of existing deep-learning-based codes. Finally, we systematically analyze the learned codes, establishing connections between LightCode and PowerBlast, identifying the components crucial to performance, and providing interpretations aided by linear regression analysis.
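
For readers unfamiliar with the SK scheme the abstract builds on, the following is a minimal simulation sketch of its core feedback loop, assuming a single real-valued message point theta, an AWGN forward channel, and noiseless feedback. The function name sk_scheme and all parameter names are illustrative; this is a textbook-style rendering of the classical scheme, not the paper's PowerBlast or LightCode implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    def sk_scheme(theta, n_rounds, P=1.0, sigma2=1.0):
        # Round 1: send the message point theta directly over the AWGN channel.
        y = theta + rng.normal(scale=np.sqrt(sigma2))
        theta_hat = y
        err_var = sigma2  # Var(theta_hat - theta) after round 1
        for _ in range(n_rounds - 1):
            # Noiseless feedback: the transmitter knows theta_hat exactly,
            # so it sends the receiver's current estimation error,
            # rescaled to meet the per-symbol power constraint P.
            eps = theta_hat - theta
            x = np.sqrt(P / err_var) * eps
            y = x + rng.normal(scale=np.sqrt(sigma2))
            # MMSE refinement: subtract the best linear estimate of eps from
            # the current estimate.
            gain = np.sqrt(P * err_var) / (P + sigma2)
            theta_hat = theta_hat - gain * y
            # The error variance shrinks by a factor (1 + P/sigma2) per round.
            err_var = err_var * sigma2 / (P + sigma2)
        return theta_hat, err_var

Each round, the transmitter learns the receiver's current estimate through the feedback link and retransmits only the scaled estimation error; the geometric decay of the error variance is what gives the SK scheme its rapidly decaying error probability at high SNR, the regime in which the abstract says PowerBlast improves further.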

References (29)
  1. C. Shannon, “The zero error capacity of a noisy channel,” IRE Transactions on Information Theory, vol. 2, no. 3, pp. 8–19, 1956.
  2. J. Schalkwijk and T. Kailath, “A coding scheme for additive noise channels with feedback–I: No bandwidth constraint,” IEEE Transactions on Information Theory, vol. 12, no. 2, pp. 172–182, 1966.
  3. J. Schalkwijk, “A coding scheme for additive noise channels with feedback–II: Band-limited signals,” IEEE Transactions on Information Theory, vol. 12, no. 2, pp. 183–189, 1966.
  4. R. G. Gallager and B. Nakiboğlu, “Variations on a theme by Schalkwijk and Kailath,” IEEE Transactions on Information Theory, vol. 56, no. 1, pp. 6–17, 2009.
  5. A. Ben-Yishai and O. Shayevitz, “Interactive schemes for the AWGN channel with noisy feedback,” IEEE Transactions on Information Theory, vol. 63, no. 4, pp. 2409–2427, 2017.
  6. Z. Chance and D. J. Love, “Concatenated coding for the AWGN channel with noisy feedback,” IEEE Transactions on Information Theory, vol. 57, no. 10, pp. 6633–6649, 2011.
  7. R. Mishra, D. Vasal, and H. Kim, “Linear coding for AWGN channels with noisy output feedback via dynamic programming,” IEEE Transactions on Information Theory, 2023.
  8. J. M. Ooi and G. W. Wornell, “Fast iterative coding techniques for feedback channels,” IEEE Transactions on Information Theory, vol. 44, no. 7, pp. 2960–2976, 1998.
  9. A. G. Perotti, B. M. Popovic, and A. R. Safavi, “Accumulative iterative codes based on feedback,” arXiv preprint arXiv:2106.07415, 2021.
  10. S. K. Ankireddy, S. A. Hebbar, Y. Jiang, P. Viswanath, and H. Kim, “Compressed error HARQ: Feedback communication on noise-asymmetric channels,” in 2023 IEEE International Symposium on Information Theory (ISIT). IEEE, 2023, pp. 1160–1165.
  11. J. Griffin, P. Yuan, P. Popovski, K. R. Duffy, and M. Médard, “Code at the receiver, decode at the sender: GRAND with feedback,” in 2023 IEEE Information Theory Workshop (ITW). IEEE, 2023, pp. 341–346.
  12. M. Raghu, B. Poole, J. Kleinberg, S. Ganguli, and J. Sohl-Dickstein, “On the expressive power of deep neural networks,” in International Conference on Machine Learning. PMLR, 2017, pp. 2847–2854.
  13. E. Nachmani, Y. Be’ery, and D. Burshtein, “Learning to decode linear codes using deep learning,” in 2016 54th Annual Allerton Conference on Communication, Control, and Computing (Allerton). IEEE, 2016, pp. 341–346.
  14. E. Nachmani, E. Marciano, L. Lugosch, W. J. Gross, D. Burshtein, and Y. Be’ery, “Deep learning methods for improved decoding of linear codes,” IEEE Journal of Selected Topics in Signal Processing, vol. 12, no. 1, pp. 119–131, 2018.
  15. S. K. Ankireddy and H. Kim, “Interpreting neural min-sum decoders,” in ICC 2023 - IEEE International Conference on Communications. IEEE, 2023, pp. 6645–6651.
  16. S. A. Hebbar, R. K. Mishra, S. K. Ankireddy, A. V. Makkuva, H. Kim, and P. Viswanath, “TinyTurbo: Efficient turbo decoders on edge,” in 2022 IEEE International Symposium on Information Theory (ISIT). IEEE, 2022, pp. 2797–2802.
  17. S. A. Hebbar, V. V. Nadkarni, A. V. Makkuva, S. Bhat, S. Oh, and P. Viswanath, “CRISP: Curriculum based sequential neural decoders for polar code family,” in International Conference on Machine Learning. PMLR, 2023, pp. 12823–12845.
  18. S. A. Hebbar, S. K. Ankireddy, H. Kim, S. Oh, and P. Viswanath, “DeepPolar: Inventing nonlinear large-kernel polar codes via deep learning,” arXiv preprint arXiv:2402.08864, 2024.
  19. Y. Li, Z. Chen, G. Liu, Y.-C. Wu, and K.-K. Wong, “Learning to construct nested polar codes: An attention-based set-to-element model,” IEEE Communications Letters, vol. 25, no. 12, pp. 3898–3902, 2021.
  20. S. K. Ankireddy, S. A. Hebbar, H. Wan, J. Cho, and C. Zhang, “Nested construction of polar codes via transformers,” arXiv preprint arXiv:2401.17188, 2024.
  21. H. Kim, Y. Jiang, S. Kannan, S. Oh, and P. Viswanath, “DeepCode: Feedback codes via deep learning,” Advances in Neural Information Processing Systems, vol. 31, 2018.
  22. A. R. Safavi, A. G. Perotti, B. M. Popovic, M. B. Mashhadi, and D. Gunduz, “Deep extended feedback codes,” arXiv preprint arXiv:2105.01365, 2021.
  23. M. B. Mashhadi, D. Gunduz, A. Perotti, and B. Popovic, “DRF codes: Deep SNR-robust feedback codes,” arXiv preprint arXiv:2112.11789, 2021.
  24. J. Kim, T. Kim, D. Love, and C. Brinton, “Robust non-linear feedback coding via power-constrained deep learning,” arXiv preprint arXiv:2304.13178, 2023.
  25. Y. Shao, E. Ozfatura, A. Perotti, B. Popovic, and D. Gündüz, “AttentionCode: Ultra-reliable feedback codes for short-packet communications,” IEEE Transactions on Communications, 2023.
  26. E. Ozfatura, Y. Shao, A. G. Perotti, B. M. Popović, and D. Gündüz, “All you need is feedback: Communication with block attention feedback codes,” IEEE Journal on Selected Areas in Information Theory, vol. 3, no. 3, pp. 587–602, 2022.
  27. P. Elias, “Channel capacity without coding,” in Lectures on Communication System Theory, E. Baghdady, Ed. New York: McGraw Hill, 1961; quarterly progress report, MIT Research Laboratory of Electronics, Oct. 15, 1956.
  28. M. Agrawal, D. J. Love, and V. Balakrishnan, “An iteratively optimized linear coding scheme for correlated Gaussian channels with noisy feedback,” in 2011 49th Annual Allerton Conference on Communication, Control, and Computing (Allerton). IEEE, 2011, pp. 1012–1018.
  29. Y. Bahri, E. Dyer, J. Kaplan, J. Lee, and U. Sharma, “Explaining neural scaling laws,” arXiv preprint arXiv:2102.06701, 2021.