
Multi-Transmitter Coded Caching with Secure Delivery over Linear Networks -- Extended Version (2211.14672v1)

Published 26 Nov 2022 in cs.IT and math.IT

Abstract: In this paper, we consider multiple cache-enabled end users connected to multiple transmitters through a linear network. We aim to prevent a totally passive eavesdropper, who sniffs the packets during the delivery phase, from obtaining any information about the original files in cache-aided networks. Three different secure centralized multi-transmitter coded caching scenarios, namely secure multi-transmitter coded caching, secure multi-transmitter coded caching with reduced subpacketization, and secure multi-transmitter coded caching with reduced feedback, are considered, and closed-form expressions for the coding delay and the secret shared key storage are provided. As our security guarantee, we show, using the mutual information metric, that the delivery phase reveals no information to the eavesdropper. Moreover, we investigate the secure decentralized multi-transmitter coded caching scenario, in which there is no cooperation between the clients and transmitters during the cache content placement phase, and study its performance relative to the centralized scheme. We analyze the system's performance in terms of the coding delay and guarantee the security of the presented schemes using the mutual information metric. Numerical evaluations verify that security incurs a negligible memory cost as the numbers of files and users are scaled up, in both the centralized and decentralized scenarios. We also show numerically that, as the numbers of files and users increase, the secure coding delays of the centralized and decentralized schemes become asymptotically equal.
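The security guarantee above rests on masking the coded delivery transmissions with secret shared keys stored in the users' caches, so that the eavesdropper's observation is statistically independent of the file contents. As a minimal sketch of that independence argument (a toy one-time-pad model with single-bit subfiles, not the paper's actual multi-transmitter scheme), the Python snippet below computes the mutual information between a keyed XOR multicast and the underlying file bits and confirms it is zero.

```python
from itertools import product
from math import log2
from collections import Counter

# Toy model (assumed for illustration): two 1-bit subfiles a, b wanted
# by two users, and a 1-bit shared key k known to the legitimate users
# but not the eavesdropper. The server multicasts y = a XOR b XOR k.
joint = Counter()
for a, b, k in product((0, 1), repeat=3):  # all triples equally likely
    y = a ^ b ^ k
    joint[((a, b), y)] += 1 / 8

# Marginal distributions of the file bits (a, b) and the observation y.
p_ab, p_y = Counter(), Counter()
for (ab, y), p in joint.items():
    p_ab[ab] += p
    p_y[y] += p

# Mutual information I((A, B); Y) in bits.
mi = sum(p * log2(p / (p_ab[ab] * p_y[y]))
         for (ab, y), p in joint.items() if p > 0)

print(f"I(files; eavesdropped signal) = {mi:.6f} bits")  # -> 0.000000
```

Because the key bit is uniform and independent of the subfiles, the eavesdropped symbol is itself uniformly distributed regardless of the file contents, which is exactly the zero-mutual-information condition the paper establishes for its schemes.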
