Exploiting Tensor-based Bayesian Learning for Massive Grant-Free Random Access in LEO Satellite Internet of Things

Published 4 Dec 2022 in cs.IT and math.IT | (2212.01733v1)

Abstract: With the rapid development of the Internet of Things (IoT), low earth orbit (LEO) satellite IoT is expected to provide low-power, massive-connectivity, and wide-coverage IoT applications. In this context, this paper presents a massive grant-free random access (GF-RA) scheme for LEO satellite IoT. The scheme requires no changes to the transceiver; instead, it transforms the received signal into a tensor decomposition form. By exploiting the structure of this tensor, a Bayesian learning algorithm is designed for joint active device detection and channel estimation during massive GF-RA. Theoretical analysis shows that the proposed algorithm has fast convergence and low complexity. Finally, extensive simulation results confirm its superior performance over baseline algorithms in terms of the error probability of active device detection and the normalized mean square error of channel estimation in LEO satellite IoT. In particular, the proposed algorithm requires only short preamble sequences and supports massive connectivity at low power, which makes it appealing for LEO satellite IoT.
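The abstract's core idea, recasting the received signal as a low-rank tensor whose factors carry device activity and channel information, can be illustrated with a toy canonical polyadic (CP) decomposition. The sketch below uses plain alternating least squares (ALS) on synthetic noiseless data rather than the paper's tensor-based Bayesian learning algorithm; all dimensions and variable names are hypothetical stand-ins chosen for illustration.

```python
import numpy as np

# Illustrative sketch only (not the paper's algorithm): a synthetic 3-way
# tensor with CP structure, whose rank K plays the role of the number of
# active devices, is factored by plain alternating least squares (ALS).

rng = np.random.default_rng(0)
T, M, Q, K = 8, 6, 5, 3  # e.g. preamble length, antennas, slots, active devices

# Ground-truth CP factors (stand-ins for preambles / channels / data symbols).
A = rng.standard_normal((T, K))
B = rng.standard_normal((M, K))
C = rng.standard_normal((Q, K))

def cp_tensor(A, B, C):
    """Rank-K tensor: Y[t, m, q] = sum_k A[t, k] * B[m, k] * C[q, k]."""
    return np.einsum('tk,mk,qk->tmq', A, B, C)

def khatri_rao(U, V):
    """Column-wise Khatri-Rao product; row index (u, v) with v varying fastest."""
    return np.einsum('ik,jk->ijk', U, V).reshape(U.shape[0] * V.shape[0], -1)

def unfold(Y, mode):
    """Mode-n unfolding consistent with the Khatri-Rao ordering above."""
    return np.moveaxis(Y, mode, 0).reshape(Y.shape[mode], -1)

Y = cp_tensor(A, B, C)

# ALS: start from a random guess and cycle through the three factors;
# each update is a linear least-squares fit against one unfolding of Y,
# e.g. unfold(Y, 0) == Ah @ khatri_rao(Bh, Ch).T at the optimum.
Ah = rng.standard_normal((T, K))
Bh = rng.standard_normal((M, K))
Ch = rng.standard_normal((Q, K))
for _ in range(100):
    Ah = unfold(Y, 0) @ np.linalg.pinv(khatri_rao(Bh, Ch)).T
    Bh = unfold(Y, 1) @ np.linalg.pinv(khatri_rao(Ah, Ch)).T
    Ch = unfold(Y, 2) @ np.linalg.pinv(khatri_rao(Ah, Bh)).T

err = np.linalg.norm(Y - cp_tensor(Ah, Bh, Ch)) / np.linalg.norm(Y)
print(f"relative reconstruction error: {err:.2e}")
```

In the noiseless rank-K setting the ALS iterations drive the reconstruction error toward zero; the paper's Bayesian learning approach additionally handles noise and infers which devices are active.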

Citations (10)
