Learning to Demodulate from Few Pilots via Offline and Online Meta-Learning (1908.09049v3)

Published 23 Aug 2019 in eess.SP, cs.IT, and math.IT

Abstract: This paper considers an Internet-of-Things (IoT) scenario in which devices sporadically transmit short packets with few pilot symbols over a fading channel. Devices are characterized by unique transmission non-idealities, such as I/Q imbalance. The number of pilots is generally insufficient to obtain an accurate estimate of the end-to-end channel, which includes the effects of fading and of the transmission-side distortion. This paper proposes to tackle this problem by using meta-learning. Accordingly, pilots from previous IoT transmissions are used as meta-training data in order to train a demodulator that is able to quickly adapt to new end-to-end channel conditions from few pilots. Various state-of-the-art meta-learning schemes are adapted to the problem at hand and evaluated, including Model-Agnostic Meta-Learning (MAML), First-Order MAML (FOMAML), REPTILE, and fast Context Adaptation VIA meta-learning (CAVIA). Both offline and online solutions are developed. In the latter case, an integrated online meta-learning and adaptive pilot number selection scheme is proposed. Numerical results validate the advantages of meta-learning as compared to training schemes that either do not leverage prior transmissions or apply a standard joint learning algorithm to previously received data.
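The few-pilot adaptation idea can be illustrated with a short sketch. The code below shows a REPTILE-style meta-update and a first-order inner-loop adaptation for a small neural demodulator; the network architecture, constellation size, learning rates, and the PyTorch-based setup are illustrative assumptions, not the paper's exact configuration.

```python
# Illustrative sketch (assumed setup, not the paper's exact model):
# meta-learn an initialization from previous devices' pilots, then adapt
# to a new device with a few gradient steps on its short pilot sequence.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class Demodulator(nn.Module):
    """Maps a received sample (real, imag) to logits over constellation points."""
    def __init__(self, num_symbols=16, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, num_symbols),
        )

    def forward(self, x):
        return self.net(x)

def adapt(meta_model, pilots_rx, pilot_labels, inner_lr=0.1, steps=2):
    """Inner loop: clone the meta-learned initialization and take a few
    SGD steps on the new device's pilot symbols (first-order adaptation)."""
    model = copy.deepcopy(meta_model)
    opt = torch.optim.SGD(model.parameters(), lr=inner_lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(model(pilots_rx), pilot_labels)
        loss.backward()
        opt.step()
    return model

def reptile_meta_step(meta_model, tasks, inner_lr=0.1, meta_lr=0.01, steps=2):
    """Outer loop (REPTILE-style): adapt to each previously observed device,
    then move the shared initialization toward the adapted weights."""
    for pilots_rx, pilot_labels in tasks:
        adapted = adapt(meta_model, pilots_rx, pilot_labels, inner_lr, steps)
        with torch.no_grad():
            for p_meta, p_adapted in zip(meta_model.parameters(), adapted.parameters()):
                p_meta += meta_lr * (p_adapted - p_meta)

# Usage: `tasks` is a list of (pilots_rx, pilot_labels) pairs, one per past device,
# with pilots_rx of shape (num_pilots, 2) and integer labels in [0, num_symbols).
```

After meta-training, the demodulator for a new device is obtained by calling `adapt` on its handful of pilots; this few-shot adaptation step is where the expected advantage over joint training or training from scratch appears.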

Citations (85)
