Efficient Training of Volterra Series-Based Pre-distortion Filter Using Neural Networks (2112.06637v1)
Published 13 Dec 2021 in eess.SP and cs.AI
Abstract: We present a simple, efficient "direct learning" approach to train Volterra series-based digital pre-distortion filters using neural networks. We show its superior performance over conventional training methods using a 64-QAM 64-GBaud simulated transmitter under varying transmitter nonlinearity and noise conditions.
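The sketch below illustrates the general idea of direct-learning pre-distortion referenced in the abstract: a truncated Volterra (memory-polynomial) pre-distorter is treated as a trainable layer, and its coefficients are updated by backpropagating the output error through a differentiable surrogate of the nonlinear transmitter. The memory depth, polynomial order, tanh transmitter model, learning rate, and real-valued signal are all illustrative assumptions, not the paper's actual architecture or experimental setup.

```python
# Minimal direct-learning sketch (assumed setup, not the paper's exact method):
# a real-valued memory-polynomial pre-distorter trained by gradient descent
# through a differentiable stand-in for the nonlinear transmitter.
import jax
import jax.numpy as jnp

MEM, ORDER = 4, 3          # memory taps and number of odd-order terms (assumed)

def predistort(coeffs, x):
    """Apply a memory-polynomial (truncated Volterra) pre-distorter to x."""
    y = jnp.zeros_like(x)
    for m in range(MEM):
        xm = jnp.roll(x, m)                       # delayed copy of the input
        for k in range(ORDER):
            y = y + coeffs[m, k] * xm * jnp.abs(xm) ** (2 * k)
    return y

def transmitter(u):
    """Differentiable surrogate of the saturating transmitter nonlinearity."""
    return jnp.tanh(1.2 * u)

def loss(coeffs, x_ideal):
    """Direct-learning objective: transmitter(DPD(x)) should match x_ideal."""
    return jnp.mean((transmitter(predistort(coeffs, x_ideal)) - x_ideal) ** 2)

key = jax.random.PRNGKey(0)
x = 0.5 * jax.random.normal(key, (4096,))            # surrogate baseband samples
coeffs = jnp.zeros((MEM, ORDER)).at[0, 0].set(1.0)   # start from the identity filter

grad_fn = jax.jit(jax.grad(loss))
for step in range(500):                               # plain gradient descent
    coeffs = coeffs - 0.01 * grad_fn(coeffs, x)

print("final MSE:", float(loss(coeffs, x)))
```

In this framing the "neural network" machinery is simply the automatic differentiation and optimizer used to fit the Volterra coefficients; a practical implementation for a 64-QAM 64-GBaud signal would operate on complex baseband samples and a measured or modeled transmitter response.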