Semantic Importance-Aware Communications Using Pre-trained Language Models (2302.07142v2)
Abstract: This letter proposes a semantic importance-aware communication (SIAC) scheme using pre-trained language models (e.g., ChatGPT and BERT). Specifically, we propose a cross-layer design in which a pre-trained language model is embedded in, or connected to, the cross-layer manager. The pre-trained language model is used to quantify the semantic importance of data frames. Based on the quantified semantic importance, we investigate semantic importance-aware power allocation. Unlike existing deep joint source-channel coding (Deep-JSCC)-based semantic communication schemes, SIAC can be directly embedded into current communication systems by introducing only a cross-layer manager. Our experimental results show that the proposed SIAC scheme achieves lower semantic loss than existing equal-priority communication schemes.
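To make the pipeline concrete, below is a minimal sketch of the two steps the abstract describes: scoring semantic importance with a pre-trained language model, then allocating transmit power in proportion to those scores. The scoring rule used here (masked-token prediction loss from `bert-base-uncased` via Hugging Face Transformers) and the proportional allocation are illustrative assumptions, not necessarily the paper's exact formulation.

```python
# Sketch: semantic importance scoring with a pre-trained masked language
# model, followed by importance-weighted power allocation.
# Assumption: importance of a token = MLM loss when that token is masked
# (tokens the model cannot recover from context carry more semantic weight).
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def semantic_importance(sentence: str) -> torch.Tensor:
    """Score each token by the cross-entropy of predicting it when masked."""
    enc = tokenizer(sentence, return_tensors="pt")
    ids = enc["input_ids"][0]
    scores = torch.zeros(len(ids))
    with torch.no_grad():
        for i in range(1, len(ids) - 1):            # skip [CLS] and [SEP]
            masked = ids.clone()
            masked[i] = tokenizer.mask_token_id
            logits = model(masked.unsqueeze(0)).logits[0, i]
            log_probs = torch.log_softmax(logits, dim=-1)
            scores[i] = -log_probs[ids[i]]          # loss for the true token
    return scores

def allocate_power(scores: torch.Tensor, total_power: float) -> torch.Tensor:
    """Split a fixed power budget across frames proportionally to importance
    (an illustrative allocation rule, not the paper's optimized scheme)."""
    weights = scores / scores.sum()
    return total_power * weights

sentence = "The patient requires insulin immediately."
scores = semantic_importance(sentence)
power = allocate_power(scores, total_power=1.0)
tokens = tokenizer.convert_ids_to_tokens(tokenizer(sentence)["input_ids"])
for tok, s, p in zip(tokens, scores, power):
    print(f"{tok:>12s}  importance={s:.3f}  power={p:.3f}")
```

In this sketch, semantically critical tokens (e.g., "insulin") tend to receive higher importance scores and therefore a larger share of the power budget, which mirrors the cross-layer idea in the abstract: the physical layer protects the frames that matter most to the meaning of the message.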