
Semantic Importance-Aware Communications Using Pre-trained Language Models

Published 12 Feb 2023 in eess.SP (arXiv:2302.07142v2)

Abstract: This letter proposes a semantic importance-aware communication (SIAC) scheme using pre-trained language models (e.g., ChatGPT and BERT). Specifically, we propose a cross-layer design in which a pre-trained language model is embedded in, or connected to, a cross-layer manager. The pre-trained language model is used to quantify the semantic importance of data frames, and we investigate semantic importance-aware power allocation based on the quantified importance. Unlike existing deep joint source-channel coding (Deep-JSCC)-based semantic communication schemes, SIAC can be embedded directly into current communication systems by introducing only a cross-layer manager. Our experimental results show that the proposed SIAC scheme achieves lower semantic loss than existing equal-priority communications.
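The core idea, per the abstract, is to weight the transmit power of each data frame by its semantic importance rather than treating all frames equally. A minimal sketch of such an importance-proportional power allocation is given below; it assumes the importance scores have already been produced upstream (in SIAC they would come from the pre-trained language model via the cross-layer manager), and the function name and per-frame power floor are illustrative, not from the paper.

```python
# Illustrative sketch (not the authors' implementation): split a total
# transmit-power budget across data frames in proportion to their
# semantic importance scores, with an optional per-frame power floor.

def allocate_power(importance, total_power, p_min=0.0):
    """Allocate `total_power` across frames proportionally to
    `importance`, guaranteeing each frame at least `p_min`."""
    if any(s < 0 for s in importance):
        raise ValueError("importance scores must be non-negative")
    n = len(importance)
    reserved = p_min * n
    if reserved > total_power:
        raise ValueError("budget too small for the per-frame floor")
    score_sum = sum(importance)
    if score_sum == 0:
        # Degenerate case: no importance information, fall back to
        # the equal-priority baseline the paper compares against.
        return [total_power / n] * n
    spare = total_power - reserved
    return [p_min + spare * s / score_sum for s in importance]

# Example: three frames, the second carrying the most semantic content
# gets the largest share of the 10 W budget.
powers = allocate_power([0.2, 0.7, 0.1], total_power=10.0, p_min=1.0)
print(powers)
```

With equal scores this reduces to the equal-priority baseline, which is what makes the comparison in the paper's experiments meaningful.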

Citations (29)
