
Adapting Large Language Models for Character-based Augmentative and Alternative Communication (2501.10582v2)

Published 17 Jan 2025 in cs.CL and cs.HC

Abstract: Users of Augmentative and Alternative Communication (AAC) may write letter-by-letter via an interface that uses a character language model. However, most state-of-the-art large pretrained language models predict subword tokens of variable length. We investigate how to practically use such models to make accurate and efficient character predictions. We fine-tune models using a large dataset of sentences we curated in which each sentence is rated according to how useful it might be for spoken or written AAC communication. We find that using an algorithm to produce character predictions from a subword language model provides more accurate predictions than adding a classification layer or using a byte-level model. We also find that our domain adaptation procedure is effective at improving model performance on simple, conversational text.
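The core idea of deriving character predictions from a subword model can be illustrated with a simplified marginalization sketch: sum the probability mass of every candidate next token that begins with the same character. This is only a one-step approximation under assumed inputs (a toy token distribution), not the paper's exact algorithm, which must also account for partial-token contexts and longer continuations.

```python
from collections import defaultdict

def next_char_distribution(token_probs):
    """Collapse a subword LM's next-token distribution into a
    next-character distribution by summing the probability of every
    token that starts with the same character.

    token_probs: dict mapping token string -> probability.
    Returns: dict mapping first character -> summed probability.
    """
    char_probs = defaultdict(float)
    for token, prob in token_probs.items():
        if token:  # skip empty tokens
            char_probs[token[0]] += prob
    return dict(char_probs)

# Hypothetical next-token distribution from a subword model.
toy_probs = {"the": 0.4, "to": 0.2, "a": 0.25, "an": 0.15}
print(next_char_distribution(toy_probs))
# mass aggregates by first character: 't' ≈ 0.6, 'a' ≈ 0.4
```

A full implementation would additionally need to handle tokens that extend a partially typed word, which requires re-normalizing over only the tokens consistent with the current prefix.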
