
Investigating the Timescales of Language Processing with EEG and Language Models (2406.19884v2)

Published 28 Jun 2024 in cs.CL and q-bio.NC

Abstract: This study explores the temporal dynamics of language processing by examining the alignment between word representations from a pre-trained transformer-based language model and EEG data. Using a Temporal Response Function (TRF) model, we investigate how neural activity corresponds to model representations across different layers, revealing insights into the interaction between artificial language models and brain responses during language comprehension. Our analysis reveals patterns in TRFs from distinct layers, highlighting varying contributions to lexical and compositional processing. Additionally, we used linear discriminant analysis (LDA) to isolate part-of-speech (POS) representations, offering insights into their influence on neural responses and the underlying mechanisms of syntactic processing. These findings underscore EEG's utility for probing language processing dynamics with high temporal resolution. By bridging artificial language models and neural activity, this study advances our understanding of their interaction at fine timescales.
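The core TRF approach described in the abstract amounts to a regularised linear regression of the EEG signal onto time-lagged copies of a stimulus feature (here, features derived from language-model representations). The paper does not publish its estimation code, so the following is only a minimal sketch of a ridge-regularised TRF fit for a single feature and a single EEG channel; the function names and the choice of closed-form ridge are assumptions, not the authors' implementation.

```python
import numpy as np

def build_lagged_design(features, n_lags):
    """Design matrix whose column j is the feature series delayed by j samples."""
    n = len(features)
    X = np.zeros((n, n_lags))
    for lag in range(n_lags):
        X[lag:, lag] = features[:n - lag]
    return X

def fit_trf(features, eeg, n_lags, alpha=1.0):
    """Ridge-regularised TRF: closed-form solution of (X'X + alpha*I) w = X'y.

    The returned weight vector w is the estimated response kernel,
    one coefficient per time lag.
    """
    X = build_lagged_design(features, n_lags)
    XtX = X.T @ X + alpha * np.eye(n_lags)
    return np.linalg.solve(XtX, X.T @ eeg)
```

In practice, a multivariate version of this regression (many features, many channels, lags spanning a window around word onset) is what tools such as mTRF-style analyses implement; the sketch above only illustrates the single-feature case.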

Authors (2)
  1. Davide Turco
  2. Conor Houghton
