
Systematic exploration of model-size scaling in brain-to-image decoding

Investigate how model size affects decoding performance by systematically scaling the parameter count and depth of the deep learning brain modules used to predict DINOv2-giant image embeddings from EEG, MEG, 3T fMRI, and 7T fMRI recordings. Determining how performance depends on model size under fixed data regimes would characterize joint data–model-size scaling laws.
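A minimal sketch of what such a model-size sweep could look like, assuming a simple MLP stand-in for the brain module (the paper's actual M/EEG and fMRI architectures are more elaborate). `BrainModule`, the 4096-feature input size, and the width/depth grid are all hypothetical illustrations, not the paper's setup:

```python
import torch
import torch.nn as nn

class BrainModule(nn.Module):
    """Hypothetical stand-in for a brain decoding module.

    Maps flattened brain features (e.g., EEG/MEG sensor-time windows or
    fMRI voxel patterns) to a DINOv2-giant image embedding (1536-d).
    Width and depth are the capacity knobs a model-size study would sweep.
    """

    def __init__(self, n_inputs: int, width: int, depth: int, emb_dim: int = 1536):
        super().__init__()
        layers = [nn.Linear(n_inputs, width), nn.GELU()]
        for _ in range(depth - 1):
            layers += [nn.Linear(width, width), nn.GELU()]
        layers.append(nn.Linear(width, emb_dim))
        self.net = nn.Sequential(*layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def param_count(m: nn.Module) -> int:
    return sum(p.numel() for p in m.parameters())

# Sweep capacity on a roughly logarithmic grid under a fixed data budget;
# each configuration would then be trained and scored on image retrieval.
for width in (256, 512, 1024, 2048):
    for depth in (2, 4, 8):
        model = BrainModule(n_inputs=4096, width=width, depth=depth)
        print(f"width={width:4d} depth={depth} params={param_count(model):>12,}")
```

Holding the training set fixed while varying only width and depth is what isolates the model-size axis of the scaling law.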


Background

The paper establishes log-linear scaling laws for image decoding performance with respect to data quantity across EEG, MEG, 3T fMRI, and 7T fMRI. While data quantity per subject is identified as the dominant driver of performance, the authors note that scaling-law analyses typically also consider data size in relation to model size. Their benchmark uses two state-of-the-art deep learning architectures tailored to M/EEG and fMRI, respectively, but does not systematically vary architecture size beyond selected configurations.

Consequently, the role of model size in optimizing brain-to-image decoding remains unresolved. A systematic study of increasingly large architectures would clarify compute-optimal regimes, potential diminishing returns, and whether larger models yield disproportionate gains for noisier modalities (EEG/MEG) versus higher-SNR modalities (fMRI), thereby informing future dataset and model design.
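To make the scaling-law framing concrete, here is a minimal sketch of fitting a log-linear data-scaling curve and where a model-size term would enter. The accuracy numbers are fabricated for illustration only and are not results from the paper:

```python
import numpy as np

# Fabricated, purely illustrative numbers: decoding accuracy as a function
# of per-subject training-set size N for one modality (not the paper's data).
N   = np.array([500, 1000, 2000, 4000, 8000, 16000])
acc = np.array([0.12, 0.18, 0.25, 0.31, 0.38, 0.44])

# Log-linear law in data size: acc ~= a * log10(N) + b.
# A joint data/model-size law would add a parameter-count term, e.g.
# acc ~= a * log10(N) + c * log10(P) + b, fit over an (N, P) grid.
a, b = np.polyfit(np.log10(N), acc, deg=1)
print(f"gain per decade of data: {a:.3f}, intercept: {b:.3f}")

# Extrapolation beyond the fitted range should be treated with caution.
print(f"predicted accuracy at N = 64k: {a * np.log10(64000) + b:.3f}")
```

Fitting the joint form over a grid of data sizes and parameter counts is what would reveal compute-optimal trade-offs analogous to those reported for language models.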

References

Finally, the study of scaling laws often considers the impact of data size in relation to the size of the model. The systematic exploration of increasingly large architectures remains an open question~\citep{kaplan2020scaling,hoffmann2024training}.

Scaling laws for decoding images from brain activity (2501.15322 - Banville et al., 25 Jan 2025) in Discussion, Contributions (Scaling laws)