L2T-Hyena: Enhancing State-Space Models with an Adaptive Learn-to-Teach Framework (2511.05926v1)

Published 8 Nov 2025 in cs.IT and math.IT

Abstract: State-Space Models (SSMs) have emerged as efficient alternatives to computationally intensive architectures like Transformers, particularly for sequence modeling. However, a fundamental challenge in their training is the reliance on static loss functions, which may not be optimal across all learning stages. To address this issue, this paper proposes a hybrid model that integrates the Hyena architecture with a Dynamic Loss Network (DLN), guided by a Learn-to-Teach (L2T) approach (L2T-DLN). In this framework, the Hyena model acts as a student whose loss function is optimized adaptively. A teacher model, leveraging a memory of the student's past performance, guides the DLN in dynamically balancing the primary cross-entropy loss and a regularization term. Experiments on the Penn Treebank (PTB) dataset show that our approach significantly improves language modeling performance: the proposed model achieves a validation perplexity of 102.6, a notable improvement over the 110.4 achieved by a baseline Hyena model using a static loss function. This research indicates that combining SSMs with adaptive loss functions markedly enhances the quality and efficiency of deep learning models for sequential data, showing potential for applications in NLP, time-series analysis, and biological signal processing.
