Adapting xLSTM’s internal query mechanism for Mem-RPE
Investigate whether modifying the xLSTM architecture so that its internal query mechanism is conditioned directly on the goal image improves performance on the Mem-RPE task, compared with using xLSTM as a plug-and-play sequence model.
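As a rough sketch of what such a modification could look like, the snippet below implements a simplified mLSTM-style recurrent cell in which the key/value write path still comes from the current observation, while the retrieval query is additionally conditioned on a goal-image embedding. Everything here is illustrative: the class name, dimensions, sigmoid (rather than exponential, stabilized) gating, and the particular way the goal is injected are assumptions for this sketch, not the Kinaema authors' design or the xLSTM reference implementation.

```python
# Minimal, self-contained sketch (PyTorch): a simplified mLSTM-style cell
# whose memory read-out query is conditioned on a goal-image embedding.
# All names (GoalQueryMLSTMCell, d_obs, d_goal, d_mem) are hypothetical.

import torch
import torch.nn as nn


class GoalQueryMLSTMCell(nn.Module):
    """Simplified mLSTM-style cell with a goal-conditioned query.

    A standard mLSTM derives query, key, and value from the current input.
    Here the key/value (what gets written to memory) still come from the
    observation, while the query (what gets read back) is computed from the
    observation concatenated with a goal embedding, so retrieval is steered
    toward goal-relevant content.
    """

    def __init__(self, d_obs: int, d_goal: int, d_mem: int):
        super().__init__()
        self.d_mem = d_mem
        # Write path: key/value projections from the observation.
        self.W_k = nn.Linear(d_obs, d_mem)
        self.W_v = nn.Linear(d_obs, d_mem)
        # Read path: query conditioned on observation *and* goal embedding.
        self.W_q = nn.Linear(d_obs + d_goal, d_mem)
        # Scalar input/forget gates (sigmoid here for simplicity; the real
        # mLSTM uses exponential gating with stabilization).
        self.W_i = nn.Linear(d_obs, 1)
        self.W_f = nn.Linear(d_obs, 1)

    def forward(self, x_t, goal, state=None):
        b = x_t.shape[0]
        if state is None:
            C = x_t.new_zeros(b, self.d_mem, self.d_mem)  # matrix memory
            n = x_t.new_zeros(b, self.d_mem)              # normalizer state
        else:
            C, n = state

        i_t = torch.sigmoid(self.W_i(x_t))                # input gate, (b, 1)
        f_t = torch.sigmoid(self.W_f(x_t))                # forget gate, (b, 1)
        k_t = self.W_k(x_t) / self.d_mem ** 0.5           # key, (b, d_mem)
        v_t = self.W_v(x_t)                               # value, (b, d_mem)
        q_t = self.W_q(torch.cat([x_t, goal], dim=-1))    # goal-aware query

        # Memory update: decay old content, write the outer product v_t k_t^T.
        C = f_t.unsqueeze(-1) * C + i_t.unsqueeze(-1) * torch.einsum(
            "bi,bj->bij", v_t, k_t)
        n = f_t * n + i_t * k_t

        # Retrieval: read the matrix memory with the goal-conditioned query.
        num = torch.einsum("bij,bj->bi", C, q_t)
        den = (n * q_t).sum(-1, keepdim=True).abs().clamp(min=1.0)
        h_t = num / den
        return h_t, (C, n)


if __name__ == "__main__":
    cell = GoalQueryMLSTMCell(d_obs=256, d_goal=256, d_mem=128)
    x = torch.randn(4, 256)      # current observation feature
    g = torch.randn(4, 256)      # goal-image embedding
    h, state = cell(x, g)
    print(h.shape)               # torch.Size([4, 128])
```

An alternative, closer to the plug-and-play baseline, would be to feed the goal embedding only as an extra input and leave the query derived from the observation alone; the variant sketched above instead makes the memory read-out itself goal-directed, which is the spirit of the conjecture quoted under References.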
References
The more recent xLSTM performed less well than a GRU, but was used as a plug-n-play sequence model. We conjecture that performance could be optimized further by opening the black box and making the internal query mechanism connect to the goal image more directly.
— Kinaema: a recurrent sequence model for memory and pose in motion (Sariyildiz et al., arXiv:2510.20261, 23 Oct 2025), Section 5, Mem-RPE performance