E2ATST: A Temporal-Spatial Optimized Energy-Efficient Architecture for Training Spiking Transformer (2508.00475v1)
Published 1 Aug 2025 in cs.AR and cs.NE
Affiliations: (1) Pengcheng Laboratory, (2) Southern University of Science and Technology, (3) Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, (4) University of Chinese Academy of Sciences