Hyft: A Reconfigurable Softmax Accelerator with Hybrid Numeric Format for both Training and Inference (2311.13290v2)

Published 22 Nov 2023 in cs.AR

Abstract: The attention mechanism is a pivotal element of the transformer architecture and contributes substantially to its exceptional performance. Within this mechanism, Softmax is an essential component that enables the model to assess the degree of correlation between different segments of the input. However, prior research has shown that Softmax operations can significantly increase processing latency and energy consumption in transformer networks due to their internal nonlinear operations and data dependencies. In this work, we propose Hyft, a hardware-efficient floating-point Softmax accelerator for both training and inference. Hyft reduces the implementation cost of the different nonlinear arithmetic operations within Softmax by adaptively converting intermediate results into the numeric format best suited to each specific operation, yielding a reconfigurable accelerator with a hybrid numeric format. The evaluation results show that Hyft achieves a 10x reduction in hardware resource utilization and a 6x reduction in processing latency while maintaining a negligible impact on transformer accuracy.
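
The abstract describes adaptively switching intermediate results between numeric formats across the nonlinear steps of Softmax. The sketch below is a rough software illustration of that general idea, not Hyft's actual hardware datapath: it computes a base-2 Softmax in which the max-subtracted exponent is split into an integer part, kept as a power-of-two shift (floating-point-style handling), and a fractional part quantized to fixed point before evaluating 2^f. The function name hybrid_softmax, the frac_bits parameter, and the quantization scheme are illustrative assumptions.

```python
import numpy as np

def hybrid_softmax(x, frac_bits=8):
    """Illustrative hybrid-format softmax sketch (assumption, not Hyft's exact design).

    The max-subtracted logits are scaled by log2(e) and split into an integer
    part (treated as a power-of-two exponent shift) and a fractional part
    quantized to fixed point, mimicking a small fixed-point 2^f approximation.
    """
    x = np.asarray(x, dtype=np.float32)
    z = (x - x.max()) * np.log2(np.e)            # base-2 exponent, always <= 0
    z_int = np.floor(z)                          # integer part -> exponent shift
    z_frac = z - z_int                           # fractional part in [0, 1)
    # Quantize the fractional part to frac_bits fixed-point bits.
    z_frac_q = np.round(z_frac * (1 << frac_bits)) / (1 << frac_bits)
    pow2 = np.exp2(z_frac_q) * np.exp2(z_int)    # recombine: 2^int * 2^frac
    return pow2 / pow2.sum()

if __name__ == "__main__":
    logits = np.array([2.0, 1.0, 0.1, -3.0], dtype=np.float32)
    ref = np.exp(logits - logits.max())
    ref /= ref.sum()
    print(hybrid_softmax(logits))   # close to the float32 reference below
    print(ref)
```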
