Multi-head Temporal Attention-Augmented Bilinear Network for Financial time series prediction (2201.05459v1)

Published 14 Jan 2022 in cs.LG and cs.CE

Abstract: Financial time-series forecasting is one of the most challenging problems in time-series analysis, mostly due to the highly non-stationary and noisy nature of financial data. Through the community's progressive efforts to design specialized neural networks that incorporate prior domain knowledge, many financial analysis and forecasting problems have been successfully tackled. The temporal attention mechanism is a neural layer design that has recently gained popularity for its ability to focus on important temporal events. In this paper, we propose a neural layer based on the ideas of temporal attention and multi-head attention that extends the underlying network's capability to focus on multiple temporal instances simultaneously. The effectiveness of our approach is validated on large-scale limit-order book market data by forecasting the direction of mid-price movements. Our experiments show that the use of multi-head temporal attention modules leads to enhanced prediction performance compared to baseline models.
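
As a rough illustration of the idea described in the abstract (not the authors' published formulation), the sketch below combines a bilinear mapping over a (features × time) input with several attention heads that each produce a softmax weighting over the time axis. All class and parameter names, the shapes, the head-averaging step, and the learned blend factor are assumptions made for this example.

```python
# Hypothetical sketch of a multi-head temporal-attention bilinear layer
# (illustrative only; not the authors' released code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadTemporalAttentionBilinear(nn.Module):
    """Maps (batch, in_features, in_steps) to (batch, out_features, out_steps),
    letting each attention head compute its own softmax weighting over time."""

    def __init__(self, in_features, in_steps, out_features, out_steps, num_heads=4):
        super().__init__()
        self.W1 = nn.Parameter(torch.randn(out_features, in_features) * 0.01)   # feature mixing
        self.W2 = nn.Parameter(torch.randn(in_steps, out_steps) * 0.01)         # temporal mixing
        self.W_att = nn.Parameter(torch.randn(num_heads, in_steps, in_steps) * 0.01)  # one attention matrix per head
        self.bias = nn.Parameter(torch.zeros(out_features, out_steps))
        self.lam = nn.Parameter(torch.tensor(0.5))  # blend between attended and raw features

    def forward(self, x):                                        # x: (batch, in_features, in_steps)
        xb = torch.einsum('oi,bit->bot', self.W1, x)             # (batch, out_features, in_steps)
        # Each head scores every time step and normalizes over the time axis.
        scores = torch.einsum('bot,hts->bhos', xb, self.W_att)   # (batch, heads, out_features, in_steps)
        attn = F.softmax(scores, dim=-1)
        attended = attn.mean(dim=1) * xb                          # average heads, then gate the features
        lam = torch.sigmoid(self.lam)
        mixed = lam * attended + (1.0 - lam) * xb
        return torch.einsum('bot,ts->bos', mixed, self.W2) + self.bias


# Example usage: 40 limit-order-book features over 10 time steps -> 3-class direction logits.
layer = MultiHeadTemporalAttentionBilinear(40, 10, 3, 1, num_heads=4)
logits = layer(torch.randn(32, 40, 10)).squeeze(-1)               # (32, 3)
```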

Authors (5)
  1. Mostafa Shabani (7 papers)
  2. Dat Thanh Tran (22 papers)
  3. Martin Magris (16 papers)
  4. Juho Kanniainen (40 papers)
  5. Alexandros Iosifidis (153 papers)
Citations (10)
