Adaptive Multi-Modal Deep Learning for Financial Market Prediction: A Multi-Scale Attention Approach

  • Journal of The Korea Society of Computer and Information
  • Abbr : JKSCI
  • 2025, 30(12), pp.297~304
  • Publisher : The Korean Society Of Computer And Information
  • Research Area : Engineering > Computer Science
  • Received : November 6, 2025
  • Accepted : December 5, 2025
  • Published : December 31, 2025

Liu Ming 1, Seong-Yoon Shin 1

1 Kunsan National University


ABSTRACT

Financial market prediction remains challenging due to complex non-linear dependencies and regime shifts. Existing multi-modal approaches suffer from limited temporal horizons (5-10 days), simplistic features, and static fusion mechanisms. In this paper, we present an enhanced dual-channel architecture with three innovations: (1) multi-scale temporal convolution capturing 5-40 day patterns; (2) adaptive cross-modal attention that dynamically balances sentiment and technical signals; and (3) extended 60-day windows with 16 technical indicators. Experiments demonstrate 81.39% accuracy versus the baseline's 58.23%, with ablation studies confirming individual contributions of 7.2%, 5.8%, and 5.6%, respectively. The model also outperforms state-of-the-art models such as the Deep Convolutional Transformer and 2CAT in both short-term and long-term forecasting tasks across multiple global stock indices. Moreover, the model's interpretability is enhanced through attention weight visualization, enabling practitioners to identify key market drivers during different regimes.
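
The architecture outlined above can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the layer widths, the sentiment embedding dimension, the exact kernel sizes chosen to span 5-40 day patterns, and the use of nn.MultiheadAttention for the cross-modal fusion are assumptions made for illustration; only the 60-day window, the 16 technical indicators, and the dual-channel (sentiment plus technical) structure come from the abstract.

# Hypothetical sketch of the dual-channel idea; hyperparameters are assumptions.
import torch
import torch.nn as nn

class MultiScaleTemporalConv(nn.Module):
    """Parallel 1D convolutions whose kernel sizes roughly span 5- to 40-day patterns."""
    def __init__(self, in_channels, branch_channels, kernel_sizes=(5, 10, 20, 40)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv1d(in_channels, branch_channels, k, padding=k // 2) for k in kernel_sizes
        )

    def forward(self, x):                          # x: (batch, channels, time)
        feats = [branch(x) for branch in self.branches]
        t = min(f.size(-1) for f in feats)         # align lengths after padding
        return torch.cat([f[..., :t] for f in feats], dim=1)

class DualChannelModel(nn.Module):
    def __init__(self, n_indicators=16, sent_dim=32, d_model=128, n_heads=4):
        super().__init__()
        self.tech_conv = MultiScaleTemporalConv(n_indicators, d_model // 4)
        self.sent_proj = nn.Linear(sent_dim, d_model)
        # Cross-modal attention: technical features attend to sentiment features,
        # so the returned attention weights can be visualized for interpretability.
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, 2)          # up / down classification

    def forward(self, tech, sent):
        # tech: (batch, 60, 16) daily indicators; sent: (batch, 60, sent_dim) sentiment
        t = self.tech_conv(tech.transpose(1, 2)).transpose(1, 2)   # (batch, T, d_model)
        s = self.sent_proj(sent)                                   # (batch, 60, d_model)
        fused, attn_w = self.cross_attn(t, s, s)
        return self.head(fused.mean(dim=1)), attn_w

model = DualChannelModel()
logits, weights = model(torch.randn(8, 60, 16), torch.randn(8, 60, 32))
print(logits.shape, weights.shape)   # torch.Size([8, 2]) torch.Size([8, 60, 60])

In this sketch the four convolutional branches are concatenated to form the technical representation, and the attention weights returned by the fusion step play the role of the visualization signal mentioned in the abstract.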
