Journal of Beijing Normal University, Vol. 61, Issue 3, 293 (2025)

Time series data forecasting method based on spatio-temporal dimensions reconstruction

JIANG Shan1,2, CHANG Le1, and YIN Lu3,*
Author Affiliations
  • 1School of Information Engineering, Minzu University of China, Beijing, China
  • 2Hainan International College, Minzu University of China, Lingshui, Hainan, China
  • 3School of Mechanical and Electrical Engineering, Chengdu University of Technology, Chengdu, Sichuan, China
    References(21)

    [1] KANG B G, LEE D J, KIM H G, et al. Introducing spectral attention for long-range dependency in time series forecasting[EB/OL]. [2025-03-29]. https://proceedings.neurips.cc/paper_files/paper/2024/file/f6adf61977467560f79b95485d1f3a79-Paper-Conference.pdf

    [5] LIU Y, WU H X, WANG J M, et al. Non-stationary transformers: exploring the stationarity in time series forecasting[EB/OL]. [2025-03-29]. https://proceedings.neurips.cc/paper_files/paper/2022/file/4054556fcaa934b0bf76da52cf4f92cb-Paper-Conference.pdf

    [6] TAN M T, MERRILL M A, GUPTA V Y, et al. Are language models actually useful for time series forecasting?[EB/OL]. [2025-03-29]. https://proceedings.neurips.cc/paper_files/paper/2024/file/6ed5bf446f59e2c6646d23058c86424b-Paper-Conference.pdf

    [7] SALINAS D, FLUNKERT V, GASTHAUS J, et al. DeepAR: probabilistic forecasting with autoregressive recurrent networks[J]. International Journal of Forecasting, 2020, 36(3): 1181

    [8] KILIAN L, LÜTKEPOHL H. Structural vector autoregressive analysis[M]. Cambridge: Cambridge University Press, 2017

    [9] XUE T J, ADRIAENSSENS S, MAO S. Learning the nonlinear dynamics of mechanical metamaterials with graph networks[J]. International Journal of Mechanical Sciences, 2023, 238: 107835

    [10] ROGERS A, KOVALEVA O, RUMSHISKY A. A primer in BERTology: what we know about how BERT works[J]. Transactions of the Association for Computational Linguistics, 2020, 8: 842

    [11] DOSOVITSKIY A, BEYER L, KOLESNIKOV A, et al. An image is worth 16×16 words: transformers for image recognition at scale[EB/OL]. (2021-06-03) [2025-03-29]. https://arxiv.org/pdf/2010.11929/1000

    [12] LI S Y, JIN X Y, XUAN Y, et al. Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting[EB/OL]. (2020-01-03) [2025-03-31]. https://arxiv.org/abs/1907.00235v3

    [13] ZHOU H Y, ZHANG S H, PENG J Q, et al. Informer: beyond efficient transformer for long sequence time-series forecasting[EB/OL]. (2021-03-28) [2025-03-31]. https://arxiv.org/abs/2012.07436v3

    [14] WU H X, XU J H, WANG J M, et al. Autoformer: decomposition transformers with auto-correlation for long-term series forecasting[EB/OL]. (2022-01-07) [2025-03-31]. https://arxiv.org/abs/2106.13008v5

    [15] LI X X, AI Q, XU M. RUL prediction of lithium-ion batteries based on TimeGAN-Pyraformer-BiLSTM[J]. Engineering Letters, 2024, 32(8): 1675

    [16] NIE Y Q, NGUYEN N H, SINTHONG P W, et al. A time series is worth 64 words: long-term forecasting with transformers[EB/OL]. [2025-03-31]. https://openreview.net/pdf?id=Jbdc0vTOcol

    [17] JIN Q Z, ZHANG X B, XIAO X Y, et al. Preformer: simple and efficient design for precipitation nowcasting with transformers[J]. IEEE Geoscience and Remote Sensing Letters, 2023, 21: 1000205

    [18] LIU Y, HU T G, ZHANG H R, et al. iTransformer: inverted transformers are effective for time series forecasting[EB/OL]. (2024-03-14) [2025-03-31]. https://arxiv.org/abs/2310.06625

    [19] LIU Y J, LIU Q X, ZHANG J W, et al. Multivariate time-series forecasting with temporal polynomial graph neural networks[EB/OL]. [2025-04-02]. https://proceedings.neurips.cc/paper_files/paper/2022/file/7b102c908e9404dd040599c65db4ce3e-Paper-Conference.pdf

    [21] BAHDANAU D, CHO K H, BENGIO Y. Neural machine translation by jointly learning to align and translate[EB/OL]. (2016-05-19) [2025-04-02]. https://3dvar.com/Bahdanau2014Neural.pdf

    [22] WANG D Z, CHEN C Y. Spatiotemporal self-attention-based LSTNet for multivariate time series prediction[J]. International Journal of Intelligent Systems, 2023, 2023(1): 9523230

    [23] LIU R, HUANG C Z. Data-driven modeling approach of heavy-duty gas turbine with physical constraint by MTGNN and Transformer[J]. Control Engineering Practice, 2024, 151: 106014

    [24] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[EB/OL]. [2025-04-02]. https://proceedings.neurips.cc/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf

    [25] ZHANG Y H, YAN J C. Crossformer: transformer utilizing cross-dimension dependency for multivariate time series forecasting[EB/OL]. [2025-04-02]. https://openreview.net/pdf?id=vSVLM2j9eie

    Paper Information

    Received: Apr. 9, 2025

    Accepted: Aug. 21, 2025

    Published Online: Aug. 21, 2025

    Corresponding Author Email: YIN Lu (yinlu@cdut.edu.cn)

    DOI: 10.12202/j.0476-0301.2025057
