Detailed Information


Look Ahead: Improving the Accuracy of Time-Series Forecasting by Previewing Future Time Features

Authors
Kim, Seonmin; Chae, Dong-Kyu
Issue Date
Jul-2023
Publisher
Association for Computing Machinery, Inc
Keywords
Time-series forecasting; Time-series representation learning; Timestamp embedding; Transformer-based architectures
Citation
SIGIR 2023 - Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval, pp. 2134-2138
Indexed
SCOPUS
Journal Title
SIGIR 2023 - Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval
Start Page
2134
End Page
2138
URI
https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/192946
DOI
10.1145/3539618.3592013
Abstract
Time-series forecasting has been actively studied and adopted in various real-world domains. Recently, there have been two main research directions in this area: building Transformer-based architectures such as Informer, Autoformer, and Reformer, and developing time-series representation learning frameworks based on contrastive learning, such as TS2Vec and CoST. Both efforts have greatly improved the performance of time-series forecasting. In this paper, we investigate a novel direction for improving forecasting performance even further, one that is orthogonal to the aforementioned mainstreams because it is a model-agnostic scheme. We focus on timestamp embeddings, which have received relatively little attention in the literature. Our idea is simple yet effective: given the current timestamp, we predict the embeddings of its near-future timestamps and utilize the predicted embeddings in the time-series (value) forecasting task. We believe that if such future time information can be previewed at prediction time, it can be utilized by any time-series forecasting model as useful additional information. Our experimental results confirm that our method consistently and significantly improves the accuracy of recent Transformer-based models and time-series representation learning frameworks. Our code is available at: https://github.com/sunsunmin/Look_Ahead.
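To make the look-ahead idea concrete, below is a minimal, hypothetical sketch of previewing future time features. This is not the authors' implementation (see the linked repository for that); the class name FutureTimeFeaturePreview, the linear-layer design, and the four calendar features used here are illustrative assumptions only.

```python
import torch
import torch.nn as nn

class FutureTimeFeaturePreview(nn.Module):
    """Hypothetical sketch: predict embeddings of near-future timestamps
    from the current timestamp embedding, so any downstream forecaster
    (Transformer-based or representation-learning-based) can consume them
    as extra, model-agnostic input features."""

    def __init__(self, time_dim: int, embed_dim: int, horizon: int):
        super().__init__()
        self.embed = nn.Linear(time_dim, embed_dim)                # timestamp features -> embedding
        self.predict_future = nn.Linear(embed_dim, horizon * embed_dim)  # preview the next `horizon` steps
        self.horizon = horizon
        self.embed_dim = embed_dim

    def forward(self, time_feats: torch.Tensor) -> torch.Tensor:
        # time_feats: (batch, time_dim) calendar features of the current step,
        # e.g. [month, day, weekday, hour] encoded as normalized scalars (an assumption).
        current = self.embed(time_feats)                # (batch, embed_dim)
        future = self.predict_future(current)           # (batch, horizon * embed_dim)
        return future.view(-1, self.horizon, self.embed_dim)  # (batch, horizon, embed_dim)

# Usage: concatenate the previewed future time embeddings with the value
# inputs before feeding them to any forecasting backbone.
previewer = FutureTimeFeaturePreview(time_dim=4, embed_dim=16, horizon=24)
time_feats = torch.rand(8, 4)            # batch of 8 current timestamps
future_embeds = previewer(time_feats)    # (8, 24, 16) previewed future features
```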
Appears in Collections
Seoul College of Engineering > Seoul School of Computer Software > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
