Two-dimensional attention-based multi-input LSTM for time series prediction
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kim, Eun Been | - |
dc.contributor.author | Park, Jung Hoon | - |
dc.contributor.author | Lee, Yung-Seop | - |
dc.contributor.author | Lim, Changwon | - |
dc.date.accessioned | 2021-08-13T05:40:15Z | - |
dc.date.available | 2021-08-13T05:40:15Z | - |
dc.date.issued | 2021-01 | - |
dc.identifier.issn | 2287-7843 | - |
dc.identifier.issn | 2383-4757 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/48331 | - |
dc.description.abstract | Time series prediction is an area of great interest to many people. Algorithms for time series prediction are widely used in fields such as stock prices, temperature, energy, and weather forecasting; both classical models and recurrent neural networks (RNNs) have been actively developed. Since the attention mechanism was introduced to neural network models, many new models with improved performance have been developed; models that apply attention twice have also recently been proposed, yielding further performance gains. In this paper, we consider time series prediction by introducing attention twice into an RNN model. The proposed model introduces H-attention and T-attention to select useful information from output values and time step information, respectively. We conduct experiments on stock price, temperature, and energy data and confirm that the proposed model outperforms existing models. | - |
dc.format.extent | 19 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | KOREAN STATISTICAL SOC | - |
dc.title | Two-dimensional attention-based multi-input LSTM for time series prediction | - |
dc.type | Article | - |
dc.identifier.doi | 10.29220/CSAM.2021.28.1.039 | - |
dc.identifier.bibliographicCitation | COMMUNICATIONS FOR STATISTICAL APPLICATIONS AND METHODS, v.28, no.1, pp 39 - 57 | - |
dc.identifier.kciid | ART002682666 | - |
dc.description.isOpenAccess | N | - |
dc.identifier.wosid | 000616531100003 | - |
dc.identifier.scopusid | 2-s2.0-85102135897 | - |
dc.citation.endPage | 57 | - |
dc.citation.number | 1 | - |
dc.citation.startPage | 39 | - |
dc.citation.title | COMMUNICATIONS FOR STATISTICAL APPLICATIONS AND METHODS | - |
dc.citation.volume | 28 | - |
dc.type.docType | Article | - |
dc.publisher.location | Republic of Korea | - |
dc.subject.keywordAuthor | recurrent neural network | - |
dc.subject.keywordAuthor | correlation | - |
dc.subject.keywordAuthor | attention | - |
dc.subject.keywordAuthor | time series | - |
dc.subject.keywordPlus | REPRESENTATIONS | - |
dc.relation.journalResearchArea | Mathematics | - |
dc.relation.journalWebOfScienceCategory | Statistics & Probability | - |
dc.description.journalRegisteredClass | scopus | - |
dc.description.journalRegisteredClass | esci | - |
dc.description.journalRegisteredClass | kciCandi | - |
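The abstract describes applying attention twice: T-attention weights time steps and H-attention weights hidden (output) information. The following is a minimal NumPy sketch of that two-stage idea, not the paper's exact architecture; the shapes, random scores, and names (`t_scores`, `h_scores`, `hidden`) are illustrative assumptions standing in for learned parameters and real LSTM states.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: T time steps, each with an H-dimensional hidden state
T, H = 5, 4
hidden = rng.standard_normal((T, H))  # stand-in for LSTM hidden states

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# T-attention: weight time steps (scores would be learned; random here)
t_scores = rng.standard_normal(T)
t_weights = softmax(t_scores)       # (T,), sums to 1
context_t = t_weights @ hidden      # (H,), weighted sum over time steps

# H-attention: reweight the hidden units of the time-context vector
h_scores = rng.standard_normal(H)
h_weights = softmax(h_scores)       # (H,), sums to 1
context = h_weights * context_t     # (H,), element-wise reweighting

print(context.shape)  # (4,)
```

In a trained model, the two score vectors would be produced by small learned networks conditioned on the inputs, and `context` would feed the final prediction layer; this sketch only shows how the two attention dimensions compose.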