
International Journal of Advanced Computer Research (IJACR)

ISSN (Print): 2249-7277    ISSN (Online): 2277-7970
Volume-9 Issue-44 September-2019
Paper Title: Determining the impact of window length on time series forecasting using deep learning
Author Name: Ammar Azlan, Yuhanis Yusof and Mohamad Farhan Mohamad Mohsin
Abstract:

Time series forecasting predicts future values from previous observations of the same variable taken at different time periods. To date, various models, in particular deep learning models, have been used for stock market time series forecasting. However, existing implementations of these models did not determine the suitable number of previous observations to use as input, that is, the window length. Hence, this study investigates the impact of the window length of a long short-term memory (LSTM) model in forecasting stock market prices. The forecasting is performed on the S&P 500 daily closing price data set. Window lengths of 25, 50, and 100 days were tested on the same model and data set. The results of the experiment show that different window lengths produce different forecasting accuracies; on the employed data set, a window length of 100 days gives the best forecast of the stock market price. This finding highlights the importance of determining a suitable window length for the problem at hand, as there is no one-size-fits-all model in time series forecasting.
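As a rough illustration of the windowing step described in the abstract (not the authors' actual code), the Python sketch below shows how a daily closing-price series can be sliced into fixed-length input windows with next-day targets for an LSTM. The make_windows helper, the 300-point synthetic price series, and the use of NumPy are assumptions made for illustration only; the study itself uses the S&P 500 daily closing prices.

```python
import numpy as np

def make_windows(series, window_length):
    """Slice a 1-D price series into overlapping input windows of
    `window_length` past observations, with the next-day closing
    price as the forecasting target."""
    X, y = [], []
    for i in range(len(series) - window_length):
        X.append(series[i:i + window_length])
        y.append(series[i + window_length])
    # LSTM layers expect 3-D input: (samples, time steps, features)
    return np.asarray(X)[..., np.newaxis], np.asarray(y)

# Synthetic stand-in for the S&P 500 daily closing-price series used in the paper.
prices = 100.0 + np.cumsum(np.random.randn(300))

# The three window lengths compared in the study.
for w in (25, 50, 100):
    X, y = make_windows(prices, w)
    print(f"window={w:3d}  inputs={X.shape}  targets={y.shape}")
```

Note that a longer window gives each sample more history but yields fewer training samples from the same series, which is one reason forecasting accuracy can differ across window lengths.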

Keywords: Deep learning, Long short-term memory (LSTM), Time series forecasting, Window length.
Cite this article: Azlan A, Yusof Y, Mohsin MF. Determining the impact of window length on time series forecasting using deep learning. International Journal of Advanced Computer Research. 2019; 9(44):260-267. DOI: 10.19101/IJACR.PID77.