deeplearning_timeseries's Introduction

Using Empirical Mode Decomposition and Convolutional Neural Networks for Time Series Forecasting.

EMD is a method of breaking a signal down into components without leaving the time domain; in that sense it can be compared to other analysis methods like the Fourier transform and wavelet decomposition. The process is useful for analyzing natural signals, which are most often non-linear and non-stationary. The algorithm centers on a sifting process which produces Intrinsic Mode Functions (IMFs)--orthogonal oscillating components of the signal. The process works more or less as a dyadic filter bank, meaning that with each successive IMF, the frequency of the oscillation is roughly halved. What is left after the sifting process--the specifics depend on the algorithm and methods one is using--is the residue, which is like the "trend" that is left over when you use ARIMA-type methods.
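
As a minimal illustration of what sifting produces, the sketch below decomposes a toy non-stationary signal with the PyEMD package (installed as `EMD-signal`); the library choice and its API are my assumption here, since the notebooks themselves are not quoted.

```python
import numpy as np
from PyEMD import EMD  # pip install EMD-signal (assumed library, not specified by the repo)

# A toy non-stationary signal: a chirp, a faster oscillation, and a slow trend.
t = np.linspace(0, 1, 1000)
signal = np.sin(2 * np.pi * 12 * t**2) + 0.5 * np.sin(2 * np.pi * 40 * t) + t

emd = EMD()
emd.emd(signal)                             # run the sifting process
imfs, residue = emd.get_imfs_and_residue()  # IMFs (fastest oscillation first) and the leftover trend

# The IMFs plus the residue reconstruct the original signal up to numerical error.
print(imfs.shape, np.allclose(imfs.sum(axis=0) + residue, signal))
```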

Ensemble Empirical Mode Decomposition (EEMD), used in many of the experiments here, is an improvement on the traditional EMD algorithm. It adds white noise to many copies of the signal, decomposes each copy, and averages the resulting IMFs, so that the components that come out the other end are more robust and less prone to mode mixing.
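
The corresponding EEMD call, again sketched with PyEMD under the same assumption about the library and continuing from the toy `signal` above; `trials` and `noise_width` are illustrative values, not settings taken from these notebooks.

```python
from PyEMD import EEMD

# Ensemble EMD: decompose many noise-perturbed copies of the signal and average the IMFs.
eemd = EEMD(trials=100, noise_width=0.05)  # ensemble size and relative noise amplitude (illustrative)
eemd.noise_seed(42)                        # make the added noise reproducible
eimfs = eemd.eemd(signal)                  # ensemble-averaged IMFs, shape (n_imfs, len(signal))
```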

In the notebooks collected here, I use a variety of real and synthetic time series of varying lengths, usually 5k-10k data points, to experiment with how EMD can help with forecasting. I performed EMD on the training set only, and used the decomposed component series not as input features but as targets, so that the network could learn to map a sequence of 20 lagged values of the original series to the set of IMF components that, added together, comprise the next value of the series. A sketch of how such a dataset could be assembled follows this paragraph.
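
The sketch below builds that supervised dataset; the 20-step window matches the description above, but the function name, array shapes and alignment convention are my assumptions rather than code taken from the notebooks.

```python
import numpy as np

def make_windows(series, components, window=20):
    """Pair each window of lagged values with the decomposition components of the
    *next* time step. `series` has shape (T,); `components` has shape (n_comp, T)
    and should include the residue as a row if the targets are to sum exactly to
    the next value of the series."""
    X, y = [], []
    for t in range(window, len(series)):
        X.append(series[t - window:t])    # the last `window` observed values
        y.append(components[:, t])        # component values at time t (the target step)
    X = np.asarray(X)[..., np.newaxis]    # (samples, window, 1) for a Conv1D network
    y = np.asarray(y)                     # (samples, n_comp)
    return X, y
```

Applied to the training split only, this yields inputs of shape (samples, 20, 1) and targets of shape (samples, n_comp).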

I did it this way because I didn't want to assume you could always rerun EEMD and retrain the network every time new data were observed; the idea was to use something "pretrained", so that you could take the last 20 values and forecast the next IMF values, which would then be added together to produce a forecast for the original series itself. This way you not only get the forecast but also retain some sense of how the dynamics of the various components evolve going forward.
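
As a small sketch of that inference step, where `series_values` and the pretrained `model` are hypothetical stand-ins (the latter along the lines of the convolutional sketch shown further below):

```python
import numpy as np

# Hypothetical one-step forecast: the last 20 observed values go in, the predicted
# next-step IMF components come out, and their sum is the forecast for the series.
last_window = np.asarray(series_values[-20:]).reshape(1, 20, 1)
component_forecast = model.predict(last_window)          # shape (1, n_components)
point_forecast = float(component_forecast.sum(axis=-1))  # components summed back into one value
```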

There is not a lot of literature covering this hybrid approach of combining EEMD with convolutional neural networks: there are papers pairing EMD with other machine learning methods, such as SVMs, and there are papers on fully convolutional networks using à trous convolutions, also known as dilated kernels, but not many trying to combine the two. I wanted to see what could be done by bringing some of the best techniques from time series analysis in applied mathematics, natural science and engineering together with recent techniques in deep learning. There were times when the approach worked fairly well and beat an XGBoost regression model that I ran using AutoML, and other times when the two were neck and neck.

The Chen, Rabinovich-Fabrikant and Faes series are synthetic, with varying degrees of nonlinearity, nonstationarity and chaos. The other time series are drawn from space physics (solar wind, magnetic fields), finance (IRX, a short-term Treasury bill rate index), meteorology (air pressure) and biology (a zooplankton time series). In each case the architecture had to be tuned: from 2 to sometimes 4-5 convolutional layers, with kernel (filter) sizes from 1 to 3. In most cases I concatenated the feature maps produced by these layers with the original time series, rather than adding or multiplying them as you often see in the deep learning literature. I did this so that the network could, if necessary, more or less learn the identity function and just use the time series itself as features, if that turned out to be the more fruitful and accurate way to go. A rough sketch of this kind of architecture follows.
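
To make that concrete, here is a rough Keras sketch of one such configuration: a couple of Conv1D layers with small kernels whose feature maps are concatenated with the original input window before a dense head predicts the next-step components. The layer count, filter counts and kernel sizes are placeholders in the spirit of the description, not the tuned settings of any particular notebook.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

WINDOW, N_COMPONENTS = 20, 8   # 20 lagged inputs; the number of IMF targets is illustrative

inputs = layers.Input(shape=(WINDOW, 1))

# Small-kernel convolutional blocks (the notebooks used 2 to 4-5 such layers,
# with kernel sizes between 1 and 3).
x = layers.Conv1D(32, kernel_size=3, padding="same", activation="relu")(inputs)
x = layers.Conv1D(32, kernel_size=1, padding="same", activation="relu")(x)

# Concatenate the feature maps with the raw input along the channel axis, so the
# network can fall back on (roughly) the identity mapping if that works better.
x = layers.Concatenate(axis=-1)([x, inputs])

x = layers.Flatten()(x)
outputs = layers.Dense(N_COMPONENTS)(x)   # next-step IMF components (plus residue)

model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")
model.summary()
```

Trained on the windowed dataset sketched earlier, this is the kind of network the hypothetical `model.predict` call above stands in for.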

I really could not have done any of this without comp-engine.org, which is a veritable gold mine of time series data, and allows us to find a great variety of time series with which to test these ideas, and to see what a general approach to time series forecasting might look like.
