Title: Neural networks for asynchronous time series
Authors: Mikolaj Binkowski - Imperial College London and Hellebore Capital Ltd. (United Kingdom) [presenting]
Abstract: Many real-world time series are asynchronous in the sense that their separate dimensions are observed at different and irregular moments of time. However, most approaches to discrete time series or continuous stochastic processes require at least synchronous observations. As a solution to this problem, we propose the Significance-Offset Convolutional Neural Network, a deep convolutional network architecture inspired by autoregressive models and the gating mechanisms used in recurrent neural networks. The architecture uses a weighting system designed for asynchronous inputs, where the final predictor is obtained as a weighted sum of adjusted regressors, while the weights are data-dependent functions learnt through a convolutional network. We evaluate it on a hedge fund's proprietary dataset of over 2 million quotes for a credit derivative index, an artificially generated noisy autoregressive series, and a household electricity consumption dataset. The proposed architecture achieves promising results compared to convolutional and recurrent neural networks.
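The core idea — a final predictor formed as a weighted sum of adjusted regressors, with data-dependent weights — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the two sub-networks (one producing offsets that adjust past observations, one producing significance scores normalised over lags) are replaced here by random linear maps, and all dimensions are arbitrary toy values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative only, not taken from the paper).
T = 16  # number of past observations (lags)
d = 4   # dimensionality of the series

x = rng.normal(size=(T, d))  # past observations, most recent last

# Stand-ins for the two learned sub-networks: in the actual architecture
# both would be convolutional networks over the (asynchronous) input.
W_off = rng.normal(scale=0.1, size=(d, d))  # "offset" map: adjusts regressors
W_sig = rng.normal(scale=0.1, size=(d, d))  # "significance" map: scores lags

offsets = x @ W_off   # adjusted regressors are x + offsets
scores = x @ W_sig    # per-lag, per-dimension significance scores

# Data-dependent weights: softmax over the time (lag) axis.
weights = np.exp(scores) / np.exp(scores).sum(axis=0, keepdims=True)

# Final predictor: weighted sum over lags of the adjusted regressors.
y_hat = (weights * (x + offsets)).sum(axis=0)
print(y_hat.shape)  # one prediction per series dimension
```

The softmax normalisation guarantees the weights along each dimension sum to one, so the predictor is a convex combination of the adjusted past observations; how the offsets and scores are actually parameterised is what the convolutional architecture learns.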