Title: Forecasting by learning the evolution-driving function
Authors: Dalia Chakrabarty - Brunel University London (United Kingdom) [presenting]
Abstract: The state of a generic deterministic dynamical system at any future time is computable, provided we have information on the function that drives the system's evolution. Indeed, probabilistic learning of this evolution-driver leads to probabilistic forecasting of states. In contrast, learning patterns in the past observed values of the phase space variables does not guarantee correct forecasting, irrespective of the sophistication of the learning and/or of the parametrisation used to replicate information at the considered future time point. However, we possess no training data to permit supervised learning of the sought evolution-driving function. To enable such supervised learning, the evolution-driver is learnt at a given input (namely, a given time and state) by embedding it in the support of the pdf of the phase space variables. This generates the originally absent training set, with which we learn the sought function (modelling it with a Gaussian Process) and predict it at the (future) test time. Phase space variables attained at that future time are then computed by inputting this forecast evolution-driving function into a generalised version of Newton's 2nd Law. An empirical illustration of the methodology is given by forecasting daily new COVID-19 infection numbers.
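The pipeline described above — learn the evolution-driving function with a Gaussian Process from a generated training set, forecast it at future times, then integrate the dynamics to obtain future states — can be sketched as follows. This is a minimal, hypothetical 1-D illustration only: the true driver `f_true`, the RBF kernel, the noise level, and the forward-Euler integration are all illustrative assumptions, not the method or data of the talk.

```python
import numpy as np

# Hypothetical toy driver f(t); in the talk's method the training set for the
# driver is generated by embedding it in the support of the pdf of the phase
# space variables -- here we simply simulate noisy observations of f.
def f_true(t):
    return np.sin(t)

def rbf(a, b, ell=1.0, sf=1.0):
    # Squared-exponential kernel k(a, b) = sf^2 * exp(-(a - b)^2 / (2 ell^2)).
    d = a[:, None] - b[None, :]
    return sf**2 * np.exp(-0.5 * (d / ell) ** 2)

# Training inputs (past times) and noisy observations of the driver.
rng = np.random.default_rng(0)
t_train = np.linspace(0.0, 5.0, 20)
noise = 0.05
f_train = f_true(t_train) + noise * rng.standard_normal(t_train.size)

# GP posterior mean of the driver at test times, extending into the future.
t_test = np.linspace(0.0, 7.0, 71)
K = rbf(t_train, t_train) + noise**2 * np.eye(t_train.size)
alpha = np.linalg.solve(K, f_train)
f_post = rbf(t_test, t_train) @ alpha    # forecast evolution driver

# Forecast states by integrating dx/dt = f(t) (forward Euler, x(0) = 0);
# the talk uses a generalised Newton's 2nd Law in place of this toy ODE.
dt = t_test[1] - t_test[0]
x_forecast = np.cumsum(f_post) * dt
```

Beyond the training window (t > 5 here), the GP posterior mean reverts toward the prior mean, so the forecast driver and hence the forecast states carry growing uncertainty, which is what makes the learning of the driver probabilistic.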