Variational inference for differential equations
Project description
Bayesian inference can be computationally challenging, and we often have to resort to random sampling methods such as Markov chain Monte Carlo (MCMC) in order to find posterior distributions. Variational methods are an alternative to sampling: they turn the problem of finding posteriors into an optimisation problem. They work by specifying a parametric form for the posterior distribution (e.g. assuming it is Gaussian), and then finding the parameters of that approximation by minimising a cost function. Variational methods are widely used in machine learning, and have proven to be a fast and effective alternative to MCMC.
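As a minimal sketch of this idea (a toy problem assumed for illustration, not part of the project): infer the mean of Gaussian data under a Gaussian prior, approximating the posterior by q(mu) = N(m, s^2). Because everything here is conjugate, the cost function (the negative evidence lower bound) has closed-form gradients, so the "inference as optimisation" step can be shown with plain gradient ascent and checked against the exact posterior.

```python
import math

# Toy model: y_i ~ N(mu, sigma2) with prior mu ~ N(0, tau2).
# Variational family: q(mu) = N(m, s^2). The ELBO is available in
# closed form here, so we maximise it by deterministic gradient ascent,
# a stand-in for the stochastic optimisation used in practice.
y = [1.2, 0.8, 1.5, 0.9, 1.1]       # illustrative data (assumed)
sigma2, tau2 = 0.25, 4.0            # known likelihood and prior variances
n = len(y)

m, log_s = 0.0, 0.0                 # variational parameters (mean, log std)
lr = 0.05
for _ in range(2000):
    s = math.exp(log_s)
    # dELBO/dm: pull towards the data minus pull towards the prior
    grad_m = sum(yi - m for yi in y) / sigma2 - m / tau2
    # dELBO/ds, then chained through s = exp(log_s) for stability
    grad_s = -n * s / sigma2 + 1.0 / s - s / tau2
    m += lr * grad_m
    log_s += lr * grad_s * s
s = math.exp(log_s)

# Conjugate closed-form posterior, for comparison
post_prec = n / sigma2 + 1.0 / tau2
post_mean = (sum(y) / sigma2) / post_prec
post_std = post_prec ** -0.5
print(m, s, post_mean, post_std)
```

The optimiser recovers the exact posterior mean and standard deviation; in non-conjugate models (such as the differential equation models in this project) the same optimisation is carried out with Monte Carlo gradient estimates instead of closed-form ones.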
In this project, we will look at using the variational autoencoder (VAE) framework to do Bayesian inference for differential equation models. The VAE framework uses either automatic differentiation software (such as TensorFlow) or adjoint methods to compute derivatives of the cost function, which depends on the solution of the differential equations. These derivatives can then be used to solve the variational inference problem at speed.
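The key technical ingredient is differentiating a cost through an ODE solve. A hand-written sketch of one such approach, forward sensitivity analysis on an assumed toy problem (dy/dt = -theta*y with a squared-error cost), shows the quantity that autodiff or adjoint methods would deliver automatically:

```python
# Toy problem (assumed for illustration): dy/dt = -theta * y, y(0) = 1,
# with cost L = (y(T) - y_obs)^2. We propagate the forward sensitivity
# s = dy/dtheta alongside the state with Euler steps, then apply the
# chain rule to get dL/dtheta -- the derivative a VAE-style optimiser needs.
def solve_with_sensitivity(theta, T=1.0, steps=1000):
    dt = T / steps
    y, s = 1.0, 0.0           # state and its sensitivity dy/dtheta
    for _ in range(steps):
        # sensitivity ODE: ds/dt = (df/dy) * s + df/dtheta = -theta*s - y
        s += dt * (-theta * s - y)   # uses y before it is updated
        y += dt * (-theta * y)
    return y, s

theta, y_obs = 0.7, 0.4
yT, sT = solve_with_sensitivity(theta)
grad = 2.0 * (yT - y_obs) * sT       # chain rule through the solver

# sanity check against central finite differences
eps = 1e-6
fd = ((solve_with_sensitivity(theta + eps)[0] - y_obs) ** 2
      - (solve_with_sensitivity(theta - eps)[0] - y_obs) ** 2) / (2 * eps)
print(grad, fd)
```

Forward sensitivities scale with the number of parameters, which is why adjoint methods or reverse-mode automatic differentiation become attractive for the larger models this project will consider.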
The project will focus on developing and testing the methodology to do this, and will use a variety of exemplar problems to guide the development.
Project published references
Kingma, D. P. and Welling, M., "Auto-Encoding Variational Bayes": https://arxiv.org/abs/1312.6114
More information
Full details of our Maths PhD
How to apply to the University of Nottingham