Bayesian approaches to statistical inference and uncertainty quantification became practical with the development of effective sampling methods like Markov Chain Monte Carlo (MCMC) and advances in high-performance computing. However, as the size and complexity of inference problems have dramatically increased, improvements in MCMC algorithms have struggled to keep up, making the development of more efficient algorithms an active area of research. Dynamical systems-based approaches, such as Hamiltonian Monte Carlo, have achieved higher efficiency than standard MCMC methods because they treat the negative logarithm of the target PDF as the potential energy of a dynamical system.
Here, we present a stochastic dynamical system-based MCMC algorithm, which uses a damped second-order Langevin stochastic differential equation (SDE) to sample the posterior PDF. The parameters of the SDE are chosen such that the desired probability distribution is the stationary distribution of the SDE. Since this method is based on an underlying dynamical system, we can apply ideas from dynamics and control theory to optimize the sampler's performance. Our objectives are to increase the convergence rate to the stationary distribution and to reduce the autocorrelation of both the samples and the energy. This is achieved by adapting the damping and mass matrix to exploit the local structure of the PDF. We then discuss an extension of this work to on-line Bayesian inference and show an application to a system identification problem for estimating the parameters of an unknown ODE model.
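To make the idea concrete, the following is a minimal sketch of a damped second-order (underdamped) Langevin sampler, discretized with a simple Euler-Maruyama scheme and applied to a standard normal target. The damping `gamma`, step size `dt`, unit mass matrix, and Gaussian target are illustrative assumptions for this sketch, not the adaptive choices described above.

```python
import numpy as np

# Underdamped Langevin SDE:
#   dq = p dt
#   dp = (-grad U(q) - gamma * p) dt + sqrt(2 * gamma) dW
# whose stationary distribution in q is proportional to exp(-U(q)).
# Here U(q) = q^2 / 2, so the stationary distribution is N(0, 1).

rng = np.random.default_rng(0)
gamma, dt = 1.0, 0.02          # damping and step size (illustrative values)
n_chains, n_steps = 5000, 2000  # many short chains, run in parallel

grad_U = lambda q: q            # gradient of the potential U(q) = q^2 / 2

q = rng.standard_normal(n_chains)  # positions (the sampled variable)
p = rng.standard_normal(n_chains)  # auxiliary momenta

for _ in range(n_steps):
    # Euler-Maruyama update of the momentum equation
    p += (-grad_U(q) - gamma * p) * dt \
         + np.sqrt(2 * gamma * dt) * rng.standard_normal(n_chains)
    # and of the position equation
    q += p * dt

print(q.mean(), q.var())  # both should be near the target's 0 and 1
```

With small `dt` the empirical mean and variance of `q` stay close to the target's 0 and 1, up to an O(`dt`) discretization bias; adapting `gamma` and the mass matrix to the local curvature of `U`, as described above, is what accelerates convergence and decorrelates successive samples.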