Will Tebbutt

University of Cambridge

Probabilistic Programming and Gaussian Processes in Julia

The goal of a Probabilistic Programming Language (PPL) is to make probabilistic modeling as composable and straightforward to use as Deep Learning (DL). A PPL enables domain experts to compose probabilistic models and perform inference without having to worry about precisely how inference happens. This is analogous to how DL frameworks let you compose various layers and learn their parameters without having to worry about the details of how their gradients are computed.
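
To make this concrete, here is a minimal sketch of that workflow using Turing.jl (an assumption on my part; the abstract names no particular package, and the API shown is that of recent Turing versions): the model is composed declaratively, and inference is invoked as a separate, generic step.

    using Turing

    # A tiny model: unknown mean μ, known unit observation noise.
    @model function gaussian_mean(y)
        μ ~ Normal(0, 10)        # prior over the unknown mean
        for n in eachindex(y)
            y[n] ~ Normal(μ, 1)  # likelihood of each observation
        end
    end

    # Inference is generic: NUTS() could be swapped for another sampler
    # without touching the model definition.
    chain = sample(gaussian_mean(randn(10) .+ 3), NUTS(), 1_000)

The separation between the @model block and the call to sample is exactly the division of labour described above: the domain expert writes the former and never needs to open up the latter.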

Unfortunately, a PPL that works well in practice is significantly harder to implement than a DL framework. This is because a PPL ultimately approximates high-dimensional integrals (hard), whereas a DL framework is concerned with computing high-dimensional gradients (easy-ish). There is, however, an edge case in which probabilistic programming is easy: if all of the random variables in our program are jointly Gaussian, then these high-dimensional integrals can be computed exactly. Perhaps surprisingly, by considering very large collections of jointly Gaussian random variables, known as Gaussian processes (GPs), we can construct a large class of useful probabilistic models for functions.
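
As a sketch of why this case is tractable (the kernel, noise level, and all names below are illustrative assumptions, not anything from the abstract): given noisy observations of a GP, the posterior at any set of test inputs is again Gaussian, with mean and covariance available in closed form via standard linear algebra.

    using LinearAlgebra

    # Squared-exponential kernel with lengthscale ℓ (an arbitrary choice).
    k(x, x′; ℓ=1.0) = exp(-abs2(x - x′) / (2ℓ^2))

    # Exact GP regression: prior, likelihood, and posterior are all
    # Gaussian, so no approximate inference is needed.
    function gp_posterior(x, y, xs; σ²=0.1)
        K   = [k(a, b) for a in x,  b in x]   # train/train covariance
        Ks  = [k(a, b) for a in xs, b in x]   # test/train covariance
        Kss = [k(a, b) for a in xs, b in xs]  # test/test covariance
        C   = cholesky(Symmetric(K + σ² * I))
        m   = Ks * (C \ y)                    # posterior mean
        V   = Kss - Ks * (C \ Ks')            # posterior covariance
        return m, V
    end

    m, V = gp_posterior(rand(10), randn(10), range(0, 1; length=50))

The cost is a single O(n³) Cholesky factorisation of the training covariance, which is the usual price of exact GP inference.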

In this poster I discuss how these ideas can be put to work in Julia.

Speaker's bio

Will is a PhD student in the Computational and Biological Learning lab at the University of Cambridge, working on problems in probabilistic machine learning. He is particularly interested in Gaussian processes, automating Bayesian inference, and the use of machine learning to solve problems in climate science. When not working, he can be found playing the guitar or listening to people play it well.