June 21, 2020

Probabilistic modeling using normalizing flows pt.1

Probabilistic models give a rich representation of observed data and allow us to quantify uncertainty, detect outliers, and perform simulations. Classic probabilistic modeling requires us to model our domain with conditional probabilities, which is not always feasible. This is particularly true for high-dimensional data such as images or audio. In these scenarios, we would like to learn the data distribution without all the modeling assumptions. Normalizing flows are a powerful class of models that allow us to do just that, without resorting to approximations. Read more
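
For intuition: a flow pushes a simple base density p_Z through an invertible, differentiable map f, and the model density follows exactly from the change-of-variables formula. This is the standard identity behind all normalizing flows, not something specific to this post:

    % Change of variables for x = f(z), with f invertible and differentiable
    % and z drawn from a simple base density p_Z (e.g. a standard Gaussian).
    p_X(x) = p_Z\big(f^{-1}(x)\big)\,
             \left|\det \frac{\partial f^{-1}(x)}{\partial x}\right|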

May 14, 2020

Bayesian inference with Stochastic Gradient Langevin Dynamics

Modern machine learning algorithms can scale to enormous datasets and reach superhuman accuracy on specific tasks. Yet, they are largely incapable of answering “I don’t know” when queried with new data. Taking a Bayesian approach to learning lets models be uncertain about their predictions, but classical Bayesian methods do not scale to modern settings. In this post we are going to use Julia to explore Stochastic Gradient Langevin Dynamics (SGLD), an algorithm which makes it possible to apply Bayesian learning to deep learning models and still train them on a GPU with mini-batched data. Read more
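
As a preview of the core idea: an SGLD update is just a mini-batch gradient step on the log posterior with Gaussian noise scaled by the step size added in. A minimal sketch in Julia, where the function name and arguments are hypothetical and the caller is assumed to supply the gradient functions (this is not the post's actual code):

    # One SGLD update:
    # θ ← θ + (ε/2) (∇log p(θ) + (N/n) Σᵢ ∇log p(xᵢ | θ)) + η,  η ~ N(0, ε I)
    # N is the full dataset size, n the mini-batch size, ε the step size.
    function sgld_step(θ, ∇logprior, ∇loglik, minibatch, N, ε)
        n = length(minibatch)
        g = ∇logprior(θ) .+ (N / n) .* sum(∇loglik(θ, x) for x in minibatch)
        return θ .+ (ε / 2) .* g .+ sqrt(ε) .* randn(length(θ))
    end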

April 18, 2020

Setting up Julia and Jupyter with GPU support

Did you know the J in Jupyter refers to Julia? It is easy to miss that Jupyter supports loads of different languages since Python is so dominant. This is unsurprising since Python just works out of the box, while getting other languages running requires some tinkering. In this post we are going to look into setting up Jupyter together with a Julia kernel, and package everything into a Docker container. As an extra, we are going to do this with GPU support. Read more
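
For reference, the Julia side of the setup boils down to installing IJulia, which registers a Julia kernel that Jupyter can discover and launch. This is the standard installation step, not the post's full Docker and GPU setup:

    # From the Julia REPL: install IJulia, which registers a Julia
    # kernel spec with Jupyter.
    using Pkg
    Pkg.add("IJulia")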

© Sebastian Callh 2020