August 22, 2023

Bayesian ordinal regression for wine data

A while ago I wanted to explore my career options, so I did a bit of interviewing at various companies. In one of the technical interviews, I was tasked with analysing a dataset and building a predictive model. Noticing that the target variable was ordinal, I decided to build an ordinal regression model using a Bayesian approach. I'm guessing ordinal regression and Bayesian methods aren't that well known, because the interviewers were completely unfamiliar with them and somewhat sceptical. Read more
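
For the unfamiliar: a common way to set up ordinal regression (a sketch, not necessarily the exact formulation in the post) is the cumulative logit model, which maps a linear predictor through ordered cutpoints:

\[ P(y \le k \mid x) = \sigma(\theta_k - x^\top \beta), \qquad \theta_1 < \theta_2 < \dots < \theta_{K-1}, \]

where \(\sigma\) is the logistic function and \(y \in \{1, \dots, K\}\). The Bayesian treatment places priors on \(\beta\) and the cutpoints \(\theta_k\) and infers their posterior.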

May 22, 2023

A Bayesian beer tasting

My mates and I are big fans of craft beer, and from time to time we organise our own beer tastings. Each participant gives a single score between 1 and 5 (half-point increments are allowed) to each beer, and afterwards we compare scores to see which beer came out on top. Sour beers and imperial stouts are some of our favourite styles, but this time we decided to do something different: a blind tasting with the cheapest lager we could find. Read more

May 9, 2021

VAEs as a framework for probabilistic inference

VAEs frequently get compared to GANs and then dismissed because “GANs produce better samples”. While this might be true for specific VAEs, I think it sells VAEs short. Do I claim that VAEs generate better samples of imaginary celebrities? No (but they are pretty good at that too). What I mean is that they are qualitatively different and much more general than people give them credit for. In this article we are going to consider VAEs as a family of latent variable models and discover that they offer a unified black-box inference framework for probabilistic modelling. Read more
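
As a rough sketch of what “unified black-box inference” means here: a VAE pairs a latent variable model \(p(x \mid z)\,p(z)\) with an approximate posterior \(q(z \mid x)\) and maximises the evidence lower bound (ELBO)

\[ \log p(x) \ge \mathbb{E}_{q(z \mid x)}\left[\log p(x \mid z)\right] - \mathrm{KL}\left(q(z \mid x) \,\|\, p(z)\right), \]

so any model whose densities can be evaluated and differentiated can, in principle, be plugged into the same training loop.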

May 14, 2020

Bayesian inference with Stochastic Gradient Langevin Dynamics

Modern machine learning algorithms can scale to enormous datasets and reach superhuman accuracy on specific tasks. Yet, they are largely incapable of answering “I don’t know” when queried with new data. Taking a Bayesian approach to learning lets models be uncertain about their predictions, but classical Bayesian methods do not scale to modern settings. In this post we are going to use Julia to explore Stochastic Gradient Langevin Dynamics (SGLD), an algorithm which makes it possible to apply Bayesian learning to deep learning models and still train them on a GPU with mini-batched data. Read more
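
For reference, the SGLD update (following Welling and Teh, 2011) adds Gaussian noise to a mini-batch stochastic gradient step on the log posterior:

\[ \Delta\theta_t = \frac{\epsilon_t}{2}\left(\nabla \log p(\theta_t) + \frac{N}{n}\sum_{i=1}^{n} \nabla \log p(x_i \mid \theta_t)\right) + \eta_t, \qquad \eta_t \sim \mathcal{N}(0, \epsilon_t I), \]

where \(N\) is the dataset size, \(n\) the mini-batch size, and \(\epsilon_t\) a decaying step size. The iterates \(\theta_t\) then serve as approximate samples from the posterior.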

© Sebastian Callh 2020