August 22, 2023

Bayesian Ordinal Regression for Wine data

A while ago I wanted to explore my career options, so I did a bit of interviewing with various companies. In one of the technical interviews, I was tasked with analysing a dataset and building a predictive model. Noticing that the target variable was ordinal, I decided to build an ordinal regression model using a Bayesian approach. Now, I’m guessing ordinal regression and Bayesian methods aren’t that well known, because the interviewers were completely unfamiliar with them and somewhat sceptical. Read more
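To make the approach concrete, here is a minimal sketch of a Bayesian ordinal regression model in PyMC. The data and names (X, y, n_classes) are synthetic placeholders, not the interview dataset, and the post’s actual model may differ.

```python
# Minimal Bayesian ordinal regression sketch (synthetic data, not the post's dataset)
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))       # hypothetical features
y = rng.integers(0, 5, size=100)    # hypothetical ordinal target with 5 levels
n_classes = 5

with pm.Model() as model:
    beta = pm.Normal("beta", 0.0, 1.0, shape=X.shape[1])
    # Ordered cutpoints partition the latent scale into the ordinal categories
    cutpoints = pm.Normal(
        "cutpoints", 0.0, 5.0, shape=n_classes - 1,
        transform=pm.distributions.transforms.ordered,
        initval=np.linspace(-2.0, 2.0, n_classes - 1),
    )
    eta = pm.math.dot(X, beta)      # latent score per observation
    pm.OrderedLogistic("y_obs", eta=eta, cutpoints=cutpoints, observed=y)
    idata = pm.sample()             # posterior over beta and cutpoints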

May 9, 2021

VAEs as a framework for probabilistic inference

VAEs frequently get compared to GANs, and then dismissed because “GANs produce better samples”. While this might be true for specific VAEs, I think it sells VAEs short. Do I claim that VAEs generate better samples of imaginary celebrities? No (but they are also pretty good at that). What I mean is that VAEs are qualitatively different and much more general than people give them credit for. In this article we are going to consider VAEs as a family of latent variable models and discover that they offer a unified black-box inference framework for probabilistic modelling. Read more
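The core of that inference framework is the evidence lower bound (ELBO). As a rough sketch (illustrative, not the article’s exact model), a tiny PyTorch VAE objective might look like this; the architecture and dimensions are assumptions.

```python
# Minimal VAE ELBO sketch in PyTorch (illustrative architecture)
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyVAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=8):
        super().__init__()
        self.enc = nn.Linear(x_dim, 2 * z_dim)  # mean and log-variance of q(z|x)
        self.dec = nn.Linear(z_dim, x_dim)      # Bernoulli logits of p(x|z)

    def elbo(self, x):
        mu, log_var = self.enc(x).chunk(2, dim=-1)
        # Reparameterisation trick keeps z differentiable w.r.t. mu and log_var
        z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()
        logits = self.dec(z)
        recon = -F.binary_cross_entropy_with_logits(logits, x, reduction="none").sum(-1)
        # KL(q(z|x) || N(0, I)) in closed form for diagonal Gaussians
        kl = -0.5 * (1 + log_var - mu.pow(2) - log_var.exp()).sum(-1)
        return (recon - kl).mean()  # training maximises this (minimises -ELBO)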

February 2, 2021

Time series forecasting with Spectral Mixture Kernels

Time series modelling is a fundamental yet difficult problem. Forecasting in particular is incredibly challenging and requires strong inductive biases to give good predictions. One powerful framework for encoding inductive biases is the kernel functions used with Gaussian Processes (GPs); however, kernels require manual work to embed domain knowledge, which might not always be desirable. One might ask whether we can learn kernel structures directly from the data, and indeed the answer is yes! Read more
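For a concrete picture, here is a minimal GPyTorch sketch of a GP with a Spectral Mixture kernel, which learns its frequency components from the data. The toy series and hyperparameters are assumptions for illustration.

```python
# Minimal Spectral Mixture kernel GP sketch in GPyTorch (toy data)
import torch
import gpytorch

class SMGP(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood, num_mixtures=4):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.SpectralMixtureKernel(num_mixtures=num_mixtures)
        # Initialising the mixture parameters from the data helps optimisation
        self.covar_module.initialize_from_data(train_x, train_y)

    def forward(self, x):
        mean = self.mean_module(x)
        covar = self.covar_module(x)
        return gpytorch.distributions.MultivariateNormal(mean, covar)

train_x = torch.linspace(0, 1, 100)
train_y = torch.sin(train_x * 12.0) + 0.1 * torch.randn(100)  # hypothetical series
likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = SMGP(train_x, train_y, likelihood)
# Fitting then proceeds by maximising the marginal log-likelihood, e.g. with
# gpytorch.mlls.ExactMarginalLogLikelihood and a standard torch optimiser.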

June 22, 2020

Probabilistic modeling using normalizing flows pt.2

This is a follow-up post where we will see how to apply a normalizing flow model to learn the density of observed data. If you are not familiar with these models, I recommend checking out the first part, which explains how normalizing flows work and the math behind them. The promise of normalizing flows is that we can learn probability densities over our observations without having to model our entire domain by hand. Read more
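The fitting step boils down to maximising the log-likelihood of the data under the flow. As a self-contained toy sketch (using a learnable affine transform as a stand-in for real flow layers), it could look like this:

```python
# Toy maximum-likelihood fit of a flow using torch.distributions
# (an affine transform stands in for real flow layers)
import torch
import torch.distributions as D

data = torch.randn(1000, 2) * 2.0 + 3.0        # hypothetical observations

shift = torch.zeros(2, requires_grad=True)
log_scale = torch.zeros(2, requires_grad=True)
opt = torch.optim.Adam([shift, log_scale], lr=1e-2)
base = D.Normal(torch.zeros(2), torch.ones(2))  # simple base density p(z)

for step in range(500):
    flow = D.TransformedDistribution(base, [D.AffineTransform(shift, log_scale.exp())])
    loss = -flow.log_prob(data).sum(-1).mean()  # negative log-likelihood
    opt.zero_grad()
    loss.backward()
    opt.step()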

June 21, 2020

Probabilistic modeling using normalizing flows pt.1

Probabilistic models give a rich representation of observed data and allow us to quantify uncertainty, detect outliers, and perform simulations. Classic probabilistic modeling requires us to model our domain with conditional probabilities, which is not always feasible. This is particularly true for high-dimensional data such as images or audio. In these scenarios, we would like to learn the data distribution without all the modeling assumptions. Normalizing flows are a powerful class of models that allow us to do just that, without resorting to approximations. Read more
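The mechanism behind that claim is the change-of-variables formula: if x = f(z) for an invertible f, then log p(x) = log p_z(f⁻¹(x)) + log |det df⁻¹/dx|. A tiny sketch with an elementwise affine bijection (an illustrative toy, not a full flow):

```python
# Change-of-variables density computation for a toy affine bijection
import torch

def forward(z, log_scale, shift):
    # x = f(z) = z * exp(log_scale) + shift
    # log|det df/dz| = sum(log_scale) for an elementwise affine map
    return z * log_scale.exp() + shift, log_scale.sum(-1)

def log_prob(x, log_scale, shift):
    # log p(x) = log p_z(f^{-1}(x)) - log|det df/dz|
    z = (x - shift) / log_scale.exp()
    base = torch.distributions.Normal(0.0, 1.0)
    return base.log_prob(z).sum(-1) - log_scale.sum(-1)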

© Sebastian Callh 2020