University of Amsterdam
Science Park, Lab 42, L4.04
I completed my PhD, on lossless compression with latent variable models, in 2020, supervised by Professor David Barber at the UCL AI Centre in London. Most of my research to date has been on deep generative models and lossless compression. I’m also interested in unsupervised learning more generally, approximate inference, Monte Carlo methods, optimization, and the design of machine learning software systems.
During my PhD I spent a lot of time working on Autograd, an automatic differentiation library for Python/NumPy. I interned with Matthew Johnson at Google Brain in San Francisco in spring 2018, where I was fortunate to work on JAX during the early stages of the project.