GPJax

GPJax is a didactic Gaussian process package, written entirely in JAX, targeted at researchers who wish to develop their own custom Gaussian process models. The code is fully compatible with GPUs and TPUs and achieves runtimes competitive with other GP packages written in Python. In addition to standard GP regression, GPJax supports inference in non-conjugate models and scalable approximations through sparse schemes. Custom GP modelling for graphs and Wasserstein barycentres is also supported.
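To give a flavour of the kind of computation such a package builds on, the sketch below implements conjugate GP regression directly in JAX rather than through GPJax's own API: an RBF kernel, the Gaussian marginal log-likelihood, and a gradient of that objective via jax.grad. All function and variable names are illustrative, not part of the package.

```python
import jax
import jax.numpy as jnp
import jax.scipy.linalg

def rbf_kernel(x1, x2, lengthscale, variance):
    # Squared-exponential kernel evaluated on all pairs of 1-D inputs.
    sq_dists = (x1[:, None] - x2[None, :]) ** 2
    return variance * jnp.exp(-0.5 * sq_dists / lengthscale ** 2)

def neg_marginal_log_lik(params, x, y):
    # Negative log p(y | x, params) for a zero-mean GP with Gaussian observation noise.
    K = rbf_kernel(x, x, params["lengthscale"], params["variance"])
    K = K + params["noise"] * jnp.eye(x.shape[0])
    L = jnp.linalg.cholesky(K)
    alpha = jax.scipy.linalg.cho_solve((L, True), y)
    return (0.5 * y @ alpha
            + jnp.sum(jnp.log(jnp.diag(L)))
            + 0.5 * x.shape[0] * jnp.log(2.0 * jnp.pi))

# Toy dataset and a single gradient evaluation of the hyperparameters.
key = jax.random.PRNGKey(0)
x = jnp.linspace(-3.0, 3.0, 50)
y = jnp.sin(x) + 0.1 * jax.random.normal(key, (50,))
params = {"lengthscale": 1.0, "variance": 1.0, "noise": 0.1}
grads = jax.grad(neg_marginal_log_lik)(params, x, y)
```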

See GitHub for the code repository and JOSS for the supporting paper.

GaussianProcesses.jl

This package provides Gaussian process models in the Julia programming language. We make use of two core features of Julia, multiple dispatch and just-in-time (JIT) compilation, which give a highly intuitive API with exceptionally efficient computation. We currently enable inference in a broad range of models by supporting a large number of likelihood and kernel functions. Further, fully Bayesian inference is supported through Markov chain Monte Carlo (MCMC) schemes such as Hamiltonian Monte Carlo and elliptical slice sampling. Finally, implementations of several sparsity-inducing schemes allow GP models to be fitted to large datasets.
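As an illustration of one of the MCMC schemes mentioned above, the sketch below gives a minimal elliptical slice sampling transition. It is written in Python/JAX for consistency with the other examples on this page (the package itself is in Julia) and does not reflect the package's API.

```python
import jax
import jax.numpy as jnp

def elliptical_slice_step(key, f, chol_prior, log_lik):
    # One elliptical slice sampling transition for a latent vector f with a
    # N(0, Sigma) prior, where chol_prior is the lower Cholesky factor of Sigma.
    key_nu, key_u, key_theta, key_loop = jax.random.split(key, 4)
    nu = chol_prior @ jax.random.normal(key_nu, f.shape)       # auxiliary prior draw
    log_threshold = log_lik(f) + jnp.log(jax.random.uniform(key_u))
    theta = jax.random.uniform(key_theta, minval=0.0, maxval=2.0 * jnp.pi)
    theta_min, theta_max = theta - 2.0 * jnp.pi, theta
    while True:
        f_prop = f * jnp.cos(theta) + nu * jnp.sin(theta)      # point on the ellipse
        if log_lik(f_prop) > log_threshold:
            return f_prop
        # Rejected: shrink the bracket towards theta = 0 and resample the angle.
        if theta < 0.0:
            theta_min = theta
        else:
            theta_max = theta
        key_loop, key_angle = jax.random.split(key_loop)
        theta = jax.random.uniform(key_angle, minval=theta_min, maxval=theta_max)
```

Each call returns a new latent sample; iterating the step yields a Markov chain whose stationary distribution is the posterior over the latent function.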

See GitHub for the code repository and arXiv for the supporting paper.

SteinGP

This package is written purely in Python and extends the TensorFlow and GPflow libraries to perform GP inference using Stein variational gradient descent (SVGD). SVGD can be seen as an alternative to MCMC and variational inference (VI). Its appeal is that, unlike VI schemes, it permits fully Bayesian inference in sparse GP models, while scaling far better than MCMC. The package currently supports classification and regression, with stochastic modelling soon to be added. Thanks to the TensorFlow backend, these models can be fitted more efficiently on GPUs, should they be available to the user.
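The sketch below shows a single generic SVGD update on a toy target. It is written against plain JAX rather than the package's TensorFlow code, and the kernel choice, step size, and function names are illustrative only.

```python
import jax
import jax.numpy as jnp

def rbf(x, y, bandwidth=1.0):
    # RBF kernel between two particles, used to couple the SVGD updates.
    return jnp.exp(-jnp.sum((x - y) ** 2) / (2.0 * bandwidth ** 2))

def svgd_step(particles, log_prob, step_size=0.1, bandwidth=1.0):
    # One SVGD update: each particle moves along a kernel-weighted average of the
    # particles' score functions (attraction towards high density) plus the gradient
    # of the kernel (repulsion, which keeps the particles spread over the posterior).
    n = particles.shape[0]
    score = jax.vmap(jax.grad(log_prob))(particles)                              # (n, d)
    kxy = jax.vmap(
        lambda xi: jax.vmap(lambda xj: rbf(xi, xj, bandwidth))(particles)
    )(particles)                                                                 # (n, n)
    grad_k = jax.vmap(
        lambda xi: jax.vmap(lambda xj: jax.grad(rbf)(xj, xi, bandwidth))(particles)
    )(particles)                                                                 # (n, n, d)
    phi = (kxy @ score + grad_k.sum(axis=1)) / n
    return particles + step_size * phi

# Toy usage: transport 100 particles towards an unnormalised Gaussian target.
key = jax.random.PRNGKey(0)
particles = jax.random.normal(key, (100, 2))
log_prob = lambda x: -0.5 * jnp.sum((x - 2.0) ** 2)
for _ in range(200):
    particles = svgd_step(particles, log_prob)
```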

See GitHub for the code repository and arXiv for the supporting paper.

Open Source Contributions

In addition to the packages above, I have also contributed to the following open-source packages: GPflow, Distrax, and NumPyro.