Bayesian Synthetic Difference-in-Differences via Cut Posteriors
A NumPyro implementation of Bayesian Synthetic Difference-in-Differences using modular inference and cut posteriors.
Let’s explore how prior predictive checks can help you judge whether your priors are reasonable before you see any data. This step in the Bayesian workflow is often overlooked, yet it can save you from fitting models that encode assumptions you never intended.
A NumPyro implementation of Bayesian Synthetic Control Methods.
A NumPyro implementation of Kruschke’s “Bayesian Estimation Supersedes the t-Test” (BEST).
A gentle introduction to variational inference applied to Bayesian logistic regression, with an accompanying PyTorch implementation.
A short post on proper scoring rules and their connection to divergence measures.
Let’s derive the evidence lower bound used in variational inference using Jensen’s inequality and the Kullback-Leibler divergence.
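A compact version of that derivation, starting from the marginal likelihood and applying Jensen’s inequality to the log of an expectation:

```latex
\log p(x)
  = \log \int p(x, z)\, dz
  = \log \mathbb{E}_{q(z)}\!\left[\frac{p(x, z)}{q(z)}\right]
  \geq \mathbb{E}_{q(z)}\!\left[\log \frac{p(x, z)}{q(z)}\right]
  =: \mathrm{ELBO}(q),
```

with the slack in the bound being exactly the Kullback-Leibler divergence to the true posterior, $\log p(x) - \mathrm{ELBO}(q) = \mathrm{KL}\big(q(z)\,\|\,p(z \mid x)\big)$, so maximising the ELBO minimises that divergence.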
A Gaussian process (GP) implementation using JAX that shows kernel computation, conditioning of Gaussian distributions, parameter transformations, and gradient-based optimisation.
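A minimal sketch of the kernel-and-conditioning core of such a GP regression, assuming a squared-exponential kernel and standard Gaussian conditioning (function names and hyperparameter values are illustrative, not the post’s actual code):

```python
import jax.numpy as jnp

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between two 1-D input vectors
    sq_dist = (x1[:, None] - x2[None, :]) ** 2
    return variance * jnp.exp(-0.5 * sq_dist / lengthscale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-3):
    # Condition the GP prior on training data via the standard
    # Gaussian conditioning formulas, using a Cholesky factorisation
    K = rbf_kernel(x_train, x_train) + noise * jnp.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    L = jnp.linalg.cholesky(K)
    alpha = jnp.linalg.solve(L.T, jnp.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                 # posterior mean at x_test
    v = jnp.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v                 # posterior covariance at x_test
    return mean, cov

# Tiny usage example: fit sin(x) at three points, predict at x = 0.5
x = jnp.array([-1.0, 0.0, 1.0])
y = jnp.sin(x)
mean, cov = gp_posterior(x, y, jnp.array([0.5]))
```

In a full implementation the lengthscale and variance would be transformed to unconstrained space and optimised by gradient descent on the marginal likelihood, which is where JAX’s autodiff comes in.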