Department of Statistics & Actuarial Science, HKU
 
 

Seminar by Dr. Edwin FONG, Lead Data Scientist, Methods, Innovation & Outreach, Novo Nordisk


Date: Monday, 5 December 2022
Time: 4:00 p.m. – 5:00 p.m.
Venue: Via Zoom

Title: Martingale posterior distributions
Abstract

The prior distribution is the usual starting point for Bayesian uncertainty. In this seminar, we present a different perspective which focuses on missing observations as the source of statistical uncertainty, with the parameter of interest being known precisely given the entire population. We argue that the foundation of Bayesian inference is to assign a distribution on missing observations conditional on what has been observed. In the i.i.d. setting with an observed sample of size n, the Bayesian would thus assign a predictive distribution on the missing Y_{n+1:∞} conditional on Y_{1:n}, which then induces a distribution on the parameter. We utilize Doob’s theorem, which relies on martingales, to show that choosing the Bayesian predictive distribution returns the conventional posterior as the distribution of the parameter. Taking this as our cue, we relax the predictive machine, avoiding the need for the predictive to be derived solely from the usual prior to posterior to predictive density formula. We introduce the martingale posterior distribution, which returns Bayesian uncertainty on any statistic via the direct specification of the joint predictive. To that end, we introduce new predictive methodologies for multivariate density estimation, regression and classification that build upon recent work on bivariate copulas.
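As a rough illustration of the idea (not material from the talk), the sketch below approximates a martingale posterior by forward-simulating the missing observations from a simple Dirichlet process (Pólya urn) predictive rather than the copula-based predictives mentioned in the abstract; the function names, the N(0, 1) base measure, and all parameter choices are assumptions made for the example.

```python
import numpy as np

def predictive_resample(y_obs, alpha=1.0, n_forward=2000, base_sampler=None, rng=None):
    """One forward pass: impute the missing observations Y_{n+1:N} one at a
    time from the Polya urn (Dirichlet process) predictive and return the
    completed sample."""
    rng = rng if rng is not None else np.random.default_rng()
    base_sampler = base_sampler or (lambda r: r.standard_normal())  # base measure G0, assumed N(0, 1)
    y = list(y_obs)
    for _ in range(n_forward):
        n = len(y)
        if rng.random() < n / (alpha + n):
            y.append(y[rng.integers(n)])   # with prob n/(alpha+n): repeat a past value
        else:
            y.append(base_sampler(rng))    # with prob alpha/(alpha+n): fresh draw from G0
    return np.asarray(y)

def martingale_posterior(y_obs, statistic=np.mean, n_draws=500, **kwargs):
    """Repeat the forward simulation n_draws times; the statistic of each
    completed population is one draw from the (approximate) martingale posterior."""
    rng = np.random.default_rng(0)
    return np.array([statistic(predictive_resample(y_obs, rng=rng, **kwargs))
                     for _ in range(n_draws)])

# Example: uncertainty about the population mean from an observed sample of size 30
y_obs = np.random.default_rng(1).normal(loc=2.0, scale=1.0, size=30)
draws = martingale_posterior(y_obs, statistic=np.mean)
print(draws.mean(), draws.std())  # centre and spread of the induced distribution on the mean
```

The key point the sketch tries to convey is that uncertainty comes entirely from the unobserved remainder of the population: once the missing observations are imputed from the joint predictive, the statistic of the completed sample is a single draw from the martingale posterior.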

About the speaker

Dr. Edwin Fong is a data scientist within Methods, Innovation & Outreach at Novo Nordisk, where he develops and applies novel statistical methods within the healthcare setting in collaboration with academia. Previously, he completed his PhD at the Department of Statistics, University of Oxford and the Alan Turing Institute, supervised by Professor Chris Holmes. He is broadly interested in causal inference, machine learning, and Bayesian inference, with a primary focus on applications to clinical trial and observational data. During his PhD, he focused on the foundational intersection between prediction and Bayesian inference. He also investigated scalable Bayesian nonparametric methods such as the Bayesian bootstrap, with a focus on computation and model misspecification.