PhD Defense: Alexander Murph
21 Apr @ 10:00 am - 12:00 pm
Generalized Fiducial Inference on Differentiable Manifolds
We consider Generalized Fiducial (GF) and Bayesian inference problems on parameter spaces that take values on Riemannian manifolds. In the GF case, we introduce the Constrained Generalized Fiducial Distribution (CGFD), a sub-manifold analogue of a GF distribution on an ambient Euclidean space. We prove several results for the CGFD, including a “Bernstein–von Mises (BvM) Inheritance” property, whereby a BvM result for a GF distribution transfers to its CGFD analogue.
We apply a Bayesian manifold-learning approach to a model-monitoring problem at the Mayo Clinic in Rochester, MN. When a predictive model is in production, it must be monitored over time to ensure that its performance does not suffer from drift or abrupt changes in the data. Typically, this is done by comparing the algorithm’s predictions against outcome data and verifying that the algorithm maintains an acceptable level of accuracy over time. However, it is far preferable to learn in real time about major changes in the input data that could affect the model’s performance, long before a drop in performance becomes visible in the outcome data. We consider the problem of change-point detection on high-dimensional longitudinal data with mixed variable types and missing values. The approach fits an array of Gaussian Graphical Mixture Models (GGMMs) to groupings of homogeneous data in time, called regimes, which are modeled as the observed states of a Markov process with unknown transition probabilities. Because the parameter space of GGMMs can be framed as a Riemannian manifold, we compare this Bayesian approach to the CGFD approach.
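The monitoring idea above can be illustrated with a minimal sketch (not the dissertation's method, which uses GGMMs over regimes): fit a Gaussian model to a reference window of input data, score later batches by their average log-likelihood under that model, and flag a regime change when the score drops sharply. All data, dimensions, and thresholds here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: two "regimes" of 5-dimensional inputs, with a
# mean shift at t = 100 acting as the change point to be detected.
d, t_change = 5, 100
regime1 = rng.normal(0.0, 1.0, size=(t_change, d))
regime2 = rng.normal(3.0, 1.0, size=(t_change, d))
data = np.vstack([regime1, regime2])

# Fit a Gaussian to an initial reference window of input data.
ref = data[:50]
mu, cov = ref.mean(axis=0), np.cov(ref, rowvar=False)
cov_inv = np.linalg.inv(cov)
_, logdet = np.linalg.slogdet(cov)

def batch_loglik(batch):
    """Average Gaussian log-likelihood of a batch under the reference model."""
    diff = batch - mu
    quad = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
    return np.mean(-0.5 * (quad + logdet + d * np.log(2 * np.pi)))

# Score consecutive batches of 20 observations; flag a change when a
# batch's score falls well below the reference score (threshold is ad hoc).
scores = [batch_loglik(data[i:i + 20]) for i in range(0, len(data), 20)]
baseline = scores[0]
flags = [s < baseline - 8.0 for s in scores]
```

Monitoring input-data likelihood this way surfaces distribution shift without waiting for labeled outcomes, which is the practical motivation described above; the full approach replaces this single Gaussian with GGMMs and models regime transitions as a Markov process.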