Lecture Series-Greg Terlov
6 Oct @ 3:30 pm - 4:30 pm
Introduction to Stein’s method
Abstract: In this lecture series I will give a gentle introduction to Stein's method for proving distributional convergence. In 1972 Charles Stein combined Gaussian integration by parts with a certain "noise robustness" property of a sequence of random variables to derive the Central Limit Theorem (CLT) with an explicit bound on the rate of convergence. In particular, this gave a completely new way of proving the CLT by converting the problem into one of understanding the local behavior of the model. Over the last 50 years, this idea has been generalized and extended to a plethora of settings, such as different limiting distributions (Poisson, exponential, geometric, etc.), various dependency structures, Malliavin calculus, concentration inequalities, and more. More recently, Stein's method has led to advances in computational statistics, machine learning, and sample quality testing. I will focus on the fundamentals of Stein's method and present several approaches for establishing Berry–Esseen-type theorems for sums of random variables under various dependency assumptions.
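The "Gaussian integration by parts" mentioned above refers to Stein's identity: a random variable Z is standard normal if and only if E[f'(Z)] = E[Z f(Z)] for all sufficiently smooth f. As a minimal sketch (not part of the talk itself), the following Python snippet checks this identity numerically with the arbitrary test function f(x) = sin(x):

```python
# Numerical sanity check of Stein's identity E[f'(Z)] = E[Z f(Z)]
# for Z ~ N(0,1). Stein's method rests on the fact that a distribution
# which *approximately* satisfies this identity is close to Gaussian.
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(1_000_000)  # samples of Z ~ N(0,1)

f = np.sin        # arbitrary smooth test function f
f_prime = np.cos  # its derivative f'

lhs = f_prime(z).mean()  # Monte Carlo estimate of E[f'(Z)]
rhs = (z * f(z)).mean()  # Monte Carlo estimate of E[Z f(Z)]
print(lhs, rhs)          # both close to E[cos Z] = exp(-1/2)
```

For a non-Gaussian distribution the two sides would differ, and quantifying that discrepancy over a class of test functions is exactly how Stein's method produces explicit Berry–Esseen-type bounds.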