Colloquium: Richard Rimanyi (UNC)
14 Oct @ 3:30 pm - 4:30 pm
Richard Rimanyi (UNC Chapel Hill, Department of Mathematics):
On a geometric problem of machine learning
In this talk we will enumerate the main reasons for a collection of matrices multiplying to 0. Our motivation is Singular (Bayesian) Learning Theory, where one of the goals is to progressively approximate an unknown distribution using data generated from that distribution. A key component in this framework is a function K (the relative entropy, or Kullback-Leibler divergence), which is often highly singular. Such cases, for example neural networks, require more geometric tools than the study of regular statistical models. The invariants of the singularities of K (in the style of the 'log canonical threshold') are related to how well the model generalizes, or, in machine learning terms, how efficiently the model can be trained. Computing these singularity invariants in real-life machine learning scenarios is notoriously difficult. In this talk, we focus on an elementary example and compute the learning coefficients of Linear Neural Networks. Joint work with S. P. Lehalleur.
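The opening question, when a product of matrices can vanish, already has nontrivial answers: two nonzero matrices can multiply to zero whenever the column space of the second lies inside the null space of the first. A minimal sketch (assuming NumPy; the specific matrices are illustrative and not taken from the talk):

```python
import numpy as np

# Two nonzero 2x2 matrices whose product is the zero matrix.
# A kills the vector (0, 1); every column of B is a multiple of (0, 1),
# so the column space of B sits inside the null space of A.
A = np.array([[1, 0],
              [1, 0]])

B = np.array([[0, 0],
              [1, 1]])

product = A @ B
print(product)  # the 2x2 zero matrix
```

Classifying all the ways such rank conditions can occur, for longer chains of matrices, is the geometric problem behind linear neural networks, where the model is exactly a product of weight matrices.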