STOR Colloquium: Xiuyuan Chen, Duke University
October 17 @ 3:30 pm - 4:30 pm
Convergence of Gaussian kernelized graph Laplacian: eigen-convergence and bi-stochastic normalization
Consider a kernelized graph affinity matrix constructed from $N$ data points sampled i.i.d. from a general unknown $d$-dimensional manifold embedded in a possibly high-dimensional space. This setting is generic in graph-based high-dimensional data analysis and manifold learning. In this talk, we first present a result on the spectral convergence of graph Laplacians to the Laplace-Beltrami operator. By analyzing Dirichlet form convergence and constructing candidate approximate eigenfunctions via convolution with the manifold heat kernel, we prove eigen-convergence with rates as $N$ increases and the kernel bandwidth $\epsilon$ decreases accordingly. When the data density is non-uniform on the manifold, we prove the same rates for the density-corrected graph Laplacian. The second result establishes the convergence rate of the bi-stochastically normalized graph Laplacian to the (weighted-)manifold Laplacian operator, and extends this result to the case where the manifold data are corrupted by outlier noise. Motivated by our analysis, we propose an approximate and constrained matrix scaling problem that can be solved by Sinkhorn iterations with early termination. Numerical experiments support the theory and show the robustness of the bi-stochastically normalized graph Laplacian to high-dimensional outlier noise. Joint work with Nan Wu and Boris Landa.