Webinar Series: Mathematical Foundations of Data Science

Friday, April 9, 2021, 11:00 am ET

Global Convergence of EM?


Harrison Zhou (Yale University)


In this talk I will first discuss a recent joint work with Yihong Wu (https://arxiv.org/abs/1908.10935). We show that the randomly initialized EM algorithm for parameter estimation in symmetric two-component Gaussian mixtures converges to the MLE in at most $\sqrt{n}$ iterations with high probability. I will then discuss the limitations of that work and propose an extension to general Gaussian mixtures via overparameterization.
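For the symmetric two-component mixture $\tfrac12 N(\theta, 1) + \tfrac12 N(-\theta, 1)$ with unit variance, the EM update has the well-known closed form $\theta_{t+1} = \frac{1}{n}\sum_i x_i \tanh(\theta_t x_i)$. A minimal simulation sketch (the true parameter, sample size, and seed below are illustrative assumptions, not values from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate data from 0.5*N(theta*, 1) + 0.5*N(-theta*, 1).
# theta_star = 1.5 and n = 10_000 are illustrative choices.
theta_star = 1.5
n = 10_000
signs = rng.choice([-1.0, 1.0], size=n)
x = signs * theta_star + rng.standard_normal(n)

# Randomly initialized EM; for this model the M-step reduces to
# theta_{t+1} = mean(x * tanh(theta_t * x)).
theta = rng.standard_normal()
for _ in range(int(np.sqrt(n)) + 1):
    theta = np.mean(x * np.tanh(theta * x))

# The sign of theta is not identifiable, so compare |theta| to theta*.
print(abs(theta), abs(abs(theta) - theta_star))
```

Running for roughly $\sqrt{n}$ iterations, as in the result above, the iterate settles near $\pm\theta^*$ up to the statistical error of the sample.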


Harrison Zhou is the Henry Ford II Professor and Chair of the Department of Statistics and Data Science at Yale. His main research interests include asymptotic decision theory, large covariance matrix estimation, graphical models, Bayesian nonparametrics, statistical network analysis, sparse canonical correlation analysis and principal component analysis, and analysis of iterative algorithms. His research has been recognized with awards including the National Science Foundation CAREER Award, the Noether Young Scholar Award from the American Statistical Association, and the Tweedie Award, an IMS Medallion Lecture, and Fellowship from the Institute of Mathematical Statistics.


Georgia Institute of Technology
Northwestern University
Pennsylvania State University
Princeton University
University of Illinois at Urbana-Champaign
National Institute of Statistical Sciences
Harvard University
Two Sigma
ORAI China


Registration is free.


Online Webinar
Speaker: Harrison Zhou (Yale University)