Abstract
Modern machine learning constantly presents puzzling empirical properties that surprise classical statistical theory. Learning with overparametrized models has become the norm in data-analytic applications, and the classical tension between memorization and generalization rarely bothers practitioners.
In this talk, I will discuss the training of overparametrized neural networks from both the neural tangent kernel and the mean-field perspectives, which guarantee global convergence despite the non-convexity of the optimization landscape. I will also discuss further interesting phenomena arising in a series of overparametrized statistical problems.
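For context (an aside, not part of the speaker's abstract): the neural tangent kernel viewpoint rests on a first-order expansion of the network around its initialization. A minimal sketch of the standard formulation, with $f(x;\theta)$ denoting the network output at input $x$ with parameters $\theta$:

$$ f(x;\theta) \approx f(x;\theta_0) + \langle \nabla_\theta f(x;\theta_0),\, \theta - \theta_0 \rangle, \qquad K(x,x') = \langle \nabla_\theta f(x;\theta_0),\, \nabla_\theta f(x';\theta_0) \rangle. $$

In the infinite-width limit the kernel $K$ stays essentially fixed during training, so gradient descent on a squared loss behaves like kernel regression, a convex problem; this is the standard route to global convergence guarantees despite the non-convex parameter landscape.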
About the speaker
Dr. Pengkun Yang is currently an assistant professor at the Center for Statistical Science, Tsinghua University. He received his PhD in Electrical and Computer Engineering from the University of Illinois at Urbana-Champaign and was a postdoctoral fellow at Princeton University. His research areas include theory and algorithms for high-dimensional statistics, mathematical data science, statistical machine learning, and optimization. He has published several papers in top statistics journals and machine learning conferences.