MATH - Seminar on Data Science and Applied Math - A Generalized Neural Tangent Kernel Analysis for Two-layer Neural Networks
11:00am - 12:00pm
https://hkust.zoom.us/j/5616960008

A recent line of research on deep learning shows that the training of extremely wide neural networks can be characterized by a kernel function called the neural tangent kernel (NTK). However, it is known that this type of result does not perfectly match practice, as NTK-based analysis requires the network weights to stay very close to their initialization throughout training, and cannot handle regularizers or gradient noise. In this talk, I will present a generalized neural tangent kernel analysis and show that noisy gradient descent with weight decay can still exhibit a "kernel-like" behavior. This implies that the training loss converges linearly up to a certain accuracy. I will also discuss the generalization error of an infinitely wide two-layer neural network trained by noisy gradient descent with weight decay.
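For context, the standard NTK of a two-layer network can be sketched as follows (this is the usual textbook definition, not necessarily the generalized kernel presented in the talk):

```latex
% Two-layer network of width m with activation \sigma:
%   f_\theta(x) = \frac{1}{\sqrt{m}} \sum_{r=1}^{m} a_r \,\sigma(w_r^\top x)
% The neural tangent kernel is the inner product of parameter gradients:
%   \Theta(x, x') = \big\langle \nabla_\theta f_\theta(x),\ \nabla_\theta f_\theta(x') \big\rangle
% As m \to \infty (with suitable random initialization), \Theta stays
% approximately fixed during training, so gradient descent on f_\theta
% behaves like kernel regression with kernel \Theta.
f_\theta(x) = \frac{1}{\sqrt{m}} \sum_{r=1}^{m} a_r \,\sigma(w_r^\top x),
\qquad
\Theta(x, x') = \big\langle \nabla_\theta f_\theta(x),\ \nabla_\theta f_\theta(x') \big\rangle
```

The "kernel-like" behavior in the abstract refers to training dynamics that remain close to those of this fixed-kernel regime, even with weight decay and gradient noise.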

Speaker / Performer:
Dr. Yuan CAO
UCLA

Dr. Yuan CAO is a postdoctoral researcher in the Department of Computer Science at UCLA working with Professor Quanquan Gu. Before joining UCLA, he received his B.S. from Fudan University and Ph.D. from Princeton University. Yuan’s research interests include the theory of deep learning, non-convex optimization, high-dimensional graphical models and their applications in computational genomics.

Language
English
Intended audience
Alumni
Faculty and staff
Postgraduate students
Undergraduate students
Organizer
Department of Mathematics