
October 24: Yazhen Wang (王亚珍) | Analysis of Stochastic Gradient Descent

Title: Analysis of Stochastic Gradient Descent

Speaker: Prof. Yazhen Wang (王亚珍)

Host: Prof. Riquan Zhang (张日权)

Time: October 24, 15:00-16:00

Venue: Room A302, Science Building, Zhongbei Campus

About the speaker: Yazhen Wang is a Professor of Statistics and Chair of the Department of Statistics at the University of Wisconsin-Madison. He graduated from East China Normal University in 1987, where he earned his bachelor's and master's degrees, and received his Ph.D. from the University of California, Berkeley in 1992. His research interests include financial statistics, ultra-high-dimensional statistical inference, wavelet analysis, long-memory processes, and statistical inference for constrained processes. He is a member of the Institute of Mathematical Statistics, a Fellow of the American Statistical Association, and Editor-in-Chief of Stat. Interface, and has previously served as an associate editor of JASA, Ann. Statist., Ann. Appl. Stat., J. Bus. Econom. Statist., Statist. Sinica, Econom. J., and other journals.

Abstract: Gradient descent algorithms such as accelerated gradient descent and stochastic gradient descent are widely employed to solve optimization problems in statistics and machine learning. This talk will present a new asymptotic analysis of these algorithms via continuous-time ordinary or stochastic differential equations. I will illustrate that the analysis provides a novel unified framework for a joint computational and statistical asymptotic analysis of the dynamic behavior of these algorithms as the number of iterations grows, together with the large-sample behavior of the optimization solutions (i.e., statistical decision rules such as estimators and classifiers) that the algorithms are applied to compute. I will also discuss the implications of the analysis for deep learning.
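For orientation, the following is a standard illustration of how discrete gradient methods are matched with continuous-time limits in this line of work; it is not taken from the talk, and the speaker's exact formulation may differ. With objective $f$, learning rate $\eta$, and gradient-noise covariance $\Sigma(x)$, gradient descent $x_{k+1} = x_k - \eta \nabla f(x_k)$ is commonly approximated by the gradient-flow ODE, and stochastic gradient descent by a diffusion-type SDE:

\[
\dot{X}(t) = -\nabla f\big(X(t)\big),
\qquad
dX_t = -\nabla f(X_t)\,dt + \sqrt{\eta}\,\Sigma(X_t)^{1/2}\,dW_t ,
\]

where $W_t$ is a standard Brownian motion. Nesterov-type accelerated gradient descent is likewise commonly associated with a second-order ODE of the form $\ddot{X}(t) + \tfrac{3}{t}\dot{X}(t) + \nabla f\big(X(t)\big) = 0$.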


Published by: Wang Luyao (王璐瑶) | Date: 2018-10-09