Lecture Series on "Frontier Theories and Methods in Statistics and Data Science" (No. 2)
Time: 10:00–11:00 a.m., Wednesday, October 21, 2020
Venue: Tencent Meeting (ID: 436 411 766)
Title: Progressive Principal Component Analysis for Compressing Deep Convolutional Neural Networks
Speaker: Jing Zhou, Associate Professor, Renmin University of China
Abstract:
In this work, we propose a progressive principal component analysis (PPCA) method for compressing deep convolutional neural networks. The method starts from a prespecified layer and progressively moves toward the final output layer. For each target layer, PPCA conducts kernel principal component analysis on the estimated kernel weights, which substantially reduces the number of kernels in that layer. Because the number of kernels in the current layer determines the number of input channels for the next layer, the channels used by the next layer are reduced as well; for convenience, we refer to this as the progressive effect. As a result, the entire model structure can be substantially compressed, with large reductions in both the number of parameters and the inference cost, while the prediction accuracy remains very competitive with that of the baseline model. The effectiveness of the proposed method is evaluated on several classical CNNs (AlexNet, VGGNet, ResNet, and MobileNet) and benchmark datasets, and the empirical findings are very encouraging. The code is available at https://github.com/zhoujing89/ppca.
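The per-layer compression step described above can be sketched roughly as follows. This is a minimal illustration using ordinary PCA on the flattened kernel weights of one convolutional layer, not the authors' released implementation (see the linked repository for that); the function name, the `var_ratio` parameter, and the choice to keep enough components to explain a target fraction of variance are all illustrative assumptions:

```python
import numpy as np

def compress_conv_kernels(weights, var_ratio=0.95):
    """Sketch: reduce the kernels of one conv layer via PCA.

    weights: array of shape (out_channels, in_channels, kH, kW).
    Returns (new_kernels, k), where k < out_channels kernels are kept.
    NOTE: an illustrative sketch of the idea, not the paper's PPCA code.
    """
    out_c = weights.shape[0]
    flat = weights.reshape(out_c, -1)          # one row per kernel
    centered = flat - flat.mean(axis=0)        # center before PCA
    # PCA via SVD: rows of Vt are principal directions in kernel space
    _, S, Vt = np.linalg.svd(centered, full_matrices=False)
    explained = S**2 / (S**2).sum()
    # smallest k whose cumulative explained variance reaches var_ratio
    k = int(np.searchsorted(np.cumsum(explained), var_ratio)) + 1
    # keep the top-k directions as the compressed layer's kernels
    new_kernels = Vt[:k].reshape(k, *weights.shape[1:])
    return new_kernels, k
```

Because the compressed layer now emits only `k` channels, the next layer's kernels would also shrink along their input-channel axis, which is the progressive effect the abstract describes.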
Biography:
Jing Zhou is an Associate Professor at the School of Statistics, Renmin University of China, and a research fellow at the Center for Applied Statistics. She received her Ph.D. from the Guanghua School of Management, Peking University. Her research interests include social networks, spatial econometrics, and model compression. She has published more than ten papers in leading domestic and international journals, including Journal of Business & Economic Statistics, Statistica Sinica, Science China Mathematics, Electronic Commerce Research and Applications, 管理科学 (Management Science China), and 营销科学学报 (Journal of Marketing Science). She authored the textbook 《深度学习:从入门到精通》 (Deep Learning: From Beginner to Expert) and has led several research projects at the provincial or ministerial level and above, including grants from the National Natural Science Foundation of China and the Beijing Social Science Foundation, and a key project of the National Bureau of Statistics. She serves on the editorial board of the Data Science and Statistics · Business Analytics textbook series of Posts & Telecom Press.