Time: 10:00, October 20, 2022
Venue: Zoom Meeting ID: 88132259167, Password: 352453
Speaker: Prof. Guodong Li (李国栋)
Host: Assoc. Prof. Yun Shi (石芸)
Abstract:
Most tensor regression models currently used for high-dimensional data are based on the Tucker decomposition, which has good properties but quickly loses its efficiency in compressing tensors as the tensor order increases, say beyond four or five. However, even the simplest tensor autoregression for time series data already involves a coefficient tensor of order six. This paper revises a recently proposed tensor train (TT) decomposition and then applies it to tensor regression so that a clear statistical interpretation can be obtained. The new tensor regression matches data with hierarchical structures well, and it can even lead to a better interpretation for data with factorial structures, which are supposed to be better fitted by models based on the Tucker decomposition. More importantly, the new tensor regression can easily be applied to higher-order tensors, since the TT decomposition compresses coefficient tensors much more efficiently. The methodology is also extended to tensor autoregression for time series data, and nonasymptotic properties are derived for the ordinary least squares estimators of both tensor regression and autoregression. A new algorithm is introduced to search for the estimators, and its theoretical justification is discussed. Theoretical and computational properties of the proposed methodology are verified by simulation studies, and its advantages over existing methods are illustrated by two real examples.
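To illustrate the compression point in the abstract, the following is a minimal sketch in Python (not from the talk or the paper) that compares the number of free parameters in a Tucker decomposition and a TT decomposition of a d-way tensor, under the simplifying assumptions that all mode sizes equal p and all (multilinear or TT) ranks equal r; the function names tucker_params and tt_params are hypothetical.

    def tucker_params(d, p, r):
        # Tucker: an r x ... x r core with r**d entries plus d factor matrices of size p x r.
        return r**d + d * p * r

    def tt_params(d, p, r):
        # TT: two boundary cores of size p x r and (d - 2) interior cores of size r x p x r.
        if d == 1:
            return p
        return 2 * p * r + (d - 2) * p * r * r

    if __name__ == "__main__":
        for d in (3, 4, 6, 8):
            print(d, tucker_params(d, p=10, r=5), tt_params(d, p=10, r=5))

For example, with d = 6, p = 10, and r = 5, the Tucker core alone already has 5^6 = 15625 entries, while the TT representation needs only 2*50 + 4*250 = 1100 parameters; the Tucker count grows exponentially in the order d, whereas the TT count grows only linearly, which is the sense in which TT compresses high-order coefficient tensors more efficiently.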
About the speaker:
Guodong Li received his bachelor's and master's degrees from the School of Mathematical Sciences at Peking University and his PhD in Statistics from the Department of Statistics and Actuarial Science at the University of Hong Kong in 2007, after which he served as an assistant professor at Nanyang Technological University. He is currently a professor in the Department of Statistics and Actuarial Science at the University of Hong Kong. His main research interests include time series analysis, quantile regression, high-dimensional statistical data analysis, and machine learning. Prof. Li has published more than 40 academic papers, over 10 of which appeared in the four top journals in statistics and in top machine learning conferences.