Time: July 17, 2024, 15:30-16:30
Venue: Room A1514, Science Building, Putuo Campus
Speaker: Xiao Wang, Professor, Purdue University, USA
Host: 谌自奇, Research Professor, East China Normal University
Abstract:
Bayesian neural networks are a powerful tool for characterizing predictive uncertainty, but they face two challenges in practice. First, it is difficult to define meaningful priors for the weights of the network. Second, conventional computational strategies become impractical for large and complex applications. In this paper, we adopt a class of implicit generative priors and propose a novel neural adaptive empirical Bayes framework for Bayesian modeling and inference. These priors are derived through a nonlinear transformation of a known low-dimensional distribution, allowing us to handle complex data distributions and capture the underlying manifold structure effectively. Our framework combines variational inference with a gradient ascent algorithm, which serves to select the hyperparameters and approximate the posterior distribution. Theoretical justification is established through both posterior and classification consistency. We demonstrate the practical applications of our framework through extensive examples, including the two-spiral problem, regression, and 10 UCI datasets, as well as MNIST and CIFAR-10 image classification. The results of our experiments highlight the superiority of the proposed framework over existing methods, such as sparse variational Bayes and generative models, in terms of prediction accuracy and uncertainty quantification.
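The abstract gives no implementation details, so the following is only a minimal, self-contained sketch of the general idea it describes: weights of a Bayesian neural network drawn from an implicit generative prior (a generator pushing a low-dimensional latent through a nonlinear map), with a Gaussian variational posterior over the latent fitted by gradient ascent on the ELBO. The toy 1-D regression data, the generator architecture `G`, the latent dimension, and the noise level are all hypothetical choices for illustration, and jointly optimizing the generator and the variational posterior is a simplification of the empirical Bayes scheme, not the authors' implementation.

```python
# Sketch: implicit generative prior for BNN weights + variational inference.
# All architectures and constants are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy 1-D regression data (stand-in for the examples mentioned in the abstract).
x = torch.linspace(-3, 3, 200).unsqueeze(1)
y = torch.sin(x) + 0.1 * torch.randn_like(x)

latent_dim = 8                                      # low-dimensional latent z
hidden = 20                                         # width of the target regression net
n_weights = 1 * hidden + hidden + hidden * 1 + 1    # weights of a 1-hidden-layer net

# Implicit prior: weights = G(z), z ~ N(0, I); G's parameters act as hyperparameters.
G = nn.Sequential(nn.Linear(latent_dim, 64), nn.Tanh(), nn.Linear(64, n_weights))

# Gaussian variational posterior q(z) = N(mu, diag(exp(log_std)^2)).
mu = torch.zeros(latent_dim, requires_grad=True)
log_std = torch.zeros(latent_dim, requires_grad=True)

def bnn_forward(x, w):
    """Evaluate the 1-hidden-layer regression net given flattened weights w."""
    w1 = w[:hidden].view(1, hidden)
    b1 = w[hidden:2 * hidden]
    w2 = w[2 * hidden:3 * hidden].view(hidden, 1)
    b2 = w[3 * hidden:]
    return torch.tanh(x @ w1 + b1) @ w2 + b2

opt = torch.optim.Adam([mu, log_std, *G.parameters()], lr=1e-2)
for step in range(2000):
    opt.zero_grad()
    eps = torch.randn(latent_dim)
    z = mu + torch.exp(log_std) * eps               # reparameterized sample from q(z)
    w = G(z).squeeze()                              # implicit prior: push z through G
    pred = bnn_forward(x, w)
    log_lik = -0.5 * ((y - pred) ** 2 / 0.01).sum() # Gaussian likelihood, sigma^2 = 0.01
    # Closed-form KL(q(z) || N(0, I)) for a diagonal Gaussian posterior.
    kl = 0.5 * (torch.exp(2 * log_std) + mu ** 2 - 1 - 2 * log_std).sum()
    loss = -(log_lik - kl)                          # ascent on the ELBO = descent on loss
    loss.backward()
    opt.step()
```

After training, predictive uncertainty can be explored by repeatedly sampling z from q(z), mapping each sample through G, and evaluating the resulting regression networks on new inputs.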
Speaker Bio:
Dr. Xiao Wang is Head of the Department of Statistics and the J.O. Berger and M.E. Bock Professor of Statistics at Purdue University. He received his Ph.D. from the University of Michigan, and his research centers on nonparametric statistics, functional data analysis, and machine learning, with particular emphasis on developing methods for high-dimensional and complex data. Dr. Wang is a Fellow of the Institute of Mathematical Statistics (IMS) and a Fellow of the American Statistical Association (ASA). He currently serves as an associate editor for JASA, Technometrics, and Lifetime Data Analysis.