Speaker: Prof. Ren-Cang Li (李仁仓教授)

Time: 12:10-13:10, 17 November 2022 (Thursday) (Beijing time)

Venue: A103, Lijiao Building

Tencent Meeting ID: 546-820-158


Abstract

One of the guiding principles in designing machine learning models is to highlight one characteristic of interest in the data while diminishing others, or equivalently, to pit one characteristic contrastively against the others. In dimensionality reduction and subspace learning, traces of matrices of interest are ubiquitous ingredients of the optimization models built so that hidden characteristics of the original high-dimensional space are exposed in the reduced space. In this talk, we will begin by surveying several existing optimization models on the Stiefel manifold for dimensionality reduction and subspace learning, and then focus on one of our recent works concerning a multiview contrastive learning model in the form of a trace ratio optimization problem over a product of Stiefel manifolds. It broadly includes Fisher's LDA and a recent orthogonal CCA (OCCA) as special cases. We will explain how to solve the model efficiently by an NEPv (nonlinear eigenvalue problem with eigenvector dependency) approach via the self-consistent-field (SCF) iteration, and then showcase it on real-world multiview data sets.
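As illustrative context (a generic sketch, not necessarily the exact model of the talk), a single-view trace ratio problem over a Stiefel manifold can be written as

\max_{X \in \mathbb{R}^{n \times k},\ X^{\top}X = I_k} \ \frac{\operatorname{tr}(X^{\top} A X)}{\operatorname{tr}(X^{\top} B X)},

where A and B are symmetric with B positive definite, and the constraint X^{\top}X = I_k restricts X to the Stiefel manifold. Taking A and B to be the between-class and within-class scatter matrices recovers the trace-ratio form of Fisher's LDA; the multiview model discussed in the talk instead optimizes over a product of Stiefel manifolds, one orthonormal factor per view.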


About Prof. Li

Ren-Cang Li is a Chair Professor of Mathematics (09/2021-present) at HKBU and a Professor of Mathematics at the University of Texas at Arlington (07/2006-present). He received his BS in Computational Mathematics from Xiamen University in 1985, his MS in Computational Mathematics from the Chinese Academy of Sciences in 1988, and his Ph.D. in Applied Mathematics from UC Berkeley in 1995. His research interests include floating-point support for scientific computing, numerical linear algebra, reduced-order modeling, and unconventional schemes for ordinary differential equations, optimization, and machine learning. His secular equation solvers sit at the kernels of MATLAB's eig and svd and of LAPACK's eigenvalue/singular value routines, which are in use around the clock. He has served, or is serving, as editor, associate editor, or co-Editor-in-Chief on the editorial boards of several international journals.

He was awarded the Householder Fellowship in Scientific Computing by ORNL in 1995, the Friedman Memorial Prize in Applied Mathematics from UC Berkeley in 1996, an NSF CAREER Award in 1999, several distinguished paper awards from the International Consortium of Chinese Mathematicians (ICCM), and the 2019 SAS Data Mining Best Paper Award from INFORMS.