太阳成集团tyc7111cc Academic Seminar
Gradient Descent with Random Initialization for Symmetric Tensor Decomposition
Dr. Haixia Liu
(Huazhong University of Science and Technology)
Time: Tuesday, June 1, 2021, 2:00-3:00 PM
Venue: Shahe E404 (in person); Tencent Meeting ID: 127 416 879 (online)
Abstract: Symmetric tensor decomposition is of great importance in applications. Several studies have employed a greedy approach for tensors of order m > 2: one first finds a best rank-one approximation of the given tensor, subtracts the corresponding rank-one component, and repeats the process on the residual. In this talk we focus on finding a best rank-one approximation of an order-3 symmetric tensor, which is formulated as a nonconvex optimization model. First, we give a geometric landscape analysis of the nonconvex objective function. In particular, we show that any local minimizer must be a factor in the low-rank decomposition, and that all other critical points are linear combinations of the factors. Then, starting from a random initialization, we solve the nonconvex model with the gradient descent algorithm and prove that the iterates must converge to one factor of the CP decomposition. Combined with the landscape result, this shows that the greedy algorithm, with randomly initialized gradient descent, recovers the CP low-rank decomposition of a symmetric tensor. Numerical results are consistent with the theoretical analysis.
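To make the greedy procedure concrete, here is a minimal NumPy sketch under simplifying assumptions: the rank-one subproblem is written as min_x f(x) = ||T - x⊗x⊗x||_F^2 (scale absorbed into x), whose gradient for symmetric T is ∇f(x) = -6 T(·,x,x) + 6||x||^4 x, and it is solved by plain gradient descent from a random start, followed by deflation. The function names, step size, and iteration count below are illustrative choices, not the speaker's implementation.

```python
import numpy as np

def t_dot_xx(T, x):
    # Two-mode contraction of a symmetric order-3 tensor:
    # (T(., x, x))_i = sum_{j,k} T_ijk x_j x_k
    return np.einsum('ijk,j,k->i', T, x, x)

def best_rank_one(T, steps=5000, lr=0.01, rng=None):
    # Gradient descent from a random initialization for
    #   min_x f(x) = || T - x⊗x⊗x ||_F^2   (scale absorbed into x),
    # using the gradient for symmetric T:
    #   ∇f(x) = -6 T(., x, x) + 6 ||x||^4 x
    rng = np.random.default_rng() if rng is None else rng
    x = rng.standard_normal(T.shape[0])
    x /= np.linalg.norm(x)  # random point on the unit sphere
    for _ in range(steps):
        grad = -6.0 * t_dot_xx(T, x) + 6.0 * np.dot(x, x) ** 2 * x
        x -= lr * grad
    return x

def greedy_cp(T, rank, **gd_kwargs):
    # Greedy deflation: find a rank-one term, subtract it, repeat on the residual.
    factors, residual = [], T.copy()
    for _ in range(rank):
        x = best_rank_one(residual, **gd_kwargs)
        factors.append(x)
        residual = residual - np.einsum('i,j,k->ijk', x, x, x)
    return factors

if __name__ == '__main__':
    # Toy check (hypothetical data): T = sum_r a_r⊗a_r⊗a_r with orthonormal
    # factors a_r, the benign setting in which each a_r is a critical point
    # of the rank-one subproblem.
    rng = np.random.default_rng(0)
    A, _ = np.linalg.qr(rng.standard_normal((8, 3)))  # 8-dim, rank 3
    T = np.einsum('ir,jr,kr->ijk', A, A, A)
    factors = greedy_cp(T, rank=3, rng=rng)
    approx = sum(np.einsum('i,j,k->ijk', x, x, x) for x in factors)
    print('residual norm:', np.linalg.norm(T - approx))
```

The toy check mirrors the setting of the talk: each recovered vector should align with one column of A (up to ordering), so the final residual norm should be near zero when gradient descent converges.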
Speaker Bio: Haixia Liu received her Ph.D. from The Chinese University of Hong Kong, supervised by Prof. Raymond Chan. Before joining HUST, she worked as a postdoctoral researcher at The Hong Kong University of Science and Technology, mentored by Prof. Yang Wang. Her research interests focus on algorithm design and theoretical analysis for data science problems and their applications.
Host: 谢家新