
[Seminar] A Mini-Batch Proximal Stochastic Recursive Gradient Algorithm with Diagonal Barzilai–Borwein Stepsize

Published: 2023-09-07


Speaker: 于腾腾 (Beijing University of Agriculture)

Time: 15:15–16:00, Saturday, September 9, 2023


Venue: Room E405, Main Building, Shahe Campus


Abstract: Many machine learning problems can be formulated as minimizing the sum of a smooth function and a non-smooth regularization term. Proximal stochastic gradient methods are popular for solving such composite optimization problems. We propose a mini-batch proximal stochastic recursive gradient algorithm, SRG-DBB, which incorporates the diagonal Barzilai–Borwein (DBB) stepsize strategy to capture the local geometry of the problem. The linear convergence and complexity of SRG-DBB are analyzed for strongly convex functions. We further establish the linear convergence of SRG-DBB under a non-strong convexity condition. Moreover, it is proved that SRG-DBB converges sublinearly in the convex case. Numerical experiments on standard data sets indicate that the performance of SRG-DBB is better than or comparable to that of the proximal stochastic recursive gradient algorithm with best-tuned scalar or BB stepsizes. Furthermore, SRG-DBB is superior to some advanced mini-batch proximal stochastic gradient methods.
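The announcement itself contains no code, but the abstract names the algorithmic ingredients: a recursive (SARAH-type) stochastic gradient estimator, a proximal step for the non-smooth regularizer, and a diagonal Barzilai–Borwein stepsize. The sketch below shows, for an L1-regularized logistic-regression instance of the composite problem min_x f(x) + r(x), how such ingredients could fit together. It is a minimal illustration only: the function names, the specific diagonal BB rule (per-coordinate step ≈ s_i / y_i with clipping), the safeguards, and all hyperparameters are assumptions for exposition, not the speaker's SRG-DBB implementation.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1, applied coordinate-wise (tau may be a vector)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def logistic_grad(w, X, y):
    """Gradient of the averaged logistic loss over the samples (X, y), labels y in {-1, +1}."""
    margins = np.clip(y * (X @ w), -30.0, 30.0)      # clip to avoid overflow in exp
    coef = -y / (1.0 + np.exp(margins))
    return X.T @ coef / X.shape[0]

def srg_dbb_sketch(X, y, lam=1e-3, epochs=20, batch=64, seed=0):
    """Illustrative SARAH-type proximal method with a per-coordinate (diagonal) BB stepsize."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    step = np.full(d, 0.1)                            # initial diagonal stepsize (assumed)
    w_ref, g_ref = w.copy(), logistic_grad(w, X, y)   # iterate/gradient from previous epoch
    for _ in range(epochs):
        g_full = logistic_grad(w, X, y)               # full gradient at the epoch start
        # Diagonal BB stepsize from epoch-to-epoch differences of iterates and gradients:
        # step_i ~ s_i / y_i with clipping safeguards (one possible DBB rule, assumed here).
        s, yk = w - w_ref, g_full - g_ref
        ok = np.abs(yk) > 1e-10
        step[ok] = np.clip(s[ok] / yk[ok], 1e-4, 10.0)
        w_ref, g_ref = w.copy(), g_full.copy()

        v, w_old = g_full, w.copy()                   # recursive (SARAH-type) gradient estimator
        for _ in range(max(n // batch, 1)):
            idx = rng.choice(n, size=min(batch, n), replace=False)
            v = logistic_grad(w, X[idx], y[idx]) - logistic_grad(w_old, X[idx], y[idx]) + v
            w_old = w.copy()
            w = soft_threshold(w - step * v, step * lam)   # proximal step with diagonal stepsize
    return w
```

The per-coordinate stepsize is the point of the DBB strategy: coordinates along which the loss is flat receive larger steps than sharply curved ones, while the recursive estimator keeps the variance of the stochastic gradient low within each epoch.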


Speaker bio: 于腾腾 received his master's degree from Xidian University in 2016 under the supervision of Prof. 刘三阳, and his Ph.D. from Hebei University of Technology in 2021 under the supervision of Prof. 刘新为. From October 2021 to August 2023 he was a postdoctoral researcher at the Academy of Mathematics and Systems Science, Chinese Academy of Sciences, working with Academician 袁亚湘. His research focuses on stochastic gradient algorithms for large-scale machine learning, with results published in journals such as IEEE Transactions on Neural Networks and Learning Systems, Journal of Scientific Computing, and the Journal of the Operations Research Society of China.


Invited by: 崔春风
