
Academic Seminar



Title: Averaged Method of Multipliers for Bi-Level Optimization without Lower-Level Strong Convexity

Speaker: Wei Yao, Research Assistant Professor, Department of Mathematics, Southern University of Science and Technology, and National Center for Applied Mathematics Shenzhen (host: Yannan Chen)

Time: March 10, 2023, 15:00-16:00

Venue: Room 401, East Building, School of Mathematical Sciences

About the speaker:

     Wei Yao received his Ph.D. from The Chinese University of Hong Kong and is a Research Assistant Professor in the Department of Mathematics at Southern University of Science and Technology and at the National Center for Applied Mathematics Shenzhen. His main research interests include algorithms and theoretical analysis for bilevel programming, together with their applications in machine learning and mechanism design. His representative papers have appeared in leading international journals in operations research/optimization and partial differential equations, including SIAM Journal on Optimization, Calculus of Variations and Partial Differential Equations, and Journal of Differential Equations.

Abstract:

      Gradient methods have become the mainstream technique for Bi-Level Optimization (BLO) in learning fields. The validity of existing works relies heavily on either a restrictive Lower-Level Strong Convexity (LLSC) condition or on solving a series of approximation subproblems with high accuracy, or both. In this work, by averaging the upper- and lower-level objectives, we propose a single-loop Bi-level Averaged Method of Multipliers (sl-BAMM) for BLO that is simple yet efficient for large-scale problems and dispenses with the restrictive LLSC condition. We further provide a non-asymptotic convergence analysis of sl-BAMM towards KKT stationary points; a comparative advantage of our analysis is that it does not require the strong gradient-boundedness assumption imposed by other works. Our theory therefore covers a wider variety of applications in deep learning, especially those where the upper-level objective is quadratic w.r.t. the lower-level variable. Experimental results demonstrate the superiority of our method.
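
To make the averaging idea in the abstract concrete, the minimal Python sketch below runs a single-loop scheme on a toy bilevel problem whose lower level is convex but not strongly convex, so the LLSC condition fails. Everything here is an illustrative assumption rather than the talk's content: the objectives F and f, the step sizes, the vanishing weight sigma_k, and the x-update, which uses a generic value-function penalty direction instead of the paper's actual sl-BAMM multiplier update.

```python
import numpy as np

# Toy bilevel problem (all modeling choices are illustrative assumptions):
#   upper level:  F(x, y) = 0.5*||y - x||^2 + 0.5*||x||^2
#   lower level:  f(x, y) = 0.5*(a @ y - x[0])^2
# The lower level is convex but NOT strongly convex in y (its Hessian
# a a^T is rank one), so LLSC fails and argmin_y f(x, .) is a hyperplane.

rng = np.random.default_rng(0)
n = 5
a = rng.normal(size=n)

def grads_F(x, y):
    """Partial gradients (w.r.t. x and y) of the upper-level objective."""
    return 2.0 * x - y, y - x

def grads_f(x, y):
    """Partial gradients (w.r.t. x and y) of the lower-level objective."""
    r = a @ y - x[0]
    gx = np.zeros_like(x)
    gx[0] = -r
    return gx, r * a

x = rng.normal(size=n)
y = rng.normal(size=n)
z = y.copy()              # auxiliary iterate tracking a lower-level minimizer
alpha, beta = 0.05, 0.1   # untuned step sizes (assumptions)

for k in range(5000):
    sigma = 1.0 / np.sqrt(k + 2.0)     # vanishing averaging weight sigma_k

    Fx, Fy = grads_F(x, y)
    fx_y, fy = grads_f(x, y)
    fx_z, fz = grads_f(x, z)

    # Single loop: each variable takes exactly one gradient step per iteration.
    y -= beta * (sigma * Fy + fy)      # descent on the averaged objective sigma*F + f
    z -= beta * fz                     # plain descent on the lower-level objective
    x -= alpha * (sigma * Fx + (fx_y - fx_z))  # penalty direction: grad_x of
                                               # sigma*F + [f(x,y) - f(x,z)]

print("x ->", np.round(x, 3))          # the toy optimum is x = 0
print("lower-level residual a@y - x[0] ->", float(a @ y - x[0]))
```

The auxiliary iterate z tracks a lower-level minimizer so that f(x, y) - f(x, z) approximates the lower-level suboptimality gap; driving that gap to zero enforces lower-level optimality without ever assuming the minimizer is unique.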


