Notices and Announcements


2019 BETVLCTOR伟德官方网站 Invited Expert Lecture Series, No. 14

Posted: 2019-05-14 16:43:26    Source: BETVLCTOR伟德官方网站

Talk Title 1: Stochastic Gradient Methods (I)

Speaker: Dr. Hongchao Zhang (张洪超)

Time: Monday, May 20, 2019, 15:00–17:00

Venue: BETVLCTOR伟德官方网站 Academic Lecture Hall

Abstract: We consider the convex composite optimization problem whose objective function is the sum of a Lipschitz continuously differentiable but possibly nonconvex function and a simple convex but possibly nonsmooth function. Since the objective function cannot be computed with high accuracy, it is in general difficult to apply standard deterministic optimization algorithms to this problem. In this talk, we introduce gradient and optimal gradient methods for solving it, and discuss the convergence of these methods as well as their convergence complexities.
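To make the composite model concrete, a basic proximal gradient step for the smooth-plus-simple-nonsmooth structure can be sketched as below. This is an illustrative sketch only, not the speaker's method: the L1-regularized least-squares instance, step size, and iteration count are all assumptions.

```python
import numpy as np

def prox_l1(x, t):
    # Proximal operator of t*||.||_1 (soft-thresholding),
    # the "simple convex but possibly nonsmooth" part.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(A, b, lam, step, iters=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with a fixed step
    # size step <= 1/L, where L = ||A^T A||_2 is the Lipschitz
    # constant of the smooth part's gradient.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)   # gradient of the smooth part
        x = prox_l1(x - step * grad, step * lam)
    return x
```

Each iteration takes one gradient step on the smooth term and then applies the closed-form proximal map of the nonsmooth term, which is what makes the nonsmooth part "simple".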

 

Talk Title 2: Stochastic Gradient Methods (II)

Speaker: Dr. Hongchao Zhang (张洪超)

Time: Wednesday, May 22, 2019, 15:00–17:00

Venue: BETVLCTOR伟德官方网站 Academic Lecture Hall

Abstract: In this talk, we introduce the stochastic approximation (SA) method for solving the convex composite optimization problem whose objective function is the sum of a Lipschitz continuously differentiable but possibly nonconvex function and a simple convex but possibly nonsmooth function. The SA method assumes access only to stochastic gradients rather than exact gradients of the function. We study the SA method for this problem in both the convex and nonconvex cases, and discuss the convergence of these methods as well as their convergence complexities.
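The SA setting can be sketched as a proximal step driven by a noisy gradient oracle. This is a toy illustration under assumed ingredients (the oracle, the diminishing step sizes, and the L1 term are not from the talk):

```python
import numpy as np

def prox_l1(x, t):
    # Soft-thresholding: proximal map of t*||.||_1
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def stochastic_prox_gradient(grad_oracle, x0, lam, steps):
    # SA-type method: each iteration sees only a stochastic gradient
    # G(x, xi) with E[G(x, xi)] = grad f(x), then applies the proximal
    # step for the simple nonsmooth term lam*||.||_1.
    x = np.asarray(x0, dtype=float)
    for alpha in steps:
        g = grad_oracle(x)          # noisy gradient sample
        x = prox_l1(x - alpha * g, alpha * lam)
    return x
```

The only change from the deterministic method is that the exact gradient is replaced by an unbiased sample, which is why step sizes must typically shrink over the iterations.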

 

Talk Title 3: Stochastic Gradient Methods (III)

Speaker: Dr. Hongchao Zhang (张洪超)

Time: Thursday, May 23, 2019, 15:00–17:00

Venue: BETVLCTOR伟德官方网站 Academic Lecture Hall

Abstract: In this talk, we introduce the sample average approximation (SAA) method for solving the convex composite optimization problem whose objective function is the sum of a Lipschitz continuously differentiable but possibly nonconvex function and a simple convex but possibly nonsmooth function. The SAA method generates a random sample and approximates the original function by its sample average. We study the SAA method for this problem in both the convex and nonconvex cases, and discuss the convergence of these methods as well as their convergence complexities.
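The SAA idea, in contrast to SA, draws the sample once and then solves a deterministic approximation. A minimal scalar sketch, with an assumed toy objective E[0.5*(x - xi)^2] + lam*|x| (not from the talk):

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal map of t*|.|
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def saa_prox_gradient(samples, lam, step=0.5, iters=200):
    # SAA: replace the expectation E[0.5*(x - xi)^2] by its sample
    # average over one fixed batch of draws, then solve the resulting
    # deterministic composite problem with proximal gradient steps.
    x = 0.0
    for _ in range(iters):
        grad = np.mean(x - samples)   # gradient of the sample average
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

Because the sample is frozen, any deterministic solver applies; the statistical error then comes entirely from how well the sample average approximates the expectation.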

 

Talk Title 4: A Nonmonotone Smoothing Newton Algorithm for Weighted Complementarity Problems

Speaker: Dr. Hongchao Zhang (张洪超)

Time: Monday, May 27, 2019, 15:00–17:00

Venue: BETVLCTOR伟德官方网站 Academic Lecture Hall

Abstract: The weighted complementarity problem (WCP) significantly extends the general complementarity problem and can be used to model a larger class of problems from science and engineering. By introducing a one-parametric class of smoothing functions that includes the weight vector, we propose a smoothing Newton algorithm with a nonmonotone line search to solve the WCP. We prove that any accumulation point of the iteration sequence is a solution of the WCP. Moreover, when the solution set of the WCP is nonempty, we prove that our algorithm has local superlinear/quadratic convergence under assumptions weaker than the Jacobian nonsingularity assumption. Under these weak assumptions, we further show that the iteration sequence is bounded and converges to a solution of the WCP superlinearly or quadratically. Promising numerical results are also reported.
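A heavily simplified scalar sketch of the smoothing idea follows. It uses one smoothing function of weighted Fischer–Burmeister type, not the speaker's one-parametric class; the affine map s = x + q, the starting point, and the mu-update are assumptions, and the nonmonotone line search is omitted.

```python
import numpy as np

def smoothing_newton_wcp(q, w, mu=1.0, tol=1e-10, max_iter=100):
    # Scalar sketch of a smoothing Newton method for the weighted
    # complementarity problem: find x >= 0, s = x + q >= 0 with x*s = w.
    # We use the smoothed residual
    #   phi(x; mu) = x + s - sqrt((x - s)^2 + 4*w + 4*mu^2),
    # which at mu = 0 vanishes exactly when x, s >= 0 and x*s = w,
    # and we drive mu -> 0 while taking Newton steps on phi.
    x = 1.0
    for _ in range(max_iter):
        s = x + q
        root = np.sqrt((x - s) ** 2 + 4 * w + 4 * mu ** 2)
        phi = x + s - root
        if abs(phi) < tol and mu < tol:
            break
        # Since ds/dx = 1, the term (x - s) = -q is constant in x,
        # so dphi/dx = 2 for this toy affine map.
        dphi = 2.0
        x = x - phi / dphi
        mu *= 0.5          # shrink the smoothing parameter
    return x, x + q
```

Driving the smoothing parameter to zero recovers the complementarity condition x*s = w in the limit; the full algorithm couples this with a globalizing nonmonotone line search.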
 

Speaker bio: Hongchao Zhang (张洪超) is an associate professor and doctoral advisor in the Department of Mathematics and the Center for Computation & Technology at Louisiana State University, USA. His research interests include nonlinear optimization theory and applications, sparse matrices, inverse problems in medical imaging, and derivative-free optimization. In recent years he has led several projects funded by the U.S. National Science Foundation. He has published more than 30 papers in top journals in optimization and computation, including Mathematical Programming, SIAM Journal on Optimization, SIAM Journal on Numerical Analysis, SIAM Journal on Scientific Computing, SIAM Journal on Imaging, and SIAM Journal on Control and Optimization. He serves on the editorial boards of international journals including Computational Optimization and Applications (SCI Q2), Optimization Letters (SCI Q3), and Numerical Algebra, Control and Optimization.

 

All faculty and students are welcome to attend.