Ph.D. Defense: Deyi Liu

10 Jun @ 10:00 am - 12:00 pm

Ph.D. Thesis Defense

Public Presentation
Friday, June 10, 2022, 10:00 AM

Location: Zoom

Deyi Liu

Advised by Quoc Tran-Dinh

 Efficient and Provable Algorithms for Convex Optimization Problems Beyond Lipschitz Continuous Gradients

We aim to develop efficient algorithms for solving complex and constrained convex optimization problems with provable convergence guarantees. Unlike existing methods, which rely heavily on the Lipschitz continuity of the gradient of the objective function, we instead exploit the notion of self-concordance introduced by Nesterov and Nemirovskii in the 1990s. This concept has been used extensively in interior-point methods and has more recently been exploited in other optimization schemes. In addition, self-concordant functions cover many new and prominent applications in statistics and machine learning, such as inverse covariance-type estimation, regularized logistic regression, portfolio optimization, and optimal experimental design. We are also interested in nonsmooth convex composite problems, which arise widely in machine learning and image processing. In this setting, we utilize Nesterov's smoothing technique to handle the nonsmooth part of the objective function. Our approach is to go beyond the Lipschitz-gradient structure and develop novel, efficient algorithms for a range of convex problems with polynomial-time iteration complexity.
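
For readers unfamiliar with the term, the standard definition of self-concordance due to Nesterov and Nemirovskii (quoted here only as background, not taken from the thesis) is the following: a convex, three-times differentiable function f is (standard) self-concordant if, for every point x in its domain and every direction v, the univariate restriction \(\varphi(t) = f(x + t v)\) satisfies

    \[ |\varphi'''(t)| \;\le\; 2\,\varphi''(t)^{3/2} . \]

Such functions (for example, \(-\log t\) or \(-\log\det X\)) need not have Lipschitz continuous gradients, which is exactly the regime targeted here.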

In the first work, a new Newton Frank-Wolfe algorithm is developed for a class of constrained self-concordant minimization problems. The algorithm combines the Newton and Frank-Wolfe methods to exploit the linear minimization oracle (LMO) of the feasible set. We theoretically prove that the number of LMO calls of our method is nearly the same as that of the standard Frank-Wolfe method in the L-smooth case.
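
To make the role of the LMO concrete, the sketch below shows a plain, gradient-based Frank-Wolfe loop over the unit simplex in Python. It illustrates only the oracle model (one LMO call per iteration); it is not the thesis's Newton Frank-Wolfe algorithm, which queries the LMO with second-order (Newton) model information, and the objective, step size, and dimensions are arbitrary choices for the example.

    # Generic Frank-Wolfe over the unit simplex, illustrating the LMO.
    # Not the Newton Frank-Wolfe method of the thesis.
    import numpy as np

    def lmo_simplex(grad):
        """LMO for the unit simplex: the minimizer of <grad, s> is a vertex."""
        s = np.zeros_like(grad)
        s[np.argmin(grad)] = 1.0
        return s

    def frank_wolfe(grad_f, x0, n_iters=100):
        x = x0.copy()
        for k in range(n_iters):
            g = grad_f(x)                # first-order information at x
            s = lmo_simplex(g)           # one LMO call per iteration
            gamma = 2.0 / (k + 2.0)      # classical open-loop step size
            x = (1 - gamma) * x + gamma * s
        return x

    # Toy usage: minimize 0.5 * ||A x - b||^2 over the unit simplex.
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
    x_hat = frank_wolfe(lambda x: A.T @ (A @ x - b), np.full(5, 0.2))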

The second work proposes a new inexact interior-point Lagrangian decomposition method to solve a general constrained composite minimization problem. Our method allows one to solve the primal subproblems (the slave problems) only approximately, which leads to inexact oracles (i.e., inexact function values, gradients, and Hessians) of the smoothed dual problem (the master problem). By appropriately controlling the inexactness in both the slave and master problems, we establish a polynomial-time iteration complexity for our algorithm and recover primal solutions.
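
As background on the slave/master terminology, the generic (unsmoothed) Lagrangian decomposition of a block-separable problem \(\min\{\sum_i f_i(x_i) : \sum_i A_i x_i = b\}\) is shown below; this is the textbook structure only, and does not capture the thesis's smoothed dual or its inexactness control.

    \[
      d(y) \;=\; \min_{x_1,\dots,x_m} \Big\{ \sum_{i=1}^{m} f_i(x_i)
                 + \Big\langle y,\; \sum_{i=1}^{m} A_i x_i - b \Big\rangle \Big\}
           \;=\; \sum_{i=1}^{m} \min_{x_i} \big\{ f_i(x_i) + \langle y, A_i x_i \rangle \big\}
                 \;-\; \langle y, b \rangle .
    \]

Each inner minimization is a slave problem that can be solved (possibly inexactly) independently of the others, while the master problem maximizes d(y) over the multipliers y.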

Finally, we combine the accelerated stochastic gradient method and Nesterov's smoothing technique to solve a stochastic nonsmooth convex composite problem. Unlike existing works, which rely on unbiased estimators of subgradients of the nonsmooth term, we instead exploit its proximal operator via smoothing techniques to obtain a better convergence rate. We further adapt the variance-reduction technique to the proposed smoothing method in the finite-sum case and achieve the best-known complexity.
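
For context, Nesterov's smoothing in its standard form (included here as background; the thesis builds on this idea rather than being defined by this display) replaces a nonsmooth term with max-structure by a smooth surrogate whose gradient is Lipschitz continuous:

    \[
      g(x) = \max_{u \in \mathcal{U}} \big\{ \langle A x, u \rangle - \phi(u) \big\}
      \quad\longrightarrow\quad
      g_{\mu}(x) = \max_{u \in \mathcal{U}} \big\{ \langle A x, u \rangle - \phi(u)
                    - \tfrac{\mu}{2}\|u\|^2 \big\},
    \]

with Lipschitz constant \(\|A\|^2/\mu\) for \(\nabla g_{\mu}\). A familiar special case: smoothing \(|t| = \max_{|u|\le 1} t\,u\) this way yields the Huber function, equal to \(t^2/(2\mu)\) for \(|t| \le \mu\) and \(|t| - \mu/2\) otherwise.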

