Colloquium: Tianjiao Li (Georgia Tech)
24 Jan @ 3:30 pm - 4:30 pm
Title: Universal Parameter-Free Methods for Convex, Nonconvex, and Stochastic Optimization
Abstract:
First-order methods are widely used to tackle data science and machine learning problems with complex structures, such as nonconvexity, nonsmoothness, and stochasticity. However, in many real-world scenarios, the problem structure and parameters are unknown or ambiguous, creating challenges for algorithm design and stepsize selection.
In this talk, I will present novel parameter-free methods that are universal in solving different classes of optimization problems without requiring prior knowledge of the problem parameters or resorting to any line search or backtracking procedures. In the first part of the talk, we propose a parameter-free projected gradient method for smooth optimization that attains the best-known unified complexity for convex and nonconvex problems. We then generalize the method to the stochastic setting, achieving new complexity bounds that are nearly optimal for both convex and nonconvex problems. In the second part of the talk, we focus on convex optimization and propose a uniformly optimal method for smooth, weakly smooth, and nonsmooth problems. The advantages of the proposed methods are demonstrated by encouraging numerical results.
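To illustrate the general idea of a parameter-free gradient method (this is a generic adaptive-stepsize sketch, not the specific method presented in the talk), one can estimate the stepsize from successive gradients instead of assuming the Lipschitz constant is known or running a line search:

```python
import math

def adaptive_gd(grad, x0, iters=300):
    """Gradient descent on a 1-D function with an adaptive stepsize.

    Illustrative sketch only: the stepsize is bounded by a local
    inverse-curvature estimate built from observed gradients, so no
    Lipschitz constant, line search, or backtracking is required.
    """
    x_prev = x0
    g_prev = grad(x_prev)
    step = 1e-6                      # tiny initial step; grows automatically
    x = x_prev - step * g_prev
    for _ in range(iters):
        g = grad(x)
        dx = abs(x - x_prev)
        dg = abs(g - g_prev)
        grow = math.sqrt(1.5) * step  # allow moderate geometric growth
        # cap the step by a local estimate of 1 / (2 * curvature)
        step = min(grow, dx / (2 * dg)) if dg > 0 else grow
        x_prev, g_prev = x, g
        x -= step * g
    return x
```

For a smooth convex example such as f(x) = (x - 3)^2, the estimated stepsize settles near the stable value 1/(2L) without L ever being supplied.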