Ph.D. Defense: Yuzixuan Zhu
Preprocessing and First-Order Primal-Dual Algorithms for Convex Optimization
Advisors: Dr. Gabor Pataki and Dr. Quoc Tran-Dinh
This talk focuses on two topics in the field of convex optimization: preprocessing algorithms for semidefinite programs (SDPs), and first-order primal-dual algorithms for convex-concave saddle-point problems.
In the first part of this talk, we introduce Sieve-SDP, a simple facial reduction algorithm for preprocessing SDPs. Sieve-SDP inspects the constraints of the problem to detect a lack of strict feasibility, deletes redundant rows and columns, and reduces the size of the variable matrix; it often detects infeasibility outright. It does not rely on any optimization solver: the only subroutine it needs is Cholesky factorization, so it can be implemented in a few lines of code and runs in machine precision. We present extensive computational results on several problem collections from the literature, with many of the SDPs coming from polynomial optimization.
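To illustrate the flavor of this kind of preprocessing, here is a minimal sketch of one Sieve-SDP-style test, assuming constraints of the form ⟨A_i, X⟩ = b_i with X positive semidefinite. The function names, tolerances, and the diagonal-support shortcut are illustrative choices for this sketch, not the implementation from the thesis.

```python
import numpy as np

def is_psd(A, tol=1e-12):
    """Test positive semidefiniteness via a Cholesky factorization of A + tol*I."""
    try:
        np.linalg.cholesky(A + tol * np.eye(A.shape[0]))
        return True
    except np.linalg.LinAlgError:
        return False

def sieve_step(As, b):
    """One pass of a Sieve-SDP-style reduction on constraints <A_i, X> = b_i, X psd.

    If some A_i is psd and b_i < 0, the SDP is infeasible.
    If some A_i is psd and b_i = 0, then <A_i, X> = 0 forces A_i X = 0, so the
    rows/columns of X on the support of A_i vanish and can be deleted.
    """
    n = As[0].shape[0]
    dead = set()
    for A, bi in zip(As, b):
        if is_psd(A):
            if bi < 0:
                return "infeasible", None
            if bi == 0:
                # For psd A, A[j, j] == 0 implies row/column j of A is zero,
                # so the nonzero diagonal entries mark the support of A.
                dead |= {j for j in range(n) if A[j, j] > 1e-12}
    return "reduced", sorted(dead)
```

Note that the only numerical work is the Cholesky attempt inside `is_psd`, which is what allows this style of preprocessing to avoid calling an optimization solver.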
In the second part, we develop two first-order primal-dual algorithms for a class of convex-concave saddle-point problems with a non-bilinear coupling function, a class that includes SDPs among many other special cases. Both algorithms are single-loop and have low per-iteration complexity. Our first algorithm achieves an O(1/k) convergence rate on the duality gap both in the ergodic (averaging) sense and in a semi-ergodic sense, i.e., non-ergodic (last-iterate) on the primal and ergodic on the dual. This rate can be further improved on the non-ergodic primal objective residual using a new parameter update rule. Under a strong convexity assumption, our second algorithm boosts these convergence rates to no slower than O(1/k^2). Our results can be specialized to handle general convex cone-constrained problems. We test our algorithms on applications such as two-player games and image processing, comparing them with existing methods.
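For intuition on the saddle-point setting and on ergodic averaging, here is a sketch of the classical projected gradient descent-ascent method with averaging, applied to the bilinear special case of a two-player zero-sum matrix game, min_x max_y x^T A y over probability simplices. This is a textbook baseline, not either of the algorithms from the thesis, and the step size and iteration count below are illustrative.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection onto the probability simplex."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css - 1)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def matrix_game(A, iters=4000, step=0.05):
    """Averaged projected gradient descent-ascent for min_x max_y x^T A y
    over probability simplices. The raw iterates may cycle, but the ergodic
    (averaged) iterates attain a vanishing duality gap."""
    m, n = A.shape
    # start from a vertex of each simplex (any feasible start works)
    x, y = np.zeros(m), np.zeros(n)
    x[0], y[0] = 1.0, 1.0
    x_sum, y_sum = np.zeros(m), np.zeros(n)
    for _ in range(iters):
        gx = A @ y            # partial gradient in x (descent direction)
        gy = A.T @ x          # partial gradient in y (ascent direction)
        x = project_simplex(x - step * gx)
        y = project_simplex(y + step * gy)
        x_sum += x
        y_sum += y
    return x_sum / iters, y_sum / iters
```

The duality gap of the averaged pair can be evaluated in closed form here, as max(A^T x̄) - min(A ȳ); the contrast between this O(1/k)-type ergodic guarantee and last-iterate behavior is exactly the ergodic/semi-ergodic distinction discussed above.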