
Dr. Quoc Tran-Dinh received NSF grant on "Efficient methods for large-scale self-concordant convex minimization"

February 15, 2017

Recent progress in modern convex optimization provides a powerful tool for scientific discovery. In theory, many classical convex optimization models have well-understood structures and hence can be solved efficiently by state-of-the-art algorithms. In practice, however, modern applications present a host of increasingly large-scale and nonsmooth models that can render these methods impractical. Fortunately, recent advances in convex optimization offer a surprising new angle from which to fundamentally re-examine the theory and practice of large-scale problems in a unified fashion. This project focuses on exploiting and generalizing a prominent concept known as self-concordance to develop new, efficient convex optimization techniques for two classes of large-scale convex optimization problems. The work is organized into three interdisciplinary work packages (WPs).

WP1. Composite self-concordant convex optimization: While existing convex optimization methods essentially rely on the Lipschitz-gradient assumption, the PI instead focuses on the self-concordance structure and its generalizations. This concept is key to the theory of interior-point methods, but it has remained unexploited in composite minimization. Building on this structure, the PI will develop novel and provable convex optimization algorithms for solving several subclasses of large-scale composite convex problems.
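For reference, a generic composite model of the kind WP1 targets (an illustrative template, not necessarily the PI's exact formulation) can be written as

\min_{x \in \mathbb{R}^n} \; F(x) := f(x) + g(x),

where g is convex, possibly nonsmooth, but proximally tractable, and f is smooth and self-concordant, i.e., along every line it satisfies

|\varphi'''(t)| \le 2\,\varphi''(t)^{3/2}, \qquad \varphi(t) := f(x + t\,d).

In particular, f need not have a Lipschitz-continuous gradient, which is precisely the setting where Lipschitz-gradient-based methods break down.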

WP2. Constrained convex optimization involving self-concordant barriers: Many constrained convex applications involve a feasible set endowed with a self-concordant barrier, while their remaining convex constraints often have a "simple" structure. Existing general-purpose convex algorithms solve these problems mainly by employing either a standard interior-point method or an augmented Lagrangian framework. The PI instead concentrates on exploiting the special structure of these problems, combining it with both the interior-point idea and the proximal framework to develop new, scalable algorithms that come with rigorous convergence guarantees and admit parallel and distributed implementations.
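For illustration only (the announcement does not spell out the exact template), a constrained problem of this type can be sketched as

\min_{x \in \mathbb{R}^n} \; f(x) + g(x) \quad \text{subject to} \quad A x = b, \;\; x \in \mathcal{K},

where \mathcal{K} is a convex set equipped with a self-concordant barrier (for example, -\sum_{i}\log x_i for the nonnegative orthant, or -\log\det X for the positive semidefinite cone) and g is a "simple" convex term handled through its proximal operator. An interior-point method absorbs \mathcal{K} through its barrier, while proximal and augmented Lagrangian techniques handle g and the linear constraints; WP2 aims to combine these ingredients into a single scalable scheme.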

WP3. Implementation and applications: This WP aims to investigate the implementation aspects of the PI's algorithms and to upgrade his SCOPT solver. The theory and methods developed in WP1 and WP2 will be validated on three concrete applications: Poisson imaging, graph learning, and max-cut-type problems. Although these applications differ, their underlying convex formulations share two features: (i) the objective does not have a Lipschitz-continuous gradient but does possess a self-concordance structure, and (ii) the problem dimension can easily reach several billion variables.
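As a toy illustration of feature (i), the sketch below minimizes a Poisson-type objective f(x) = \sum_i (a_i^T x - y_i \log(a_i^T x)), whose gradient is not Lipschitz continuous but which is self-concordant (when y_i \ge 1), using the classical damped Newton step. This is a minimal Python sketch with made-up data (A, y, x0, and the function names damped_newton and poisson_objective are illustrative); it is not the SCOPT solver or the PI's algorithms.

# Minimal, illustrative damped-step Newton sketch for a Poisson-type objective
#   f(x) = sum_i ( a_i^T x - y_i * log(a_i^T x) ),
# which has a non-Lipschitz gradient but a (generalized) self-concordant structure.
# Toy example only; not the SCOPT solver. A, y, and x0 below are made-up data.

import numpy as np

def poisson_objective(A, y, x):
    z = A @ x                      # linear measurements a_i^T x (must stay positive)
    return np.sum(z - y * np.log(z))

def damped_newton(A, y, x0, tol=1e-8, max_iter=100):
    x = x0.copy()
    for _ in range(max_iter):
        z = A @ x                              # a_i^T x > 0 is assumed throughout
        grad = A.T @ (1.0 - y / z)             # gradient of f
        H = A.T @ ((y / z**2)[:, None] * A)    # Hessian: sum_i y_i a_i a_i^T / (a_i^T x)^2
        step = np.linalg.solve(H, grad)        # Newton direction H^{-1} grad
        lam = np.sqrt(grad @ step)             # Newton decrement
        if lam <= tol:
            break
        x = x - step / (1.0 + lam)             # damped step keeps iterates in the domain
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.uniform(0.1, 1.0, size=(200, 20))  # nonnegative design matrix (toy data)
    x_true = rng.uniform(0.5, 1.5, size=20)
    y = rng.poisson(A @ x_true) + 1.0          # +1 keeps counts >= 1 in this toy setup
    x0 = np.ones(20)
    x_hat = damped_newton(A, y, x0)
    print("objective:", poisson_objective(A, y, x_hat))

The step length 1/(1 + \lambda), where \lambda is the Newton decrement, is what self-concordance buys: it guarantees the next iterate stays in the domain and the objective decreases, with no Lipschitz constant or line search required.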