318 Hanes Hall, CB #3260 Chapel Hill, NC 27599-3260
919-962-1329

Ph.D. Defense: Tianxiao Sun

August 20, 2018 @ 2:00 pm - 4:00 pm

Ph.D. Thesis Defense

Public Presentation

Monday, August 20th, 2018

125 Hanes Hall
2:00 PM

Tianxiao Sun


Newton-type methods under generalized self-concordance and inexact oracles

(Under the direction of Dr. Quoc Tran-Dinh and Dr. Shu Lu)

Many modern applications in machine learning, image and signal processing, and statistics require solving large-scale convex optimization problems. These problems share common challenges such as high dimensionality, nonsmoothness, and complex objectives and constraints. Because of these challenges, the theoretical assumptions of existing numerical methods are often not satisfied. It is also impractical in many cases to perform exact computations (e.g., because of noisy computations or storage and time limitations). Therefore, new approaches, as well as algorithms that accommodate inexact computations, should be considered. In this thesis, we develop fundamental theory and numerical methods, especially second-order methods, for classes of convex optimization problems where first-order methods are inefficient or lack theoretical guarantees. We aim to exploit the underlying smoothness structure of the problem to design novel Newton-type methods. More specifically, we generalize the powerful concept of self-concordance, introduced by Nesterov and Nemirovski, to a broader class of convex functions. We develop several basic properties of this concept and prove key estimates for function values and their derivatives. We then apply our theory to design different Newton-type methods, including damped-step Newton methods, full-step Newton methods, and proximal Newton methods. Our new theory allows us to establish both global and local convergence guarantees for these methods without imposing the unverifiable conditions required by classical Newton-type methods. Numerical experiments show that our approach has several advantages over existing work.

In the second part of this thesis, we introduce new global and local inexact oracle settings and apply them to develop inexact proximal Newton-type schemes for general composite convex problems equipped with such inexact oracles. These schemes allow us to measure errors theoretically and systematically while still achieving the desired convergence results. Moreover, they can be applied to a wider class of applications arising in statistics and machine learning.
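To give a flavor of the damped-step Newton methods the abstract mentions, here is a minimal sketch using the classical self-concordant step size 1/(1 + λ), where λ is the Newton decrement. This is an illustration of the general idea under standard assumptions, not the algorithms developed in the thesis; the regularized logistic regression objective and all variable names below are chosen for the example only.

```python
import numpy as np

def damped_newton(grad, hess, x0, tol=1e-8, max_iter=100):
    """Damped-step Newton method: take the Newton direction with
    step size 1/(1 + lambda_k), where lambda_k is the Newton
    decrement. No line search is needed in the self-concordant
    analysis (illustrative sketch only)."""
    x = x0.copy()
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        d = np.linalg.solve(H, -g)      # Newton direction
        lam = np.sqrt(d @ H @ d)        # Newton decrement
        if lam < tol:
            break
        x = x + d / (1.0 + lam)         # damped step
    return x

# Example objective: l2-regularized logistic regression, a standard
# instance of a generalized self-concordant function (synthetic data).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 5))
y = np.sign(rng.standard_normal(40))
mu = 1e-2

def grad(x):
    s = 1.0 / (1.0 + np.exp(y * (A @ x)))
    return -(A.T @ (y * s)) + mu * x

def hess(x):
    s = 1.0 / (1.0 + np.exp(y * (A @ x)))
    W = s * (1.0 - s)
    return (A * W[:, None]).T @ A + mu * np.eye(A.shape[1])

x_star = damped_newton(grad, hess, np.zeros(5))
print(np.linalg.norm(grad(x_star)))  # small at the optimum
```

The damped step guarantees a fixed decrease in the objective far from the solution, and the method automatically transitions to nearly full Newton steps (and fast local convergence) once the decrement is small.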


