STOR Colloquium: Ion Necoara, Polytechnic University of Bucharest
Conditions for linear convergence of (stochastic) first order methods
For convex optimization problems, deterministic first order methods achieve linear convergence when the objective function is smooth and strongly convex. Under the same conditions, smoothness and strong convexity, sublinear convergence rates have been derived for stochastic first order methods. In many applications (machine learning, statistics, control, signal processing), however, the strong convexity condition does not hold, yet the objective function still has special structure. In this talk we replace the smoothness and strong convexity conditions with several less conservative conditions under which we can prove that several (stochastic) first order methods converge linearly. We also provide necessary conditions for linear convergence of the (stochastic) gradient method. Finally, we discuss several applications of these results (the Lasso problem, linear systems, linear programming, convex feasibility, etc.).
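As a minimal illustration of the classical setting the talk starts from (not code from the talk itself), the sketch below runs gradient descent with step size 1/L on a smooth, strongly convex quadratic; standard theory says the distance to the minimizer contracts by a constant factor per iteration, i.e. converges linearly. All problem data here are made up for demonstration.

```python
import numpy as np

# Hypothetical example: f(x) = 0.5 * x^T A x - b^T x with A symmetric
# positive definite, so f is smooth (constant L = largest eigenvalue of A)
# and strongly convex (constant mu = smallest eigenvalue of A).
rng = np.random.default_rng(0)
Q = rng.standard_normal((5, 5))
A = Q.T @ Q + np.eye(5)          # positive definite => strongly convex
b = rng.standard_normal(5)
x_star = np.linalg.solve(A, b)   # unique minimizer of f

L = np.linalg.eigvalsh(A)[-1]    # smoothness constant (largest eigenvalue)
x = np.zeros(5)
errors = []
for _ in range(200):
    grad = A @ x - b             # gradient of f at x
    x = x - grad / L             # gradient step with step size 1/L
    errors.append(np.linalg.norm(x - x_star))

# Linear convergence: each iteration shrinks the error by a factor
# bounded away from 1 (at most 1 - mu/L for this quadratic).
ratios = [errors[k + 1] / errors[k] for k in range(50)]
print(max(ratios))
```

Dropping strong convexity (e.g. a singular `A`, as in Lasso-type or linear-system problems) breaks this argument, which is exactly the gap the relaxed conditions in the talk are meant to fill.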
Refreshments will be served at 3:00pm in the 3rd floor lounge of Hanes Hall