BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//UNC Statistics & Operational Research - ECPv5.16.0//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:UNC Statistics & Operational Research
X-ORIGINAL-URL:https://stor.unc.edu
X-WR-CALDESC:Events for UNC Statistics & Operational Research
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20210314T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20211107T060000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20210201T153000
DTEND;TZID=America/New_York:20210201T163000
DTSTAMP:20220811T184613Z
CREATED:20210125T092132Z
LAST-MODIFIED:20210125T092132Z
UID:9954-1612193400-1612197000@stor.unc.edu
SUMMARY:Colloquium: Dmitriy Drusvyatskiy\, University of Washington
DESCRIPTION:Dmitriy Drusvyatskiy\nUniversity of Washington at Seattle \nStochastic methods for nonsmooth nonconvex optimization \nStochastic iterative methods lie at the core of large-scale optimization and its modern applications to data science. Though such algorithms are routinely and successfully used in practice on highly irregular problems (e.g. deep neural networks)\, few performance guarantees are available outside of smooth or convex settings. In this talk\, I will describe a framework for designing and analyzing stochastic methods on a large class of nonsmooth and nonconvex problems\, with provable efficiency guarantees. The problem class subsumes such important tasks as phase retrieval\, robust PCA\, and minimization of risk measures\, while the methods include stochastic subgradient\, Gauss-Newton\, and proximal point iterations. The main thread of the proposed framework is appealingly intuitive. I will show that a wide variety of stochastic methods can be interpreted as inexact gradient descent on an implicit smoothing of the problem. Optimal learning rates and novel sample-complexity guarantees (for various signal and matrix recovery problems) follow quickly from this viewpoint. \nBiosketch: Dmitriy Drusvyatskiy received his PhD from the Operations Research and Information Engineering department at Cornell University in 2013\, followed by a postdoctoral appointment in the Combinatorics and Optimization department at Waterloo\, 2013-2014. He joined the Mathematics department at University of Washington as an Assistant Professor in 2014\, and was promoted to Associate Professor in 2019. Dmitriy's research broadly focuses on designing and analyzing algorithms for large-scale optimization problems\, primarily motivated by applications in data science.
 \nDmitriy has received a number of awards\, including the Air Force Office of Scientific Research (AFOSR) Young Investigator Program (YIP) Award\, an NSF CAREER Award\, the INFORMS Optimization Society Young Researcher Prize 2019\, and finalist citations for the Tucker Prize 2015 and the Young Researcher Best Paper Prize at ICCOPT 2019. Dmitriy is currently a co-PI of the NSF-funded Transdisciplinary Research in Principles of Data Science (TRIPODS) institute at University of Washington.
URL:https://stor.unc.edu/event/colloquium-dmitriy-drusvyatskiy-university-of-washington/
LOCATION:zoom
CATEGORIES:STOR Colloquium
END:VEVENT
END:VCALENDAR