ACMS Statistics Seminar: Bayan Saparbayeva

Location: 154 Hurley Hall

Bayan Saparbayeva
University of Notre Dame

3:30 PM

Catalyst Acceleration for Non-convex Optimization on Manifolds

In this work, we propose an extension of a class of optimization algorithms to non-convex problems on manifolds. To accomplish this, we leverage ideas from recent work on the "Catalyst" algorithm for non-convex optimization on Euclidean spaces (Lin et al., 2017). The Catalyst scheme was originally designed to accelerate existing optimization algorithms for convex objective functions, but it is also amenable to non-convex functions. We apply the Catalyst acceleration scheme to many existing optimization routines, such as gradient descent, Newton-type methods, and algorithms designed for distributed parallel inference on manifolds, such as Saparbayeva et al. (2018). One can show that the algorithm generally converges to a stationary point and, in the case of strongly convex functions, converges to a local minimum. We will provide a convergence analysis of the algorithms and present applications for a large class of objective functions on various manifolds.
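To illustrate the flavor of the approach, here is a minimal sketch of the Euclidean Catalyst outer loop described above, using plain gradient descent as the inner solver. The function names, step sizes, and iteration counts are illustrative choices, not the speaker's implementation: Catalyst repeatedly (and approximately) minimizes the regularized subproblem f(x) + (κ/2)‖x − y‖² and then extrapolates between outer iterates.

```python
import numpy as np

def catalyst(grad_f, x0, kappa=1.0, beta=0.5,
             outer_iters=50, inner_iters=20, lr=0.1):
    """Sketch of Catalyst acceleration (assumed parameters).

    At each outer step, approximately solve the proximal subproblem
        min_x f(x) + (kappa / 2) * ||x - y||^2
    with a few inner gradient-descent steps, then take a momentum-style
    extrapolation step on the outer sequence of iterates.
    """
    x_prev = x0.copy()
    y = x0.copy()
    for _ in range(outer_iters):
        # Inner loop: gradient descent on the kappa-regularized subproblem.
        x = y.copy()
        for _ in range(inner_iters):
            x = x - lr * (grad_f(x) + kappa * (x - y))
        # Extrapolation step between consecutive outer iterates.
        y = x + beta * (x - x_prev)
        x_prev = x
    return x

# Usage: minimize f(x) = ||x||^2 (gradient 2x) from a nonzero start.
x_star = catalyst(lambda x: 2.0 * x, np.array([3.0, 4.0]))
```

On a manifold, the inner update and the extrapolation step would be replaced by their Riemannian counterparts (e.g., retractions in place of straight-line steps), which is the extension the talk addresses.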


Full List of Statistics Seminar Speakers

Originally published at acms.nd.edu.