All Dates/Times are Australian Eastern Standard Time (AEST)

Technical Program

Paper Detail

Paper ID D5-S5-T3.3
Paper Title First Order Methods take Exponential Time to Converge to Global Minimizers of Non-Convex Functions
Authors Krishna Reddy Kesari, Jean Honorio, Purdue University, United States
Session D5-S5-T3: Optimization
Chaired Session: Friday, 16 July, 23:20 - 23:40
Engagement Session: Friday, 16 July, 23:40 - 00:00
Abstract Machine learning algorithms typically perform optimization over a class of non-convex functions. In this work, we provide bounds on the fundamental hardness of identifying the global minimizer of a non-convex function. Specifically, we design a family of parametrized non-convex functions and employ statistical lower bounds for parameter estimation. We show that the parameter estimation problem is equivalent to the problem of identifying a function within the given family, and we then argue that non-convex optimization is at least as hard as function identification. Together, these results prove that any first-order method can take exponential time to converge to a global minimizer.
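The phenomenon the abstract describes can be illustrated with a minimal sketch (not the paper's construction): on a non-convex function with both a local and a global minimizer, plain gradient descent, a first-order method, converges to whichever basin it starts in, so it can miss the global minimizer entirely. The function f and step size below are illustrative choices.

```python
def f(x):
    # Toy non-convex function with two basins: a local minimum near
    # x ~ 0.96 and a strictly lower global minimum near x ~ -1.04.
    return (x**2 - 1) ** 2 + 0.3 * x

def grad_f(x):
    # Analytic gradient of f.
    return 4 * x * (x**2 - 1) + 0.3

def gradient_descent(x0, lr=0.01, steps=2000):
    # Plain first-order gradient descent from initial point x0.
    x = x0
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

# Starting on the right converges to the local minimizer;
# starting on the left converges to the global one.
x_local = gradient_descent(1.5)
x_global = gradient_descent(-1.5)
```

Here f(x_local) > f(x_global): the method is stuck at a suboptimal stationary point, and no first-order information at x_local reveals that a better minimizer exists elsewhere, which is the intuition behind reducing global optimization to identifying which function of a family one is facing.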