
ISyE – Auto-Conditioned First-Order and Stochastic Optimization Methods

January 15 @ 12:00 PM – 1:00 PM

In this talk, I will present a novel class of first-order methods, termed auto-conditioned methods, that are universal for solving various classes of optimization problems without requiring prior knowledge of problem parameters or resorting to any line search or backtracking procedures. In the first part of the talk, we focus on convex optimization and propose a uniformly optimal method for smooth, weakly smooth, and nonsmooth problems. In the second part of the talk, we consider smooth but possibly nonconvex optimization, and propose a novel parameter-free projected gradient method with the best-known unified complexity for convex and nonconvex problems. We then generalize the method to the stochastic setting, achieving new universal complexity bounds that are nearly optimal for both convex and nonconvex problems. The advantages of the proposed methods are demonstrated by encouraging numerical results.
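To make the idea of a line-search-free, parameter-free method concrete, here is a minimal illustrative sketch (not the algorithm from the talk): a gradient method that estimates the local smoothness constant on the fly from successive gradients, so the user supplies no Lipschitz constant and no backtracking procedure. The function names and the conservative 1/(2L) step rule are assumptions for illustration only.

```python
import numpy as np

def adaptive_gd(grad, x0, iters=200):
    """Gradient descent with a step size set from an on-the-fly
    estimate of the local smoothness (Lipschitz) constant, so no
    problem parameters or line search are supplied by the user.
    Illustrative sketch only -- not the method from the talk."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    L = 1.0                     # initial smoothness guess
    x_new = x - g / L           # first tentative step
    for _ in range(iters):
        g_new = grad(x_new)
        dx, dg = x_new - x, g_new - g
        step_len = np.linalg.norm(dx)
        if step_len > 0:
            # local Lipschitz estimate: ||grad change|| / ||point change||
            L = max(np.linalg.norm(dg) / step_len, 1e-12)
        x, g = x_new, g_new
        x_new = x - g / (2.0 * L)   # conservative 1/(2L) step
    return x_new

# Example: minimize the smooth convex quadratic
# f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, 2.0])
x_star = np.linalg.solve(A, b)          # exact minimizer
x_hat = adaptive_gd(lambda x: A @ x - b, np.zeros(2))
```

For this quadratic the smoothness estimate stays between the extreme eigenvalues of A, so the 1/(2L) step keeps the iteration stable without any tuning; the iterate `x_hat` approaches the exact minimizer `x_star`.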

1513 Engineering Dr.
Madison, WI 53706 United States

Bio: Tianjiao Li is a postdoctoral associate at the MIT Sloan School of Management and Operations Research Center. He received his Ph.D. in Operations Research from the H. Milton Stewart School of Industrial and Systems Engineering at Georgia Tech, where he was advised by Prof. George Lan and Prof. Ashwin Pananjady. His research interests lie in the theory and methodology of nonlinear optimization, stochastic optimization, and reinforcement learning, with a central focus on bridging rigorous theoretical development with practical relevance, especially in data science and artificial intelligence. His work received an honorable mention in the INFORMS George Nicholson Student Paper Competition and second place in the INFORMS Optimization Society Student Paper Award.