Department of Mathematics,
University of California San Diego
****************************
Math 295 - Mathematics Colloquium
Michael Overton
Courant Institute of Mathematical Sciences
Nonsmooth, Nonconvex Optimization
Abstract:
There are many algorithms for minimization when the objective function is differentiable, convex, or has some other known structure, but few options when none of the above hold, particularly when the objective function is nonsmooth at minimizers, as is often the case in applications. We describe two simple algorithms for minimization of nonsmooth, nonconvex functions. Gradient Sampling is a relatively new method that, although computationally intensive, has a nice convergence theory. The method is robust and the convergence theory has recently been extended to constrained problems. BFGS is an old method, developed for smooth problems, for which we have very limited theoretical results, but some remarkable empirical observations, extensive success in applications, and a rather bold conjecture. Limited Memory BFGS is a popular extension for large problems, and it too is applicable to the nonsmooth case, although our experience with it is more mixed.
Hosts: Philip Gill, Bill Helton, and Jiawang Nie
February 4, 2010
3:00 PM
AP&M 6402
****************************