Smoothness in Nonsmooth Optimization
Adrian S. Lewis, Cornell University
Fast black-box nonsmooth optimization, while theoretically out of reach in the worst case, has long been an intriguing goal in practice. Generic concrete nonsmooth objectives are "partly" smooth: their subdifferentials have locally smooth graphs with powerful constant-rank properties, often associated with hidden structure in the objective. One typical example is the proximal mapping for the matrix numerical radius, whose output is surprisingly often a "disk" matrix. Motivated by this expectation of partial smoothness, this talk describes a Newtonian black-box algorithm for general nonsmooth optimization. Local convergence is provably superlinear on a representative class of objectives, and early numerical experience is promising more generally.
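For readers unfamiliar with the numerical radius mentioned above, a minimal sketch (not part of the talk) of computing it numerically, using the standard characterization r(A) = max over theta of the largest eigenvalue of the Hermitian part of e^{i theta} A:

```python
import numpy as np

def numerical_radius(A, n_angles=720):
    """Approximate the numerical radius r(A) = max |x* A x| over unit x.

    Uses the standard identity
        r(A) = max_theta lambda_max( (e^{i theta} A + e^{-i theta} A*) / 2 ),
    evaluated on a grid of angles; accuracy improves with n_angles.
    """
    r = 0.0
    for theta in np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False):
        # Hermitian part of the rotated matrix e^{i theta} A
        H = (np.exp(1j * theta) * A + np.exp(-1j * theta) * A.conj().T) / 2
        r = max(r, np.linalg.eigvalsh(H).max())
    return r
```

For the 2x2 nilpotent matrix with a single 1 above the diagonal, this returns 1/2, the classical value; for Hermitian matrices it recovers the spectral radius. The proximal mapping of this function, discussed in the talk, has no closed form and is where the "disk" matrix structure emerges.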
Joint work with Xiaoyan Han, Jingwei Liang, Michael Overton, and Calvin Wylie.
Adrian Lewis received his B.A., M.A., and Ph.D. degrees from Cambridge University, U.K. After faculty positions at the University of Waterloo and Simon Fraser University in Canada, he joined Cornell University in 2004 as a Professor in the School of Operations Research and Information Engineering, where he completed a three-year term as Director in 2013. His research concerns nonsmooth optimization and variational analysis. He has authored nearly 100 refereed publications and a book, and he is Co-Editor of Mathematical Programming. He received the 1995 Aisenstadt Prize, the 2003 Lagrange Prize, and a 2005 SIAM Outstanding Paper Prize, and he is a SIAM Fellow. He was an invited section speaker at the 2014 International Congress of Mathematicians in Seoul.