Research

The research conducted by the members of the OptML group focuses on the design, analysis, and implementation of numerical methods for solving large-scale optimization problems arising in machine learning applications.

Katya Scheinberg and her students focus on developing practical algorithms, and their theoretical analyses, for solving various problems in continuous optimization, including those that arise in convex optimization, derivative-free optimization, machine learning, quadratic optimization, and more. Her book Introduction to Derivative-Free Optimization, co-authored with Andrew R. Conn and Luis N. Vicente, won the Lagrange Prize in Continuous Optimization, awarded jointly by the Mathematical Optimization Society (MOS) and the Society for Industrial and Applied Mathematics (SIAM). Her recent research concerns the analysis of probabilistic methods in derivative-free optimization and beyond.

Frank E. Curtis and his students’ research revolves around the design of algorithms for solving optimization problems of various types: large-scale nonlinear, PDE-constrained, (nonconvex) nonsmooth, stochastic, and potentially infeasible nonlinear problems. He is a recipient of an Early Career Award in Advanced Scientific Computing Research (ASCR) from the US Department of Energy (DOE). His recent research related to the OptML group involves the design and analysis of deterministic and stochastic methods for solving nonconvex optimization problems.

Martin Takáč and his students’ research interests include convex and nonconvex optimization, the design and analysis of algorithms, parallel and distributed computing, GPU computing, and machine learning.