The research conducted by the members of the OptML group focuses on the design, analysis, and implementation of numerical methods for solving large-scale optimization problems arising in machine learning applications.
Frank E. Curtis and his students’ research revolves around the design of algorithms for solving optimization problems of various types: large-scale nonlinear, PDE-constrained, (nonconvex) nonsmooth, stochastic, and potentially infeasible nonlinear problems. He is a recipient of an Early Career Award in Advanced Scientific Computing Research (ASCR) from the US Department of Energy (DOE). His recent research related to the OptML group involves the design and analysis of deterministic and stochastic methods for solving nonconvex optimization problems.
Daniel P. Robinson and his students focus on designing practical algorithms for large-scale continuous optimization problems. His research in the OptML group relates broadly to nonconvex optimization, and more narrowly to algorithms for solving problems arising in unsupervised learning (e.g., clustering algorithms) and supervised learning (e.g., using nonsmooth regularizers to promote sparsity and other important solution features).
Martin Takáč and his students’ research interests include convex and nonconvex optimization, the design and analysis of algorithms, parallel and distributed computing, GPU computing, and machine learning.
Luis Nunes Vicente has research interests spanning continuous optimization, computational science and engineering, and machine learning and data science. His book Introduction to Derivative-Free Optimization, co-authored with Andrew R. Conn and Katya Scheinberg, won the Lagrange Prize in Continuous Optimization, awarded jointly by the Mathematical Optimization Society (MOS) and the Society for Industrial and Applied Mathematics (SIAM).