Newton-type methods with complexity guarantees for nonconvex data science
Topic: Optimization
Date: 02/02/2023, from 16:00 to 17:00
Link:
https://mines-paristech.zoom.us/j/97017438759?pwd=SHZpRHNLOGFrWHpKUXQwTjlRZG1qQT09
Meeting ID: 97017438759
Password: 047769
SPEAKER
Clément Royer, Université Paris Dauphine-PSL
TITLE
Newton-type methods with complexity guarantees for nonconvex data science
ABSTRACT
Nonconvex optimization formulations are now commonly used in various areas of data science, from deep learning to matrix analysis and robust statistics. Such instances typically come with a great deal of structure that can be leveraged for optimization purposes. In particular, the solutions of these problems are often completely characterized by means of their first- and second-order derivatives. Consequently, the development of efficient optimization routines based on those derivatives represents an important area of research, wherein the theoretical efficiency of a method is measured by deriving complexity bounds. Such results quantify the effort required to reach an approximate solution, and are often used to compare optimization schemes. On the other hand, approaches that achieve the best theoretical results tend to depart from standard optimization frameworks, and may even be outperformed by textbook algorithms in practice.
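For orientation, the standard textbook way of formalizing these notions (a generic statement of the field's conventions, not a claim from the talk itself) is as follows. Given a smooth objective f and tolerances \epsilon_g, \epsilon_H > 0, an approximate second-order stationary point x is one satisfying

  \|\nabla f(x)\| \le \epsilon_g  \quad\text{and}\quad  \lambda_{\min}\!\big(\nabla^2 f(x)\big) \ge -\epsilon_H,

and a complexity bound then states the worst-case number of iterations (or derivative evaluations) needed to reach such a point. Second-order methods such as cubic regularization, and safeguarded Newton-CG variants, are known to achieve bounds on the order of O(max(\epsilon_g^{-3/2}, \epsilon_H^{-3})) iterations, which is the kind of guarantee referred to above.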
In this talk, we revisit popular numerical optimization frameworks to equip them with complexity guarantees while maintaining their practical appeal, as evidenced on certain data science problems. We focus on Newton-type frameworks that we equip with the best-known complexity guarantees for nonconvex optimization. We first propose a generic method that hews close to the classical Newton-conjugate gradient (Newton-CG) algorithm, and illustrate the practical impact of guaranteeing good complexity. We then propose a variant of our method tailored to specific nonconvex optimization landscapes, for which even stronger results can be derived. Finally, we review nonconvex problems arising in the development of neural network architectures, and the guarantees that can be sought on such instances.
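For attendees who wish to experiment beforehand, the classical Newton-CG framework the talk builds on is available off the shelf; below is a minimal sketch using SciPy's Newton-CG solver on a toy nonconvex problem. The objective and starting point are purely illustrative, and this is the textbook method, not the complexity-guaranteed variant presented in the talk.

import numpy as np
from scipy.optimize import minimize

# Toy nonconvex objective with a saddle at the origin and minimizers at (+/-1, 0):
# f(x) = x0^4/4 - x0^2/2 + x1^2/2
def f(x):
    return 0.25 * x[0]**4 - 0.5 * x[0]**2 + 0.5 * x[1]**2

def grad(x):
    return np.array([x[0]**3 - x[0], x[1]])

def hessp(x, v):
    # Newton-CG only needs Hessian-vector products, not the full Hessian.
    H = np.array([[3.0 * x[0]**2 - 1.0, 0.0],
                  [0.0, 1.0]])
    return H @ v

x0 = np.array([0.1, 1.0])  # start near the saddle point
res = minimize(f, x0, method="Newton-CG", jac=grad, hessp=hessp)
print(res.x)  # typically converges to a minimizer near (+/-1, 0)

Note that the Hessian at the origin has a negative eigenvalue, so the problem is genuinely nonconvex; handling such negative-curvature directions with provable complexity is precisely what distinguishes the methods discussed in the talk from this textbook routine.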
BIO
Clément W. Royer is an associate professor of computer science at Université Paris Dauphine-PSL. He received his Ph.D. from the University of Toulouse, France, and was then a postdoctoral research associate at the Wisconsin Institute for Discovery, University of Wisconsin-Madison, USA. His research activities revolve around numerical optimization and its applications, especially in complex systems and data science. He currently holds a springboard chair in optimization at the Paris Artificial Intelligence Research InstitutE (PRAIRIE).