A mathematical model for nonsmooth algorithmic differentiation with applications to machine learning
Topic: Optimization | All
Thursday 5th November, 2pm – 3pm.
SPEAKER
Edouard Pauwels, Toulouse 3 Paul Sabatier University, France.
ABSTRACT
We are interested in the nonsmooth analysis of algorithmic differentiation, a central building block of the learning phase implemented in modern deep learning software libraries such as TensorFlow or PyTorch. First, I will illustrate how the blind application of differential calculus to nonsmooth objects can be problematic, calling for a proper mathematical model. Then I will introduce a weak notion of generalized derivative, named conservativity, and illustrate how it complies with calculus and optimization for well-structured objects. We provide stability results for empirical risk minimization, similar to those in the smooth setting, for the combination of nonsmooth automatic differentiation, minibatch stochastic approximation and first-order optimization. This is joint work with Jérôme Bolte.
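A minimal PyTorch sketch, not taken from the talk, of the kind of inconsistency the abstract alludes to: three formulas that all define the identity map f(x) = x, yet reverse-mode automatic differentiation reports three different "derivatives" at the kink x = 0. The function names f1, f2, f3 are illustrative only, and the printed values depend on the library's subgradient convention for relu at 0 (0 in current PyTorch).

import torch

# Three mathematically identical ways to write f(x) = x via ReLU.
# The true derivative is f'(0) = 1, but autodiff differentiates the
# formula, not the function, so each formulation gives a different value.
def f1(x):
    return torch.relu(x) - torch.relu(-x)                      # = x

def f2(x):
    return x                                                   # = x

def f3(x):
    return 0.5 * (torch.relu(x) - torch.relu(-x)) + 0.5 * x    # = x

for f in (f1, f2, f3):
    x = torch.tensor(0.0, requires_grad=True)
    f(x).backward()
    print(f.__name__, x.grad.item())
# Typical output: f1 -> 0.0, f2 -> 1.0, f3 -> 0.5

Conservativity, as introduced in the talk, is a notion of generalized derivative weak enough to account for such formula-dependent outputs while still supporting calculus rules and convergence guarantees for first-order methods.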
BIO
Edouard Pauwels is an assistant professor at Toulouse 3 Paul Sabatier University, working between the Informatics and Mathematics institutes. Edouard received his PhD in November 2013 from the Centre for Computational Biology, Mines ParisTech, under the supervision of Professor Véronique Stoven. From January to September 2014, he was a postdoc in the MAC team at LAAS-CNRS. Between October 2014 and July 2015, he was a postdoc at the Technion, Israel.
Download Slides