Numerical methods for unconstrained nonlinear programming and nonlinear least-squares. Stochastic gradient-type methods for optimization problems in machine learning.
Basic Python instructions and scipy.optimize module.
Applications to the training of neural networks and other learning models.
Matlab Optimization Toolbox.
J. Nocedal, S.J. Wright, "Numerical Optimization", 2nd ed., Springer, 2006.
I. Goodfellow, Y. Bengio, A. Courville, "Deep Learning", The MIT Press, 2016.
G. Strang, "Linear Algebra and Learning from Data", Wellesley-Cambridge Press, 2019.
Learning Objectives
The aim of the course is to introduce the students to the numerical solution of unconstrained optimization problems.
By the end of the course, students will know the main numerical optimization methods and will be able to analyse their theoretical background, with a focus on unconstrained optimization problems arising in machine learning.
Prerequisites
Recommended courses: first-level courses in Mathematical Analysis, Numerical Analysis, and Probability.
The exam consists of an oral examination on the topics covered in the course and a report on one of the activities carried out during the lab sessions.
The oral exam tests the depth of the students' understanding of the subjects and their ability to explain, defend, reflect on, critically evaluate, and possibly improve their work;
for the report, students are asked to assess the computational performance of one of the studied methods.
The evaluation covers the student's ability to test the performance of the methods in terms of robustness, memory requirements, and computational cost.
Course program
Unconstrained nonlinear programming: optimality conditions. Gradient method and conjugate gradient method for quadratic objective functions. Gradient method (with complexity analysis), Newton method, and quasi-Newton methods for general nonlinear objective functions.
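As an illustration of the quadratic case, the conjugate gradient method applied to f(x) = (1/2) xᵀAx − bᵀx with A symmetric positive definite amounts to solving Ax = b. A minimal pure-Python sketch follows; the 2×2 system is an illustrative example, not taken from the course material:

```python
def conjugate_gradient(A, b, tol=1e-10):
    """Conjugate gradient method for A x = b, A symmetric positive definite."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                 # residual r = b - A x (x = 0 initially)
    p = r[:]                 # first search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(n):       # at most n steps in exact arithmetic
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new ** 0.5 < tol:
            break
        # New direction is A-conjugate to the previous ones
        p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
        rs_old = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]   # symmetric positive definite (illustrative)
b = [1.0, 2.0]
x = conjugate_gradient(A, b)   # solves A x = b
```

For this system the exact solution is x = (1/11, 7/11), reached in two CG iterations.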
Gauss-Newton method for nonlinear least-squares problems.
Line-search globalization techniques.
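The gradient method globalized with a backtracking (Armijo) line search can be sketched as follows; the test function and the constants (alpha0, rho, c1) are illustrative choices, not values prescribed by the course:

```python
# Gradient method with Armijo backtracking on f(x, y) = x^2 + 5 y^2.

def f(x):
    return x[0] ** 2 + 5.0 * x[1] ** 2

def grad(x):
    return [2.0 * x[0], 10.0 * x[1]]

def gradient_descent_armijo(x, tol=1e-8, max_iter=1000,
                            alpha0=1.0, rho=0.5, c1=1e-4):
    for _ in range(max_iter):
        g = grad(x)
        g_norm2 = sum(gi * gi for gi in g)
        if g_norm2 ** 0.5 < tol:      # stop when the gradient is small
            break
        # Backtrack until the sufficient-decrease (Armijo) condition holds:
        #   f(x - alpha g) <= f(x) - c1 * alpha * ||g||^2
        alpha = alpha0
        fx = f(x)
        while f([xi - alpha * gi for xi, gi in zip(x, g)]) > fx - c1 * alpha * g_norm2:
            alpha *= rho
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
    return x

x_star = gradient_descent_armijo([3.0, -2.0])  # converges to the minimizer (0, 0)
```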
Optimization problems in machine learning: introduction to neural networks,
classification problems, and the optimization problems arising in neural network training.
Stochastic gradient method, variance-reduction approaches, Adam. Convergence and complexity analysis in expectation.
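A minimal sketch of the stochastic gradient method on a one-parameter least-squares problem, min over w of (1/n) Σᵢ (w xᵢ − yᵢ)²; the synthetic data, step size, and epoch count are illustrative assumptions:

```python
import random

random.seed(0)
n = 200
true_w = 3.0
xs = [random.uniform(-1.0, 1.0) for _ in range(n)]
ys = [true_w * x for x in xs]          # noiseless synthetic data, for clarity

def sgd(w=0.0, step=0.1, epochs=50):
    for _ in range(epochs):
        for i in random.sample(range(n), n):   # one pass in random order
            # Stochastic gradient of the i-th term: 2 (w x_i - y_i) x_i
            g = 2.0 * (w * xs[i] - ys[i]) * xs[i]
            w -= step * g
    return w

w_hat = sgd()   # converges to the true parameter w = 3
```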
Introduction to the Python language: basic instructions, functions, modules. The scipy.optimize module and applications to classification problems. Numerical solution of neural network training problems via Keras.
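An unconstrained minimization with scipy.optimize might look like the following (assuming SciPy is installed); the Rosenbrock test function and the choice of BFGS are illustrative, not part of the course material:

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

# Minimize the Rosenbrock function with BFGS, supplying the analytic gradient.
x0 = np.array([-1.2, 1.0])
res = minimize(rosen, x0, method="BFGS", jac=rosen_der)
# res.x approximates the minimizer (1, 1); res.success reports convergence.
```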
Matlab Optimization Toolbox.
Sustainable Development Goals 2030
Goal 9: Industry, innovation and infrastructure.
In particular, Target 9.5: enhance research and upgrade industrial technologies.
Enhance scientific research and upgrade the technological capabilities of industrial sectors in all countries, including, by 2030, encouraging innovation.