Credits
6
Types
Elective
Requirements
This subject has no prerequisites, but it does assume some prior background (previous capacities)
Department
EIO
Teachers
Person in charge
- Jordi Castro Pérez ( jordi.castro@upc.edu )
Others
- F. Javier Heredia Cervera ( f.javier.heredia@upc.edu )
- Jessica Rodríguez Pereira ( jessica.rodriguez@upc.edu )
Weekly hours
Theory
4
Problems
0
Laboratory
0
Guided learning
0
Autonomous learning
7.53
Competences
Information literacy
Third language
Basic
Generic
Specific
Objectives
- Discerning what an optimization problem is and which type it belongs to, and having a basic knowledge of optimization algorithms.
  Related competences: CB10, CT5, CE1
- Optimization of neural networks and support vector machines.
  Related competences: CB10, CT5, CE1
- Modelling languages. Formulating optimization problems and representing them through a modelling language. Choosing an adequate solver for a given problem.
  Related competences: CB6, CB7, CT4, CT5, CG2
Contents
- Unconstrained Optimization
  - Optimality conditions. Convexity. Descent directions.
  - Line search. Acceptability of step sizes.
  - General minimization algorithm.
  - Gradient method. Rate of convergence.
  - Newton's method. Factorizations to ensure convergence.
  - Quasi-Newton methods.
  - Optimization of neural networks.
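The general descent scheme listed above (descent direction, acceptable step size, iterate) can be sketched in a few lines. This is an illustrative example, not course material: the gradient method with a backtracking (Armijo) line search, applied to a toy strictly convex quadratic.

```python
import numpy as np

def gradient_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    """Gradient method with backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:     # stationarity test
            break
        d = -g                          # steepest-descent direction
        alpha, c, rho = 1.0, 1e-4, 0.5
        # Backtrack until the Armijo sufficient-decrease condition holds
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x = x + alpha * d
    return x

# Toy quadratic f(x) = 1/2 x'Qx - b'x; the minimizer solves Qx = b
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ Q @ x - b @ x
grad = lambda x: Q @ x - b
x_star = gradient_descent(f, grad, np.zeros(2))
```

Replacing `d = -g` with the Newton direction (solving a factorized system with the Hessian) gives the Newton method covered later in this block.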
- Constrained Optimization and Support Vector Machines
  - Introduction to the modelling language AMPL.
  - Introduction to Support Vector Machines (SVM).
  - KKT optimality conditions of constrained optimization. Optimality conditions of SVM.
  - Duality in optimization. The dual of the SVM.
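As a minimal sketch of the duality item above (the exact formulation used in class may differ): the hard-margin SVM dual is a quadratic program, max Σα_i − ½ ΣΣ α_i α_j y_i y_j x_i·x_j subject to α ≥ 0 and Σ α_i y_i = 0, and the primal hyperplane is recovered from the dual solution via the KKT conditions. Here it is solved on a tiny 1-D dataset with a generic solver.

```python
import numpy as np
from scipy.optimize import minimize

# Toy 1-D separable data: x = 0 with label -1, x = 2 with label +1
X = np.array([[0.0], [2.0]])
y = np.array([-1.0, 1.0])
G = (y[:, None] * X) @ (y[:, None] * X).T   # G_ij = y_i y_j x_i'x_j

# Dual (as a minimization): min 1/2 a'Ga - sum(a), a >= 0, a'y = 0
obj = lambda a: 0.5 * a @ G @ a - a.sum()
res = minimize(obj, np.zeros(2), method='SLSQP',
               bounds=[(0, None)] * 2,
               constraints={'type': 'eq', 'fun': lambda a: a @ y})
a = res.x
w = ((a * y)[:, None] * X).sum(axis=0)      # primal weights: w = sum a_i y_i x_i
sv = a > 1e-6                               # support vectors have a_i > 0
b = np.mean(y[sv] - X[sv] @ w)              # KKT: y_i (w'x_i + b) = 1 on SVs
```

For this dataset the solver returns α = (1/2, 1/2), w = 1 and b = −1, i.e. the maximum-margin separator x = 1, which can be checked by hand from the KKT conditions.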
- Integer Programming
  - Modelling problems with binary variables.
  - The branch-and-bound algorithm for integer programming.
  - Gomory's cutting-planes algorithm.
  - Minimum spanning tree problem and the algorithms of Kruskal and Prim. Application to clustering.
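To illustrate the last item (an informal sketch, not the course's reference implementation): Kruskal's algorithm builds a minimum spanning tree by scanning edges in increasing weight order and keeping those that join two different components, tracked with a union-find structure. Single-linkage clustering then follows by deleting the k−1 heaviest MST edges to leave k clusters.

```python
def kruskal(n, edges):
    """Kruskal's MST algorithm; edges is a list of (weight, u, v)."""
    parent = list(range(n))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]   # path compression
            u = parent[u]
        return u
    mst = []
    for w, u, v in sorted(edges):           # scan edges by increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                        # keep edge only if it joins two trees
            parent[ru] = rv
            mst.append((w, u, v))
    return mst

# 4 points on a line at 0, 1, 10, 11: complete graph with distance weights
pts = [0.0, 1.0, 10.0, 11.0]
edges = [(abs(pts[i] - pts[j]), i, j)
         for i in range(4) for j in range(i + 1, 4)]
mst = kruskal(4, edges)

# Single-linkage clustering: drop the k-1 heaviest MST edges -> k clusters
k = 2
kept = mst[: len(mst) - (k - 1)]            # mst is already weight-sorted
```

Dropping the single heaviest edge (weight 9, between points 1 and 10) leaves the two natural clusters {0, 1} and {10, 11}.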
Activities
Unconstrained Optimization
Optimality conditions. Convexity. Descent directions. Line search. Acceptability of step sizes. General minimization algorithm. Gradient method. Rate of convergence. Newton's method. Factorizations to ensure convergence. Weighted least squares. Introduction to AMPL. The NEOS solver site. Objectives: 3, 1, 2
Theory
17.3h
Problems
0h
Laboratory
0h
Guided learning
0h
Autonomous learning
33h
Constrained Optimization and Support Vector Machines
- Introduction to Support Vector Machines (SVM). - KKT optimality conditions of constrained optimization. Optimality conditions of SVM. - Duality in optimization. The dual of the SVM. Objectives: 3, 1, 2
Theory
17.4h
Problems
0h
Laboratory
0h
Guided learning
0h
Autonomous learning
33h
Integer Programming
- Modelling problems with binary variables. - The branch-and-bound algorithm for integer programming. - Gomory's cutting-planes algorithm. - Minimum spanning tree problem and the algorithms of Kruskal and Prim. Objectives: 3, 1
Theory
17.3h
Problems
0h
Laboratory
0h
Guided learning
0h
Autonomous learning
33h
Teaching methodology
The students will have all the course material available. About two thirds of lecture time will be devoted to optimization algorithms and their properties; the rest will be devoted to presenting and solving exercises and problems.
Lab sessions will be devoted to the solution of some data science applications (neural networks, support vector machines, clustering) using optimization methods.
Evaluation methodology
Each of the three parts of the course (unconstrained optimization, constrained optimization, integer optimization) has a weight of 1/3 in the final mark, and is evaluated as 40% theory and 60% practical assignment. The evaluation activities are:
- Theory (40%). There will be two midterm exams (there is no final exam). The first midterm exam (2/3 of the 40%) will consist of practical exercises on the first two parts of the course, and will take place before January. The second midterm exam (1/3 of the 40%) will cover the third part of the course, and will take place in January.
- Practical assignments (60%). There will be 3 lab assignments, one for each part of the course, all with the same weight in the final mark.
Additional exercises provided during the lectures may be taken into account to settle or to raise the final mark.
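The weighting above can be written out as a short computation. This is only an illustration of the stated weights (the function name, the 0-10 scale, and the example marks are assumptions, not part of the course guide): per part, the mark is 0.4·theory + 0.6·lab, and averaging the three parts is equivalent to weighting midterm 1 by 2/3 and midterm 2 by 1/3 of the theory component.

```python
def final_mark(midterm1, midterm2, labs):
    """Final mark under the stated weighting (hypothetical helper).

    Theory (40% overall): midterm1 counts 2/3 of it, midterm2 1/3.
    Practice (60% overall): three lab assignments, equally weighted.
    Marks assumed on a 0-10 scale.
    """
    theory = (2 / 3) * midterm1 + (1 / 3) * midterm2
    practice = sum(labs) / len(labs)
    return 0.4 * theory + 0.6 * practice

# Example: midterms 6 and 9, lab assignments 8, 7 and 9
mark = final_mark(6.0, 9.0, [8.0, 7.0, 9.0])   # theory = 7, practice = 8
```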
Bibliography
Basic
- Nocedal, J.; Wright, S.J. Numerical Optimization. Springer, 2006. ISBN: 9780387303031.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991003178739706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Luenberger, D.G.; Ye, Y. Linear and Nonlinear Programming. Springer, 2021. ISBN: 9783030854492.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991005136979306711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Wolsey, L.A. Integer Programming. Wiley, 2021. ISBN: 9781119606536.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991005125279506711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Fourer, R.; Gay, D.M.; Kernighan, B.W. AMPL: A Modeling Language for Mathematical Programming. Thomson Brooks/Cole, 2003. ISBN: 0534388094.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991002629329706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Cristianini, N.; Shawe-Taylor, J. An Introduction to Support Vector Machines: And Other Kernel-Based Learning Methods. Cambridge University Press, 2000. ISBN: 0521780195.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991001992979706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
Web links
- Tool for self-learning LP and ILP algorithms. http://www-eio.upc.es/teaching/ple/pfc_ing.html
- NEOS server https://neos-server.org/neos/