The course introduces the basic concepts of optimization and the different types of optimization problems, the iterative algorithms to solve these problems, and their properties. The practice of optimization using modeling languages to describe a problem and commercial and publicly available solvers is also emphasized.
Teachers
Person in charge
Jordi Castro Pérez
Others
F. Javier Heredia Cervera
Weekly hours
Theory
3.2
Problems
0
Laboratory
0
Guided learning
0.777
Autonomous learning
52
Objectives
Discerning whether a problem is an optimization problem, identifying its type, and having a basic knowledge of optimization algorithms
Related competences: CEE5.3, CEE3.2, CTR2, CTR6
Formulating optimization problems and representing them through a modeling language
Related competences: CEE5.3, CG1, CG3, CEE3.2, CEE3.3, CB9, CEC1, CEC2
Choosing an adequate solver type for a given problem
Related competences: CEE5.3, CEE3.2, CEE3.3
Using publicly available and commercial solvers. Interpreting solver results and communicating optimization results in writing
Related competences: CEE5.3, CTR6
Contents
Unconstrained Optimization
Optimality conditions. Convexity. Descent directions.
Line search. Acceptability of step sizes.
General minimization algorithm.
Gradient method. Rate of convergence.
Newton's method. Factorizations to ensure convergence.
Weighted least squares.
Introduction to AMPL. The NEOS solver site.
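The unconstrained scheme listed above (descent direction, line search with an acceptability condition on the step size) can be sketched in a few lines of Python. This is an illustrative sketch, not course material: the test function, tolerances, and constants (c1, rho) are assumptions chosen for the example.

```python
# Minimal sketch (not course material): gradient method with a
# backtracking line search enforcing the Armijo sufficient-decrease
# condition. f is a convex quadratic, so the iterates converge to
# its unique minimizer.

def f(x):
    # f(x, y) = (x - 1)^2 + 10 * (y + 2)^2, minimized at (1, -2)
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2

def grad(x):
    return [2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)]

def gradient_method(x, tol=1e-8, max_iters=10_000, c1=1e-4, rho=0.5):
    for _ in range(max_iters):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:    # stop on small gradient
            break
        d = [-gi for gi in g]                        # descent direction
        alpha, fx = 1.0, f(x)
        slope = sum(gi * di for gi, di in zip(g, d)) # directional derivative
        # Backtrack until the Armijo (acceptability) condition holds.
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + c1 * alpha * slope:
            alpha *= rho
        x = [xi + alpha * di for xi, di in zip(x, d)]
    return x

x_star = gradient_method([0.0, 0.0])
print(x_star)  # close to the minimizer (1, -2)
```

Replacing the direction d with the Newton direction (the gradient preconditioned by the inverse Hessian) turns the same loop into Newton's method, whose faster rate of convergence is one of the topics of this part.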
Constrained Optimization and Support Vector Machines.
- Introduction to Support Vector Machines (SVM)
- KKT Optimality conditions of constrained optimization. Optimality conditions of SVM.
- Duality in Optimization. The dual of the SVM.
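The KKT conditions listed above can be checked by hand on a toy equality-constrained problem (an illustrative assumption for this sketch, much smaller than the SVM itself): minimize x^2 + y^2 subject to x + y = 1.

```python
# Toy illustration (not course material) of the KKT conditions for
#   min x^2 + y^2   s.t.   x + y = 1.
# Lagrangian: L(x, y, lam) = x^2 + y^2 - lam * (x + y - 1).
# Stationarity gives 2x = lam and 2y = lam, so x = y; feasibility
# x + y = 1 then gives x = y = 1/2 and lam = 1. We verify numerically.

def kkt_residuals(x, y, lam):
    return (
        2.0 * x - lam,   # dL/dx = 0  (stationarity)
        2.0 * y - lam,   # dL/dy = 0  (stationarity)
        x + y - 1.0,     # primal feasibility
    )

x, y, lam = 0.5, 0.5, 1.0
residuals = kkt_residuals(x, y, lam)
print(residuals)  # (0.0, 0.0, 0.0): the KKT system holds
```

For the SVM, the same machinery (stationarity of the Lagrangian plus complementarity on the inequality constraints) is what produces the dual problem studied in this part.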
Integer Programming
- Modelling problems with binary variables.
- The branch and bound algorithm for integer programming
- Gomory's cutting planes algorithm.
- Minimum spanning tree problem and the algorithms of Kruskal and Prim.
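Of the two minimum-spanning-tree algorithms listed, Kruskal's is the shorter to sketch: sort the edges by weight and add each edge unless it would close a cycle, detected with a union-find structure. The graph below is an invented 4-node example.

```python
# Minimal sketch (not course material) of Kruskal's algorithm for the
# minimum spanning tree, using a union-find (disjoint-set) structure
# to detect cycles.

def kruskal(n, edges):
    """n: number of nodes (0..n-1); edges: list of (weight, u, v)."""
    parent = list(range(n))

    def find(a):                      # root of a's component, with path halving
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    tree, total = [], 0
    for w, u, v in sorted(edges):     # cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                  # different components: edge closes no cycle
            parent[ru] = rv
            tree.append((u, v, w))
            total += w
    return total, tree

# Example graph: the MST uses the edges of weight 1, 2 and 3 (total 6).
edges = [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 0, 3), (5, 1, 3)]
total, tree = kruskal(4, edges)
print(total)  # 6
```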
Activities
Unconstrained Optimization
Optimality conditions. Convexity. Descent directions.
Line search. Acceptability of step sizes.
General minimization algorithm.
Gradient method. Rate of convergence.
Newton's method. Factorizations to ensure convergence.
Weighted least squares.
Introduction to AMPL. The NEOS solver site.
Objectives: 1, 2, 3, 4
Theory
17.3h
Problems
0h
Laboratory
0h
Guided learning
0h
Autonomous learning
35h
Constrained Optimization and Support Vector Machines
- Introduction to Support Vector Machines (SVM)
- KKT Optimality conditions of constrained optimization. Optimality conditions of SVM.
- Duality in Optimization. The dual of the SVM.
Objectives: 1, 2, 3, 4
Theory
17.4h
Problems
0h
Laboratory
0h
Guided learning
0h
Autonomous learning
35h
Integer Programming
- Modelling problems with binary variables.
- The branch and bound algorithm for integer programming
- Gomory's cutting planes algorithm.
- Minimum spanning tree problem and the algorithms of Kruskal and Prim.
Objectives: 1, 2, 3, 4
Theory
17.3h
Problems
0h
Laboratory
0h
Guided learning
0h
Autonomous learning
35h
Teaching methodology
All course material will be made available to the students.
About two thirds of lecture time will be devoted to optimization algorithms and their properties; the rest will be devoted to presenting and solving exercises and problems.
Lab sessions (using AMPL) will be devoted to the solution of some data mining applications.
Evaluation methodology
- Theory (40%). There will be a short midterm exam based on practical questions for the first two parts of the course, and some individual numerical exercises for the third (and last) part.
- Practical assignments (60%). There will be 3 lab assignments, one for each part of the course, each with the same weight in the final mark.
Additional exercises provided during the lectures may be taken into account to settle or to raise the final mark.