Technische Universität Ilmenau

Data-Driven Optimization for Machine Learning Applications - Modultafeln of TU Ilmenau

module properties: Data-Driven Optimization for Machine Learning Applications in degree program Master Ingenieurinformatik 2014
module number: 200135
examination number: 220491
department: Department of Computer Science and Automation
ID of group: 2212 (Simulation and Optimal Processes)
module leader: Prof. Dr. Pu Li
term: summer term only
language: English
credit points: 5
on-campus program (h): 45
self-study (h): 105
obligation: elective module
exam: examination performance with multiple performances
details of the certificate: The module Data-Driven Optimization for Machine Learning Applications with examination number 220491 concludes with the following performances:
  • alternative examination performance taken during the semester with a weighting of 30% (examination number: 2200829)
  • oral examination of 30 minutes with a weighting of 70% (examination number: 2200830)

Details of partial performance 1:

Programming assignments as a take-home project

signup details for alternative examinations: Registration for the alternative examination taken during the semester is done via the examination administration system (thoska) outside the central examination registration period. The earliest registration is generally possible about 2-3 weeks after the start of the semester. The latest date for registering for or withdrawing from this specific examination is set to (if no date is given, it will appear here shortly):
maximum number of participants
previous knowledge and experience

 BSc level. Basic linear algebra and computer programming skills are advantageous.

learning outcome

The students know and can explain

  • basic model-driven, model-driven data-augmented, and data-driven optimization
  • numerical linear algebra methods for machine learning
  • convexity and regularization of functions
  • non-negative matrix factorization and application
  • modern mathematical optimization algorithms for pattern recognition and classification
  • modern mathematical optimization algorithms for neural-network-based modeling.

They can implement

  • optimization algorithms for linear and nonlinear regressions
  • quadratic programming methods for support vector machines
  • optimization algorithms for non-negative matrix factorization, pattern recognition, and applications
  • and can evaluate various optimization algorithms for neural-network-based modeling and applications

The students learn the theory, models, methods, and algorithms of the corresponding subjects in the lectures. In the exercises, they actively solve example tasks. In project tasks, they analyze, solve, and evaluate programming problems.

content

1. Introduction - motivation; data-driven versus model-driven approaches; importance of data-driven optimization; overview of optimization problems arising in machine learning applications;

2. Preliminaries - linear algebra; convex sets and convex functions; gradient, sub-gradient, Hessian matrix;
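
As an illustration of these preliminaries, the short NumPy sketch below (an illustrative example, not part of the official course materials) verifies the gradient of a simple convex quadratic by finite differences; the matrix Q and vector b are chosen arbitrarily for the example.

    # A minimal sketch: the analytic gradient of the convex quadratic
    # f(x) = 0.5 x'Qx - b'x is Q x - b, and its Hessian is the constant matrix Q.
    import numpy as np

    Q = np.array([[3.0, 1.0], [1.0, 2.0]])    # symmetric positive definite, so f is convex
    b = np.array([1.0, -1.0])

    def f(x):
        return 0.5 * x @ Q @ x - b @ x

    def grad(x):
        return Q @ x - b                      # analytic gradient

    x0 = np.array([0.5, 0.5])
    eps = 1e-6
    num_grad = np.array([(f(x0 + eps * e) - f(x0 - eps * e)) / (2 * eps) for e in np.eye(2)])
    print(grad(x0), num_grad)                 # the two gradients agree up to O(eps^2)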

3. Programming basics (Python, R, Matlab); data loading and preprocessing; 
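
The following minimal Python sketch shows a typical data loading and preprocessing step of the kind referred to here, using pandas and scikit-learn; the file name data.csv and the column name target are placeholders, not actual course data.

    # Minimal data loading and preprocessing sketch; "data.csv" and the column
    # "target" are hypothetical placeholders used only for illustration.
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler

    df = pd.read_csv("data.csv")                    # load the raw data set
    X = df.drop(columns=["target"]).to_numpy()      # feature matrix
    y = df["target"].to_numpy()                     # labels / regression targets

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

    scaler = StandardScaler().fit(X_train)          # standardize features: zero mean, unit variance
    X_train = scaler.transform(X_train)
    X_test = scaler.transform(X_test)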

4. Unconstrained optimization for machine learning: regularization - meaning and relevance; regression problems; neural networks and back-propagation of errors; optimization methods for deep learning;
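
As a sketch of how regularization enters such an unconstrained problem, the example below fits a ridge-regularized linear regression by plain gradient descent on synthetic data; the step size, regularization weight, and data are illustrative assumptions.

    # Ridge regression min_w 0.5/n ||Xw - y||^2 + 0.5*lam*||w||^2 solved by gradient descent;
    # all data and hyperparameters below are synthetic and illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
    y = X @ w_true + 0.1 * rng.normal(size=100)

    lam, lr = 0.1, 0.01                              # regularization weight and step size
    w = np.zeros(5)
    for _ in range(2000):
        grad = X.T @ (X @ w - y) / len(y) + lam * w  # gradient of the regularized objective
        w -= lr * grad

    print(np.round(w, 2))                            # close to w_true, slightly shrunk by the penalty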

5. Unconstrained Optimization Algorithms; 5A: First-order algorithms - gradient descent, accelerated gradient descent, stochastic gradient descent, conjugate gradient methods, coordinate descent; R and Python implementations; sub-gradient methods (optional); 5B: Second-order algorithms - the Newton method; quasi-Newton methods; L-BFGS; R and Python implementations;
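
The sketch below contrasts a first-order method (gradient descent with a fixed step size) with a quasi-Newton method (SciPy's L-BFGS-B implementation) on a simple convex test function; the function and all parameters are illustrative assumptions.

    # Gradient descent vs. L-BFGS on f(x) = (x1 - 1)^2 + 10*(x2 + 2)^2; illustrative only.
    import numpy as np
    from scipy.optimize import minimize

    def f(x):
        return (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2

    def grad(x):
        return np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])

    # first-order: gradient descent with a fixed step size
    x = np.zeros(2)
    for _ in range(500):
        x -= 0.05 * grad(x)

    # quasi-Newton: limited-memory BFGS from SciPy
    res = minimize(f, np.zeros(2), jac=grad, method="L-BFGS-B")
    print(x, res.x)                                  # both approach the minimizer (1, -2)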

6. Constrained Optimization Methods for Machine Learning - the interior point method; face recognition with support vector machines using Python, Scikit-Learn, and OpenCV; matrix factorization methods for pattern recognition - SVD, PCA, non-negative matrix factorization (NMF); Matlab and Python Scikit-Learn implementations; proximal-point algorithms - proximal gradient methods; alternating direction method of multipliers (ADMM);
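
As one concrete example from this chapter, the sketch below computes a non-negative matrix factorization with scikit-learn; the random non-negative data matrix and the chosen rank are illustrative assumptions, not course material.

    # Rank-3 NMF V ≈ W H with non-negative factors, using scikit-learn; illustrative data only.
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    V = rng.random((20, 10))                         # non-negative data matrix (e.g. image patches)

    model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(V)                       # 20 x 3 non-negative coefficients
    H = model.components_                            # 3 x 10 non-negative basis vectors

    print(np.linalg.norm(V - W @ H))                 # reconstruction error of the factorization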

7. Bayesian Optimization methods for Machine Learning;
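
A minimal sketch of the underlying idea: a Gaussian-process surrogate (scikit-learn) is fitted to a few evaluations of an expensive objective, and an expected-improvement acquisition proposes the next evaluation point; the 1-D objective, search grid, and evaluation budget are illustrative assumptions.

    # Simple Bayesian optimization loop with a GP surrogate and expected improvement;
    # the objective and all settings are illustrative assumptions.
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def objective(x):                                # "expensive" black-box function to minimize
        return np.sin(3 * x) + 0.5 * x ** 2

    grid = np.linspace(-3, 3, 400).reshape(-1, 1)    # candidate points
    X = np.array([[-2.0], [0.0], [2.0]])             # initial evaluations
    y = objective(X).ravel()

    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6, normalize_y=True)
    for _ in range(10):
        gp.fit(X, y)
        mu, sigma = gp.predict(grid, return_std=True)
        z = (y.min() - mu) / np.maximum(sigma, 1e-9)
        ei = (y.min() - mu) * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
        x_next = grid[np.argmax(ei)]                 # most promising candidate
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next[0]))

    print(X[np.argmin(y)], y.min())                  # approximate minimizer and minimum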

8. Optimization algorithms in the deep learning tools TensorFlow, Keras, and PyTorch
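
To illustrate how such framework optimizers are used, the sketch below runs a small PyTorch training loop with torch.optim.Adam on a synthetic linear regression task; the model, data, and hyperparameters are illustrative assumptions.

    # Minimal PyTorch training loop; Adam could be replaced by SGD, RMSprop, etc.
    import torch

    X = torch.randn(256, 4)                          # synthetic inputs
    w_true = torch.tensor([1.0, -2.0, 0.5, 3.0])
    y = X @ w_true + 0.05 * torch.randn(256)         # noisy linear targets

    model = torch.nn.Linear(4, 1)
    loss_fn = torch.nn.MSELoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=0.05)

    for epoch in range(500):
        optimizer.zero_grad()                        # reset accumulated gradients
        loss = loss_fn(model(X).squeeze(-1), y)
        loss.backward()                              # back-propagation of errors
        optimizer.step()                             # one optimizer update

    print(model.weight.data)                         # approaches w_true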

media of instruction

Lecture Slides, PC Pools, Machine Learning Tools and Libraries

literature / references

Bottou, Léon; Curtis, Frank E.; Nocedal, Jorge: Optimization Methods for Large-Scale Machine Learning. SIAM Review, 60(2), 223-311, 2018.
Emrouznejad, Ali (ed.): Big Data Optimization: Recent Developments and Challenges. Studies in Big Data, Vol. 18, Springer, 2016.
Géron, Aurélien: Hands-On Machine Learning with Scikit-Learn, Keras & TensorFlow, 2nd ed. O'Reilly, 2019.
Goodfellow, Ian; Bengio, Yoshua; Courville, Aaron: Deep Learning. The MIT Press, 2017.

evaluation of teaching