Machine Learning

Lecturers: Feigl, T.; Löffler, C.; Mutschler, C.
Coverage: 2 SWS (5 ECTS)
Prerequisites:

Registration via e-mail to tobias.feigl@fau.de

Exam criteria:

  • 40-minute oral presentation
  • Preparation of a written elaboration covering the core items of your oral presentation (no slide copies; approx. 8 pages including references)
  • Attendance at the presentations of the other participants
  • Completion of the presentation slides no later than one week before your presentation date
  • Completion of the written elaboration by the end of the semester
Comment: Registration with topic request by e-mail before the start of lectures; presentation topics are assigned on a first-come, first-served basis.
Audience: WPF CE-BA-SEM (from the 2nd semester)
WPF CE-MA-SEM (from the 1st semester)
WPF INF-BA-SEM (from the 2nd semester)
WPF INF-MA (from the 1st semester)
WF ICT-MA (from the 1st semester)
Literature:
  • T. M. Mitchell, Machine Learning, McGraw-Hill, 1997.
  • J. R. Quinlan, C4.5: Programs for Machine Learning, Morgan Kaufmann, 1993.
  • F. V. Jensen, An Introduction to Bayesian Networks, UCL Press, 1996.
  • N. Lavrac and S. Dzeroski, Inductive Logic Programming: Techniques and Applications, Ellis Horwood, 1994.
  • J. A. Freeman, Simulating Neural Networks with Mathematica, Addison-Wesley, 1994.
  • J. Hertz, A. Krogh and R. G. Palmer, Introduction to the Theory of Neural Computation, Addison-Wesley, 1991.
  • R. Rojas, Theorie der neuronalen Netze, Springer, 1996.
  • W. Banzhaf, P. Nordin, R. E. Keller and D. Francone, Genetic Programming: An Introduction, Morgan Kaufmann and dpunkt, 1998.
  • M. Mitchell, An Introduction to Genetic Algorithms, MIT Press, 1996.
  • Z. Michalewicz, Genetic Algorithms + Data Structures = Evolution Programs, Springer, 1992.
  • C. Bishop, Pattern Recognition and Machine Learning, Springer, 2007.
  • I. Goodfellow, Y. Bengio and A. Courville, Deep Learning, MIT Press, 2016.
Topics:

This seminar introduces the field of machine learning. Machine learning addresses the question of how to construct computer programs that automatically improve with experience. The goal of the seminar is to present the key algorithms at the heart of machine learning, together with illustrative examples of how they work and the theory behind them. Classical topics include reinforcement learning (learning with reward), evolutionary algorithms, and statistical methods. From this work, established methods such as Support Vector Machines, Hidden Markov Models, and Artificial Neural Networks have emerged. The seminar provides a comprehensive insight into the world of machine learning and its algorithms.
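
To make the idea of a program that improves with experience concrete, here is a minimal illustrative sketch in Python (an assumed example, not part of the seminar materials): it trains a perceptron, one of the simplest artificial neural networks, on made-up toy data. The function name train_perceptron, the data, and all parameters are invented for illustration.

# Illustrative sketch only (assumed example, not seminar material): a minimal
# perceptron that "improves with experience" by updating its weights whenever
# it misclassifies a training example. Data and parameters are made up.
import random

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """samples: list of feature tuples; labels: +1 or -1."""
    w = [0.0] * len(samples[0])   # weight vector, one entry per feature
    b = 0.0                       # bias term
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:  # misclassified: adjust weights toward y
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

if __name__ == "__main__":
    # Toy, linearly separable data: label +1 if the first feature exceeds the second.
    random.seed(0)
    xs = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(200)]
    ys = [1 if x1 > x2 else -1 for (x1, x2) in xs]
    w, b = train_perceptron(xs, ys)
    correct = sum(1 for x, y in zip(xs, ys)
                  if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) > 0)
    print(f"training accuracy: {correct / len(xs):.2%}")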

The preliminary discussion will take place on 24.10.2019 at 14:15 in 04.150!

Presentations

Topic | Presenter | Download

0. Introduction | Tobias Feigl, Christopher Loeffler, Christopher Mutschler

Classic Algorithms I (??.??.???? in 04.150)
1. Clustering | Mathias, Tahas | -
2. Dimensionality Reduction | Yao, Tong | -
3. Ensemble Learning and Boosting | Benkert, Johannes | -
4. Genetic and Evolutionary Algorithms | Hack, Vanessa | -

Classic Algorithms II (??.??.???? in 04.150)
5. Hidden Markov Models | Kindermann, Felix | -
6. Support Vector Machines | Habibullah, Haris | -
7. Artificial Neural Networks | Ehrl, Dorothea | -
8. Convolutional Neural Networks | Fisch, Patrick | -

Classic Algorithms III (??.??.???? in 04.150)
9. Online Learning | Halm, Marita | -
10. Naïve Bayes | Biendl, Meike | -
11. Kalman and Particle Filter | Bacher, Valentin | -
12. Gaussian Processes | Cigdem, Meryem | -