Pattern Recognition and Machine Learning

Code AE4B33RPZ
Credits 6
Semesters Winter
Completion Assessment + Examination
Language of teaching English
Extent of teaching 2P+2C
Annotation
The basic formulations of the statistical decision problem are presented. The necessary knowledge about the (statistical) relationship between observations and classes of objects is acquired by learning on the training set. The course covers both well-established and advanced classifier learning methods, such as the Perceptron, AdaBoost, Support Vector Machines, and Neural Nets.
Study targets
To teach the student to formalize statistical decision-making problems, to use machine learning techniques, and to solve pattern recognition problems with the most popular classifiers (SVM, AdaBoost, neural nets, nearest neighbour).
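As an example of the formalization referred to above, the Bayesian decision problem (lecture 2) seeks the strategy q that minimizes the expected loss (risk). In standard notation, with symbols introduced here for illustration rather than taken from the course materials:

    R(q) = \sum_{x \in X} \sum_{k \in K} p(x, k)\, W(k, q(x)),
    \qquad
    q^{*}(x) = \arg\min_{d \in D} \sum_{k \in K} p(k \mid x)\, W(k, d),

where X is the set of observations, K the set of classes (hidden states), D the set of decisions, W(k, d) the loss incurred by deciding d when the true class is k, and q: X -> D a decision strategy.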
Course outlines
1. The pattern recognition problem. Overview of the course. Basic notions.
2. The Bayesian decision-making problem, i.e. minimization of expected loss.
3. Non-Bayesian decision problems.
4. Parameter estimation. The maximum likelihood method.
5. The nearest neighbour classifier.
6. Linear classifiers. Perceptron learning (a minimal sketch follows this list).
7. The AdaBoost method.
8. Learning as a quadratic optimization problem. SVM classifiers.
9. Feed-forward neural nets. The backpropagation algorithm.
10. Decision trees.
11. Logistic regression.
12. The EM (Expectation Maximization) algorithm.
13. Sequential decision-making (Wald's sequential test).
14. Recap.
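A minimal sketch of the perceptron learning rule from lecture 6, written in Python/NumPy rather than the MATLAB/STPR environment used in the exercises; the toy data and the function name perceptron_train are illustrative assumptions, not course material.

    import numpy as np

    def perceptron_train(X, y, max_epochs=100):
        """Perceptron learning: X is (n_samples, n_features), labels y in {-1, +1}.
        Returns a weight vector w (bias appended) that separates the classes,
        provided the training data are linearly separable."""
        Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append a constant bias feature
        w = np.zeros(Xb.shape[1])
        for _ in range(max_epochs):
            errors = 0
            for xi, yi in zip(Xb, y):
                if yi * np.dot(w, xi) <= 0:   # misclassified (or on the boundary)
                    w += yi * xi              # perceptron update rule
                    errors += 1
            if errors == 0:                   # converged: every point classified correctly
                break
        return w

    # Toy, linearly separable 2-D data (illustrative only)
    X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
    y = np.array([1, 1, -1, -1])
    w = perceptron_train(X, y)
    pred = np.sign(np.hstack([X, np.ones((4, 1))]) @ w)
    print(w, pred)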
Exercises outlines
Students solve four or five pattern recognition problems, for instance a simplified version of OCR (optical character recognition), face detection, or spam detection, using either classical methods or trained classifiers (a minimal nearest-neighbour sketch follows the list below).
1. Introduction to MATLAB and the STPR toolbox, a simple recognition experiment.
2. The Bayes recognition problem.
3. Non-Bayesian problems I: the Neyman-Pearson problem.
4. Non-Bayesian problems II: the minimax problem.
5. Maximum likelihood estimates.
6. Non-parametric estimates, Parzen windows.
7. Linear classifiers, the perceptron algorithm.
8. AdaBoost.
9. Support Vector Machines I.
10. Support Vector Machines II.
11. The EM algorithm I.
12. The EM algorithm II.
13. Submission of reports. Discussion of results.
14. Submission of reports. Discussion of results.
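A minimal sketch of a nearest-neighbour classifier of the kind that can serve as a baseline in such recognition experiments; it is written in Python/NumPy rather than the MATLAB/STPR toolbox used in the labs, and the toy feature vectors and labels are invented for illustration.

    import numpy as np

    def nearest_neighbour_classify(X_train, y_train, X_test):
        """1-NN classifier: each test point receives the label of its closest
        training point under the Euclidean distance."""
        # Pairwise squared Euclidean distances, shape (n_test, n_train)
        d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
        nearest = d2.argmin(axis=1)
        return y_train[nearest]

    # Toy data: 2-D feature vectors for two classes (e.g. two characters in a
    # simplified OCR setting); values are illustrative only
    X_train = np.array([[0.1, 0.9], [0.2, 0.8], [0.9, 0.1], [0.8, 0.2]])
    y_train = np.array([0, 0, 1, 1])
    X_test = np.array([[0.15, 0.85], [0.85, 0.15]])
    print(nearest_neighbour_classify(X_train, y_train, X_test))  # -> [0 1]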
Literature
1. Duda, Hart, Stork: Pattern Classification, 2001.
2. Bishop: Pattern Recognition and Machine Learning, 2006.
3. Schlesinger, Hlavac: Ten Lectures on Statistical and Structural Pattern Recognition, 2002.
Requirements
Knowledge of linear algebra, mathematical analysis, and probability and statistics.