Probability and Statistics

This is a grouped Moodle course. It consists of several separate courses that share learning materials, assignments, tests, etc. Below you can see information about the individual courses that make up this Moodle course.
Probability and Statistics (Main course) B0B01PST
Credits 7
Semesters Winter
Completion Assessment + Examination
Language of teaching undefined
Extent of teaching 4P+2S
Probability, Statistics, and Theory of Information A0B01PSI
Credits 6
Semesters Winter
Completion Assessment + Examination
Language of teaching Czech
Extent of teaching 4+2
Annotation
Basics of probability theory, mathematical statistics, information theory, and coding. The course covers descriptions of probability, random variables and their distributions, and characteristics of and operations with random variables. Basics of mathematical statistics: point and interval estimates, methods of parameter estimation, hypothesis testing, and the least squares method. Basic notions and results of the theory of Markov chains. Shannon entropy, mutual and conditional information.
Study targets
Basics of probability theory and their application in statistical estimates and tests.
The use of Markov chains in modeling.
Basic notions of information theory.
Course outlines
1. Basic notions of probability theory. Kolmogorov model of probability. Independence, conditional probability, Bayes formula (this and a few other key formulas are restated after this outline).
2. Random variables and their description. Random vector. Probability distribution function.
3. Quantile function. Mixture of random variables.
4. Characteristics of random variables and their properties. Operations with random variables. Basic types of distributions.
5. Characteristics of random vectors. Covariance, correlation. Chebyshev inequality. Law of large numbers. Central limit theorem.
6. Basic notions of statistics. Sample mean, sample variance. Interval estimates of mean and variance.
7. Method of moments, method of maximum likelihood. EM algorithm.
8. Hypothesis testing. Goodness-of-fit tests, tests of correlation, non-parametric tests.
9. Discrete random processes. Stationary processes. Markov chains.
10. Classification of states of Markov chains.
11. Asymptotic properties of Markov chains. Overview of applications.
12. Shannon entropy. Entropy rate of a stationary information source.
13. Fundamentals of coding. Kraft inequality. Huffman coding.
14. Mutual information, capacity of an information channel.
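For quick orientation, a few of the results named in the outline above, stated in standard textbook notation (these are the generic forms, not reproduced from the course materials):

    P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}    (Bayes formula)
    P(|X - \mathbb{E}X| \ge \varepsilon) \le \frac{\operatorname{var}(X)}{\varepsilon^2}    (Chebyshev inequality)
    H(X) = -\sum_x p(x)\,\log_2 p(x)    (Shannon entropy of a discrete X)
    \sum_i 2^{-\ell_i} \le 1    (Kraft inequality for a binary prefix code with codeword lengths \ell_i)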
Exercises outlines
1. Elementary probability.
2. Kolmogorov model of probability. Independence, conditional probability, Bayes formula.
3. Mixture of random variables. Mean. Unary operations with random variables.
4. Dispersion (variance). Random vector, joint distribution. Binary operations with random variables.
5. Sample mean, sample variance. Chebyshev inequality. Central limit theorem.
6. Interval estimates of mean and variance.
7. Method of moments, method of maximum likelihood (compared in a sketch after this list).
8. Hypothesis testing. Goodness-of-fit tests, tests of correlation, non-parametric tests.
9. Discrete random processes. Stationary processes. Markov chains.
10. Classification of states of Markov chains.
11. Asymptotic properties of Markov chains (see the sketch after this list).
12. Shannon entropy. Entropy rate of a stationary information source.
13. Fundamentals of coding. Kraft inequality. Huffman coding.
14. Mutual information, capacity of an information channel.
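To make exercise topic 7 concrete, here is a minimal Python sketch contrasting the two estimators on the model Uniform(0, theta), where they give genuinely different answers; the sample size, seed, and choice of model are illustrative assumptions, not taken from the course materials.

    import numpy as np

    rng = np.random.default_rng(0)
    theta = 4.0                               # hypothetical true parameter
    x = rng.uniform(0.0, theta, size=200)     # hypothetical sample

    # Method of moments: E X = theta / 2, so matching the sample mean
    # gives the estimate 2 * mean(x).
    theta_mom = 2.0 * x.mean()

    # Maximum likelihood: the likelihood theta**(-n) is decreasing in
    # theta and requires theta >= max(x), so it is maximized at max(x).
    theta_mle = x.max()

    print(f"method of moments:  {theta_mom:.3f}")
    print(f"maximum likelihood: {theta_mle:.3f}")

The uniform model is a common classroom choice precisely because the two methods disagree; for many other models (e.g. the exponential) they coincide.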
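Similarly, for exercise topics 9-11, a minimal sketch of computing a stationary distribution numerically, assuming NumPy is available; the 3-state transition matrix is a made-up example, not one from the course.

    import numpy as np

    # Hypothetical transition matrix of an irreducible, aperiodic chain
    # (rows sum to 1).
    P = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.2, 0.2, 0.6]])

    # The stationary distribution pi solves pi P = pi with sum(pi) = 1,
    # i.e. pi is a left eigenvector of P for eigenvalue 1.
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    pi /= pi.sum()
    print("stationary distribution:", pi)

    # Asymptotics (topic 11): for such a chain, every row of P^n
    # converges to pi as n grows.
    print("row 0 of P^50:          ", np.linalg.matrix_power(P, 50)[0])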
Literature
[1] Papoulis, A.: Probability and Statistics, Prentice-Hall, 1990.
[2] Stewart, W. J.: Probability, Markov Chains, Queues, and Simulation: The Mathematical Basis of Performance Modeling, Princeton University Press, 2009.
[3] MacKay, D. J. C.: Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003.
Requirements
Linear Algebra, Calculus, Discrete Mathematics
Probability, Statistics and Information Theory A8B01PSI
Credits 6
Semesters Winter
Completion Assessment + Examination
Language of teaching Czech
Extent of teaching 4P+2S
The annotation, study targets, course and exercise outlines, literature, and requirements of this course are identical to those of A0B01PSI listed above.
Responsible for data validity: Study Information System (KOS)