M3: Probability (2019-2020)

Dr James Martin
Course Lecture Information: 16 lectures

Course Overview: 

An understanding of random phenomena is becoming increasingly important in the social and political sciences, finance, the life sciences and many other fields. The aim of this introduction to probability is to develop the concept of chance in a mathematical framework. Random variables are introduced, with examples involving most of the common distributions.

Learning Outcomes: 

Students should have a knowledge and understanding of basic probability concepts, including conditional probability. They should know what is meant by a random variable, and have met the common distributions. They should understand the concepts of expectation and variance of a random variable. A key concept is that of independence, which will be introduced for events and random variables.

Course Synopsis: 

Sample space, events, probability measure. Permutations and combinations, sampling with or without replacement. Conditional probability, partitions of the sample space, law of total probability, Bayes' Theorem. Independence.
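The law of total probability and Bayes' Theorem from this part of the syllabus can be illustrated with a small numerical sketch. The scenario and numbers below (a diagnostic test with assumed prevalence, sensitivity and false-positive rate) are invented for illustration and are not taken from the course.

```python
# Illustrative sketch of Bayes' Theorem via the law of total probability.
# All numbers are hypothetical, chosen only to make the calculation concrete.

def bayes_posterior(prior, likelihood, likelihood_complement):
    """P(A | B) = P(B | A) P(A) / [P(B | A) P(A) + P(B | A^c) P(A^c)]."""
    evidence = likelihood * prior + likelihood_complement * (1 - prior)
    return likelihood * prior / evidence

# Assumed: P(disease) = 0.01, P(+ | disease) = 0.99, P(+ | no disease) = 0.05
posterior = bayes_posterior(0.01, 0.99, 0.05)
# posterior is about 0.167: even a good test gives a modest posterior
# when the prior (prevalence) is small.
```

The denominator is exactly the law of total probability applied to the partition {A, A^c} of the sample space.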

Discrete random variables, probability mass functions, examples: Bernoulli, binomial, Poisson, geometric. Expectation, expectation of a function of a discrete random variable, variance. Joint distributions of several discrete random variables. Marginal and conditional distributions. Independence. Conditional expectation, law of total probability for expectations. Expectations of functions of more than one discrete random variable, covariance, variance of a sum of dependent discrete random variables.
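The identity Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) for dependent discrete random variables can be checked by exact enumeration over a joint probability mass function. The joint pmf below is a made-up example of a dependent pair, not one from the course.

```python
# Sketch: verifying Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X,Y) by exact
# enumeration over a small hypothetical joint pmf of dependent X and Y.

joint = {  # p(x, y) for an assumed dependent pair taking values in {0, 1}
    (0, 0): 0.3, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.5,
}

def E(f):
    """Expectation of f(X, Y) under the joint pmf."""
    return sum(p * f(x, y) for (x, y), p in joint.items())

var_x = E(lambda x, y: x**2) - E(lambda x, y: x)**2
var_y = E(lambda x, y: y**2) - E(lambda x, y: y)**2
cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
var_sum = E(lambda x, y: (x + y)**2) - E(lambda x, y: x + y)**2
# var_sum equals var_x + var_y + 2*cov (here Cov > 0, so the sum has
# larger variance than it would under independence).
```

Marginal pmfs fall out of the same table by summing over the other variable, which is the "marginal distribution" notion in the synopsis.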

Solution of first and second order linear difference equations. Random walks (finite state space only).
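The connection between the two topics in this paragraph is that hitting probabilities of a random walk on a finite state space satisfy a second-order linear difference equation. The sketch below uses the standard closed-form solution for the gambler's-ruin problem; the parameter values are arbitrary illustrations.

```python
# Sketch: for a simple random walk on {0, ..., N} with up-probability p,
# h(k) = P(hit N before 0 | start at k) satisfies the second-order linear
# difference equation
#     h(k) = p*h(k+1) + (1-p)*h(k-1),   h(0) = 0,  h(N) = 1.
# For p != 1/2 the standard solution is h(k) = ((q/p)^k - 1)/((q/p)^N - 1).

def hit_prob(k, N, p):
    q = 1 - p
    if p == q:
        return k / N              # symmetric walk: h(k) = k / N
    r = q / p
    return (r**k - 1) / (r**N - 1)

# Internal check: the difference equation holds at every interior state.
N, p = 10, 0.6
for k in range(1, N):
    lhs = hit_prob(k, N, p)
    rhs = p * hit_prob(k + 1, N, p) + (1 - p) * hit_prob(k - 1, N, p)
    assert abs(lhs - rhs) < 1e-12
```

Solving the difference equation by trying h(k) = r^k and fitting the boundary conditions is exactly the method the first sentence of this paragraph refers to.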

Probability generating functions, use in calculating expectations. Examples including random sums and branching processes.
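A probability generating function is G_X(s) = E[s^X], and expectations follow from derivatives at s = 1: E[X] = G'(1). The sketch below builds the PGF of a binomial variable directly from its pmf and recovers the mean numerically; the finite-difference step and the parameters are illustrative choices, not course material.

```python
# Sketch: the PGF of X ~ Binomial(n, p), built termwise from the pmf,
#     G_X(s) = sum_k P(X = k) s^k  =  (1 - p + p s)^n,
# and the identity E[X] = G'(1), approximated by a central difference.

from math import comb

def pgf_binomial(s, n, p):
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) * s**k
               for k in range(n + 1))

n, p, h = 8, 0.3, 1e-6
mean_from_pgf = (pgf_binomial(1 + h, n, p) - pgf_binomial(1 - h, n, p)) / (2 * h)
# mean_from_pgf is close to E[X] = n*p = 2.4; also G_X(1) = 1 always.
```

For a random sum S = X_1 + ... + X_N with N independent of the X_i, the same machinery gives G_S(s) = G_N(G_X(s)), which is the composition used for branching processes.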

Continuous random variables, cumulative distribution functions, probability density functions, examples: uniform, exponential, gamma, normal. Expectation, expectation of a function of a continuous random variable, variance. Distribution of a function of a single continuous random variable. Joint probability density functions of several continuous random variables (rectangular regions only). Marginal distributions. Independence. Expectations of functions of jointly continuous random variables, covariance, variance of a sum of dependent jointly continuous random variables.
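For continuous random variables, expectations are integrals against the density. As an illustrative sketch (the rate parameter and the crude Riemann-sum quadrature are assumptions for the example), the exponential distribution's mean E[X] = 1/rate can be recovered numerically from its pdf.

```python
# Sketch: X ~ Exponential(rate) has pdf f(x) = rate * e^(-rate*x) for x >= 0,
# cdf F(x) = 1 - e^(-rate*x), and E[X] = 1/rate. Here the integral
# E[X] = ∫ x f(x) dx is approximated by a simple Riemann sum.

from math import exp

def exponential_pdf(x, rate):
    return rate * exp(-rate * x) if x >= 0 else 0.0

rate, dx = 2.0, 1e-4
mean = sum(x * exponential_pdf(x, rate) * dx
           for x in (i * dx for i in range(int(50 / dx))))
# mean is close to 1/rate = 0.5 (the tail beyond x = 50 is negligible).
```

The cumulative distribution function arises the same way, as F(x) = ∫ up to x of the density, linking the two objects named at the start of this paragraph.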

Random sample, sums of independent random variables. Markov's inequality, Chebyshev's inequality, Weak Law of Large Numbers.
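Chebyshev's inequality bounds P(|sample mean - mu| >= eps) by Var(X)/(n * eps^2), and the Weak Law of Large Numbers follows by letting n grow. The simulation below (sample sizes, eps and the uniform distribution are all illustrative choices) compares the empirical frequency of large deviations with the Chebyshev bound.

```python
# Sketch: the WLLN in simulation. Sample means of n i.i.d. Uniform(0,1)
# variables concentrate around the true mean 1/2; Chebyshev's inequality
# gives P(|mean - 1/2| >= eps) <= Var(X) / (n * eps^2), with Var(X) = 1/12.

import random

random.seed(0)  # fixed seed so the run is reproducible

def sample_mean(n):
    return sum(random.random() for _ in range(n)) / n

n, eps, trials = 1000, 0.05, 200
freq_far = sum(abs(sample_mean(n) - 0.5) >= eps
               for _ in range(trials)) / trials
chebyshev_bound = (1 / 12) / (n * eps**2)
# freq_far stays below chebyshev_bound; in fact for this n the true
# deviation probability is far smaller than the bound, which is typical --
# Chebyshev is crude but holds with no distributional assumptions.
```

Markov's inequality applied to (sample mean - mu)^2 is exactly how Chebyshev's inequality is derived, which is the chain of ideas in this paragraph.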

Reading List: 

1) G. R. Grimmett and D. J. A. Welsh, Probability: An Introduction (Oxford University Press, 1986), Chapters 1--4, 5.1--5.4, 5.6, 6.1, 6.2, 6.3 (parts of), 7.1--7.3, 10.4.

2) J. Pitman, Probability (Springer-Verlag, 1993).

3) S. Ross, A First Course in Probability (Prentice-Hall, 1994).

4) D. Stirzaker, Elementary Probability (Cambridge University Press, 1994), Chapters 1--4, 5.1--5.6, 6.1--6.3, 7.1, 7.2, 7.4, 8.1, 8.3, 8.5 (excluding the joint generating function).

Please note that e-book versions of many books in the reading lists can be found on SOLO and ORLO.