Stochastic Control and Dynamic Asset Allocation (SCDAA), 2019-20
Useful to know
The course lecturer and organiser is David Siska, School of Mathematics, Room 4611.
This is a Semester 2 course.
The course timetable is here: timetable.
The course details are here: course details.
Partial lecture notes
The lecture notes can be downloaded here (last update 14th February 2020).
Inevitably, there are mistakes in the lecture notes.
Please report them to me! I will keep track of who reported how many mistakes, and the "winner" will be announced after the exam. If anything is not clear, please ask.
Past exam papers
The course assessment is 100% exam.
The exam takes place in May and all exam questions count towards the final mark.
Books and other sources
It is recommended that you read, understand, and tackle the exercises in the relevant parts of:
Outline (expected; we'll see how fast we go):
- Week 1: Introductory examples.
- Week 2: Discrete space and time (controlled Markov chains), Bellman principle and equation.
- Week 3: Value and policy iteration, Q-learning. Download the Qlearning-example.ipynb Python notebook for the simple example discussed.
- Week 4: Controlled SDEs, existence, uniqueness, Markov and flow properties.
- Week 5: Bellman Principle / DPP for controlled SDEs.
- Flexible learning week.
- Week 6: Bellman PDE / HJB equation.
- Week 7: HJB verification theorem, solving the Merton problem, solving linear-quadratic control problem.
- Week 8: BSDEs - examples, existence, uniqueness.
- Week 9: Pontryagin's optimality principle.
- Week 10: Solving the minimum-variance-for-a-given-return problem.
- Week 11: Revision.
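For a flavour of the Week 3 material, the following is a minimal tabular Q-learning sketch on a made-up 5-state chain MDP. It is an illustration only, not the course's Qlearning-example.ipynb; the environment, rewards, and parameters are all assumptions chosen for simplicity.

```python
import random

# Toy chain MDP (hypothetical): states 0..4, state 4 is the goal.
# Actions: 0 = step left, 1 = step right. Reward 1 on reaching the goal.
n_states = 5
actions = [0, 1]
gamma, alpha, eps = 0.9, 0.5, 0.1  # discount, learning rate, exploration

def step(s, a):
    """Deterministic dynamics: move one state left or right along the chain."""
    s2 = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
    r = 1.0 if s2 == n_states - 1 else 0.0
    done = s2 == n_states - 1
    return s2, r, done

Q = [[0.0, 0.0] for _ in range(n_states)]  # tabular Q-values Q[state][action]
random.seed(0)

for episode in range(500):
    s, done = 0, False
    while not done:
        # epsilon-greedy action selection
        if random.random() < eps:
            a = random.choice(actions)
        else:
            a = max(actions, key=lambda a: Q[s][a])
        s2, r, done = step(s, a)
        # Q-learning update: move Q(s,a) towards r + gamma * max_a' Q(s',a')
        target = r + (0.0 if done else gamma * max(Q[s2]))
        Q[s][a] += alpha * (target - Q[s][a])
        s = s2

# Greedy policy for the non-terminal states; "go right" is optimal here.
policy = [max(actions, key=lambda a: Q[s][a]) for s in range(n_states - 1)]
print(policy)
```

The update rule is exactly the one-step Bellman optimality target applied to sampled transitions, which is why the topic sits directly after the Bellman principle and value/policy iteration in the outline.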