## 11: Markov Chains

Modern probability theory studies chance processes for which knowledge of previous outcomes influences predictions for future experiments. In principle, when we observe a sequence of chance experiments, all of the past outcomes could influence our predictions for the next experiment. In a Markov process, however, the outcome of a given experiment can affect the outcome of the next experiment, but earlier outcomes carry no additional information: the prediction for the next experiment depends only on the most recent outcome. A process with this property is called a Markov chain.
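The defining property above can be sketched in code: to sample the next state, we need only the current state, not the full history. The following is a minimal illustration using a hypothetical two-state "weather" chain (the states, transition probabilities, and function names are invented for this sketch, not taken from the text).

```python
import random

# Hypothetical two-state chain: each day is "sunny" or "rainy".
# transition[s] is the probability that tomorrow is sunny, given state s today.
transition = {"sunny": 0.9, "rainy": 0.5}

def next_state(current, rng):
    """Sample tomorrow's state from today's state alone (the Markov property)."""
    return "sunny" if rng.random() < transition[current] else "rainy"

def simulate(start, days, seed=0):
    """Run the chain for `days` steps; each step looks only at the last state."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(days):
        states.append(next_state(states[-1], rng))
    return states

print(simulate("sunny", 10))
```

Note that `next_state` receives only `states[-1]`: the rest of the history is never consulted, which is exactly the restriction that makes this a Markov chain rather than a general dependent process.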