Modelling

Hidden Markov Model

In a hidden Markov model, the underlying Markov chain is hidden; we can only observe the values it emits. Therefore, the additional concepts of observed states and emission probabilities are explained.
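A minimal sketch of these ideas, using a hypothetical two-state weather example that is not taken from the text: the hidden Markov chain is the weather (Rainy/Sunny), and only an emitted activity is observed. All probabilities are illustrative.

```python
import numpy as np

hidden_states = ["Rainy", "Sunny"]          # hidden Markov chain
observations = ["walk", "shop", "clean"]    # observed states

initial = np.array([0.6, 0.4])              # P(first hidden state)

transition = np.array([[0.7, 0.3],          # P(next weather | current weather)
                       [0.4, 0.6]])

emission = np.array([[0.1, 0.4, 0.5],       # P(observation | hidden state)
                     [0.6, 0.3, 0.1]])

rng = np.random.default_rng(0)

def sample_sequence(length):
    """Sample a hidden state path and the observations it emits."""
    path, emitted = [], []
    state = rng.choice(len(hidden_states), p=initial)
    for _ in range(length):
        path.append(hidden_states[state])
        obs = rng.choice(len(observations), p=emission[state])
        emitted.append(observations[obs])
        state = rng.choice(len(hidden_states), p=transition[state])
    return path, emitted

hidden_path, observed = sample_sequence(5)
print("hidden:  ", hidden_path)   # never available in practice
print("observed:", observed)      # the only data we actually see
```

The point of the sketch is the separation of roles: the transition matrix governs the hidden chain, while the emission probabilities connect each hidden state to the outcomes we can observe.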

Markov Chains

A Markov chain is a succession of random events in which the probability of each event depends only on the current state, not on the full history. Using the example of chutes and ladders, the concepts of states, initial probabilities, and the transition matrix are explained.
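A minimal sketch of these three concepts, assuming a hypothetical six-square chutes-and-ladders-style board (squares 0 to 5, coin-flip moves of one or two squares, one ladder and one chute); the board layout is invented for illustration.

```python
import numpy as np

states = list(range(6))          # states: board squares 0..5
jumps = {1: 4, 3: 0}             # ladder 1 -> 4, chute 3 -> 0

# Initial probability: the game always starts on square 0.
initial = np.zeros(len(states))
initial[0] = 1.0

# Transition matrix T[i, j] = P(next square = j | current square = i).
T = np.zeros((len(states), len(states)))
for i in states:
    if i == 5:                   # final square is absorbing
        T[i, i] = 1.0
        continue
    for step in (1, 2):          # fair coin flip: move 1 or 2 squares
        j = min(i + step, 5)     # overshoot stays on the last square
        j = jumps.get(j, j)      # apply ladder or chute if landed on one
        T[i, j] += 0.5

# Each row of the transition matrix sums to 1.
assert np.allclose(T.sum(axis=1), 1.0)

# Distribution over squares after three moves: initial @ T^3.
print(initial @ np.linalg.matrix_power(T, 3))
```

The design choice that makes this a Markov chain is that the next square depends only on the current square and the coin flip, so the whole game is captured by the initial distribution and the transition matrix.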