Markov chains
Using this distribution as a prior, compute the posterior distribution after running an experiment and observing some outcomes and not others.
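The update described above (prior times likelihood, renormalized) can be sketched as follows. This is a minimal illustration, not the exercise's intended solution: the original prior distribution is not shown here, so the code assumes a hypothetical uniform prior over three coin-bias hypotheses, and a hypothetical experiment in which heads is observed twice and tails once.

```python
def update(prior, likelihood):
    """Bayesian update for a discrete prior.

    prior      -- dict mapping each hypothesis to its prior probability
    likelihood -- function giving P(observed data | hypothesis)
    Returns the posterior: prior * likelihood, renormalized to sum to 1.
    """
    unnormalized = {h: p * likelihood(h) for h, p in prior.items()}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Hypothetical setup (an assumption, not from the exercise text):
# a coin whose heads probability h is one of three values, uniform prior.
biases = [0.25, 0.5, 0.75]
prior = {h: 1 / len(biases) for h in biases}

# Hypothetical experiment: three flips, observing heads twice and tails
# once -- i.e. some outcomes occurred and others did not.
posterior = update(prior, lambda h: h**2 * (1 - h))
```

After the update, hypotheses that make the observed data more likely gain probability mass; here the posterior favors the higher bias values because heads occurred more often than tails.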