Bayes Theorem

From My Strategy

Chapter 3 - Experiential Growth Method®

Previous page: Causal Thinking - Bayes Theorem - Next page: Complexity

Back to Book content or directly to Main Page or EGM



Welcome to the Bayes Theorem page


Why does the experience of train passengers never match the punctuality statistics of the train company? Because we update our knowledge about the world around us not in 'new-formed chunks' that can be counted statistically, but in a Bayesian way: by updating existing knowledge.

Bayes' theorem is a fundamental result in probability theory that describes the relationship between prior and posterior probabilities. A full treatment of probability theory would lead us far away from this wiki's core intent. So instead, we limit ourselves to Bayesian statistics, because it best describes how we statistically think as humans. Bayes' theorem is a way of figuring out the probability of something happening based on what we already know. It is named after Thomas Bayes, an 18th-century mathematician and minister.



Here's a simple example of how Bayes' theorem works:

Imagine you have a jar of marbles. Some of the marbles are red and some of them are green. You want to figure out the probability that a marble you pick out of the jar will be red.

You know that there are 10 marbles in the jar, and 3 of them are red. So, you can say that the probability of picking a red marble out of the jar is 3/10, or 30%.

Now, imagine that you pick a marble out of the jar and it turns out to be red. You might want to update your estimate of the probability that the next marble you pick out of the jar will be red. Bayes' theorem helps you do this: you take into account the fact that you just picked a red marble and update your estimate accordingly, which makes your predictions more accurate. With one red marble removed, 2 of the remaining 9 marbles are red, so the probability is now 2/9, or about 22%.
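The jar example above can be sketched in a few lines of Python, using the counts from the text (10 marbles, 3 red, drawing without replacement):

```python
# Sketch of the marble example: a jar of 10 marbles, 3 of them red.
# Drawing without replacement, the chance that the next marble is red
# changes once a red marble has been removed from the jar.
from fractions import Fraction

def prob_red(red, total):
    """Probability of drawing a red marble from the jar."""
    return Fraction(red, total)

prior = prob_red(3, 10)    # before any draw: 3/10, i.e. 30%
updated = prob_red(2, 9)   # after drawing one red marble: 2/9, about 22%

print(float(prior))              # 0.3
print(round(float(updated), 2))  # 0.22
```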


Bayes' Theorem is a simple mathematical formula used for calculating conditional probabilities. Bayes' theorem is particularly useful for inferring causes from their effects since it is often fairly easy to discern the probability of an effect given the presence or absence of a cause.

The objective is to discover what sorts of constraints experience tends to impose, and to explain how the person's prior opinions can be used to justify the choice of a posterior probability from among the many that might satisfy a given constraint.

This is a kind of "no jumping to conclusions" requirement. We explain it here as a natural result of the idea that rational learners should proportion their beliefs to the strength of the evidence they acquire. (1)


Deep dive


Basic formula

The probability of a hypothesis - conditional on a given body of data - is

"the ratio of the unconditional probability of the conjunction of the hypothesis with the data to the unconditional probability of the data alone."
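In symbols, this definition reads P(H|D) = P(H and D) / P(D). As a minimal sketch, it can be checked by enumerating a small sample space; the two-coin-flip space below is an assumption chosen purely for illustration:

```python
# Sketch of the quoted definition: P(H|D) = P(H and D) / P(D),
# checked by enumerating a toy sample space of two fair coin flips.
from itertools import product
from fractions import Fraction

space = list(product("HT", repeat=2))  # 4 equally likely outcomes

def prob(event):
    """Unconditional probability of an event (a set of outcomes)."""
    return Fraction(len(event), len(space))

data = {o for o in space if o[0] == "H"}       # data: first flip is heads
hyp = {o for o in space if o.count("H") == 2}  # hypothesis: both are heads

# Conditional probability via the ratio definition above.
p_hyp_given_data = prob(hyp & data) / prob(data)
print(p_hyp_given_data)  # 1/2
```

Knowing the first flip is heads raises the probability of two heads from 1/4 to 1/2, exactly as the ratio definition predicts.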



Example 1


Bayes' theorem formula

P(Model|Data) = P(Data|Model)*P(Model) / P(Data)


P(Model|Data) : What is the probability that the train will run on time | given | my experience?


P(Data|Model) : My opinion that the train will run on time, expressed as a probability : 60%


P(Model) : The probability that the train will run on time (railway statistics) : 80%


P(Data) : My personal certainty of my opinion/experience, expressed as a probability : 100% (I am very sure of myself)

The calculation gives us: (60% * 80%) / 100% = 48%

In words:

  • My belief that only 60% of the trains run on time, combined with being very sure of this opinion, gives me the feeling of dealing with a railroad where 48% of the trains run on time.
  • Even if all trains (100%) really ran on time, this would only raise my feeling from 48% to 60%, not to the present railway statistics.
  • Only if I synchronised my opinion and my certainty would I come very close to the railway statistics.
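The worked train example above can be sketched directly from the formula, using the page's own labels and numbers:

```python
# Sketch of the train example: P(Model|Data) = P(Data|Model) * P(Model) / P(Data)
def posterior(p_data_given_model, p_model, p_data):
    """Bayes' rule as written on this page."""
    return p_data_given_model * p_model / p_data

# Numbers from the example: 60% opinion, 80% railway statistics,
# 100% certainty of my own experience.
felt = posterior(0.60, 0.80, 1.00)
print(f"{felt:.0%}")  # 48%

# If all trains really ran on time, i.e. P(Model) = 100%:
print(f"{posterior(0.60, 1.00, 1.00):.0%}")  # 60%
```

The second call shows the point made in the bullets: even perfect punctuality only lifts the felt probability to the level of my 60% opinion.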


Do you want to know more?

Source & External link
(1) Bayes’ Theorem (Stanford Encyclopedia of Philosophy)


Next page: Complexity