Bayesian Inference - Turning Data into Decisions

Bayesian inference is a statistical method that updates the probability of a hypothesis as more evidence or information becomes available. It combines prior beliefs with new data to make decisions or predictions. The underlying theorem was developed by Thomas Bayes in the 18th century, but the approach gained prominence in the 20th century, and its use grew with the advent of computers, which made the complex calculations involved feasible.

Bayesian inference uses a formula called Bayes' theorem to update the probability of something happening (like the dog stealing the cookie) based on new evidence (like crumbs found by the dog's bed). A short code sketch after the list below shows this update in action.

  • Prior probability: Your initial belief about how likely something is (e.g., how likely the dog is to steal cookies in general).
  • Likelihood: How likely the new evidence is if that thing is true (e.g., how likely you are to find crumbs by the dog's bed if the dog stole the cookie).
  • Posterior probability: Your updated belief after considering the evidence (e.g., how likely the dog stole the cookie given the crumbs).
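
To make this concrete, here is a minimal Python sketch of a single Bayesian update for the dog-and-cookie story. The specific probabilities are assumptions chosen purely for illustration.

```python
# A minimal sketch of one Bayesian update. The numbers below are
# made-up assumptions for the dog-and-cookie story, chosen only
# to illustrate how the update works.

def bayes_update(prior, likelihood, likelihood_if_false):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    # P(evidence) by the law of total probability
    evidence = likelihood * prior + likelihood_if_false * (1 - prior)
    return likelihood * prior / evidence

prior = 0.2               # assumed P(dog stole the cookie), before any clues
likelihood = 0.9          # assumed P(crumbs by the bed | dog stole it)
likelihood_if_false = 0.1 # assumed P(crumbs by the bed | dog is innocent)

posterior = bayes_update(prior, likelihood, likelihood_if_false)
print(f"P(dog stole the cookie | crumbs) = {posterior:.2f}")  # ~0.69
```

Notice how weak suspicion (a prior of 0.2) becomes strong suspicion (about 0.69) after one piece of evidence; feeding the posterior back in as the next prior is how beliefs keep updating as clues accumulate.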

Imagine you have two bags of marbles:

Bag A: Contains 10 red marbles and 30 blue marbles (75% blue).

Bag B: Contains 40 red marbles and 10 blue marbles (80% red).

You randomly pick a bag, but you don't know which one. Then, you draw a marble from the chosen bag, and it's red. The question is: Which bag did you likely pick from?

Here's how Bayesian inference helps us figure this out:

Prior Probability: Before drawing any marbles, you have a 50/50 chance of picking either bag. So:

P(Bag A) = 0.5

P(Bag B) = 0.5

Likelihood: This is the probability of drawing a red marble given you picked a specific bag:

P(Red | Bag A) = 10/40 = 0.25 (25% chance of red from Bag A)

P(Red | Bag B) = 40/50 = 0.80 (80% chance of red from Bag B)

Posterior Probability: This is what we want to find: the probability of having picked a specific bag given that we drew a red marble. Bayes' theorem gives:

P(Bag B | Red) = P(Red | Bag B) × P(Bag B) / P(Red)

where P(Red) is the total probability of drawing a red marble from a randomly chosen bag:

P(Red) = P(Red | Bag A) × P(Bag A) + P(Red | Bag B) × P(Bag B) = 0.25 × 0.5 + 0.80 × 0.5 = 0.525

Plugging in the numbers:

P(Bag B | Red) = (0.80 × 0.5) / 0.525 ≈ 0.76

P(Bag A | Red) = (0.25 × 0.5) / 0.525 ≈ 0.24

So even though you had a 50/50 chance of picking either bag initially, after drawing a red marble it is much more likely (about 76%) that you picked from Bag B, because Bag B has a higher proportion of red marbles.
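
The same arithmetic can be checked in a few lines of Python; this sketch simply reproduces the numbers worked out above.

```python
# Reproducing the marble calculation with Bayes' theorem.

p_bag_a, p_bag_b = 0.5, 0.5   # priors: either bag is equally likely
p_red_given_a = 10 / 40       # 0.25, from Bag A's contents
p_red_given_b = 40 / 50       # 0.80, from Bag B's contents

# Total probability of drawing a red marble
p_red = p_red_given_a * p_bag_a + p_red_given_b * p_bag_b   # 0.525

# Posteriors: which bag did the red marble likely come from?
p_a_given_red = p_red_given_a * p_bag_a / p_red   # ~0.238
p_b_given_red = p_red_given_b * p_bag_b / p_red   # ~0.762

print(f"P(Bag A | red) = {p_a_given_red:.3f}")
print(f"P(Bag B | red) = {p_b_given_red:.3f}")
```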

In other words, Bayesian inference is a way of updating your beliefs based on new information. It's like constantly refining your detective work as you gather more clues.

Key Components of Bayesian Inference:

  • Prior Distribution: The initial belief before seeing new data.
  • Likelihood: The probability of observing the data given a hypothesis.
  • Posterior Distribution: The updated belief after considering the new data.

Uses of Bayesian Inference:

  • Medical Diagnosis: Predicting the likelihood of a disease based on test results, updated as more results come in.
  • Weather Forecasting: Refining weather predictions as new data arrives.
  • Spam Filtering: Determining whether an email is spam based on its words and patterns (see the sketch below).

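As a rough illustration of the spam-filtering use case, here is a toy naive Bayes score in Python. The prior and the per-word probabilities below are invented for the example; a real filter would estimate them from a labelled corpus of emails.

```python
# A toy naive Bayes spam score. All probabilities are assumptions
# chosen for illustration, not values from any real filter.

P_SPAM = 0.4  # assumed prior: fraction of all mail that is spam

# Assumed per-word likelihoods: (P(word | spam), P(word | ham))
WORD_PROBS = {
    "free":    (0.30, 0.05),
    "winner":  (0.20, 0.01),
    "meeting": (0.02, 0.15),
}

def spam_probability(words):
    """P(spam | words) under the naive independence assumption."""
    p_spam, p_ham = P_SPAM, 1 - P_SPAM
    for word in words:
        if word in WORD_PROBS:
            p_w_spam, p_w_ham = WORD_PROBS[word]
            p_spam *= p_w_spam
            p_ham *= p_w_ham
    # Normalise so the two outcomes sum to one
    return p_spam / (p_spam + p_ham)

print(spam_probability(["free", "winner"]))   # ~0.99 -> likely spam
print(spam_probability(["meeting"]))          # ~0.08 -> likely not spam
```

Each word nudges the prior up or down, exactly like the crumbs and the marble did; the filter keeps learning as the word probabilities are re-estimated from new emails.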
