Bayesian probability explained
Introduction
Bayesian Probability
Bayesian probability is a way of representing the degree of belief that an event will occur, based on both past data and personal judgment. It is named after Reverend Thomas Bayes, who developed a theorem for updating the probability of an event as new information becomes available.
Bayesian Inference
In Bayesian probability, an initial probability estimate, called the prior probability, is updated as new evidence, or data, becomes available. The updated probability is called the posterior probability, and the process of revising the estimate as new evidence arrives is called Bayesian inference.
Examples
Imagine you are trying to estimate the probability that a certain medical treatment will be effective in reducing blood pressure. You have conducted a clinical trial in which 50% of the patients who received the treatment experienced a reduction in blood pressure. This clinical trial provides data that can be used to update your initial estimate.
Let's say your initial estimate, or prior probability, was that the treatment had a 30% chance of being effective. Based on the data from the clinical trial, you can use Bayesian inference to revise that estimate; the revised value is the posterior probability.
To do this, you would calculate the posterior probability using Bayes’ theorem, which is:
Posterior probability = (likelihood of the data given the hypothesis) * (prior probability) / (normalizing constant)
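To make the formula concrete, here is a minimal Python sketch of that calculation. The function name and argument names are illustrative, not taken from any particular library, and the normalizing constant is simply passed in as a value you have already worked out.

```python
def bayes_update(prior, likelihood, normalizing_constant):
    """Return the posterior probability of a hypothesis.

    prior:                P(hypothesis), the belief before seeing the data
    likelihood:           P(data | hypothesis)
    normalizing_constant: P(data), the overall probability of the data
    """
    return likelihood * prior / normalizing_constant
```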
Plugging in the values from the example, we get:
Posterior probability = (0.5) * (0.3) / (normalizing constant)
The normalizing constant is the overall probability of observing the data, obtained by summing likelihood times prior over every hypothesis under consideration; here, that the treatment is effective and that it is not. Dividing by it scales the result so that the posterior probabilities of all the hypotheses add up to 1, and since each posterior is itself a probability, it must lie between 0 and 1.
In this example, the posterior probability would be the probability that the treatment is effective, given the data from the clinical trial and the prior probability of 30%.
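The example above stops short of computing the normalizing constant, so the sketch below is one way the calculation could be finished. The prior of 0.3 and the likelihood of 0.5 come from the example; the likelihood of seeing the same trial result if the treatment were not effective (0.1 here) is purely an assumed value, chosen for illustration.

```python
# Hypothetical worked example. The prior (0.3) and the likelihood of the
# trial result given an effective treatment (0.5) come from the article;
# the likelihood of the same result given an ineffective treatment (0.1)
# is an assumption made only to complete the arithmetic.
prior_effective = 0.3
prior_not_effective = 1 - prior_effective          # 0.7
likelihood_if_effective = 0.5                      # P(data | effective)
likelihood_if_not_effective = 0.1                  # assumed P(data | not effective)

# The normalizing constant is the total probability of the data,
# summed over both hypotheses.
normalizing_constant = (likelihood_if_effective * prior_effective
                        + likelihood_if_not_effective * prior_not_effective)

posterior_effective = likelihood_if_effective * prior_effective / normalizing_constant
print(round(normalizing_constant, 3))   # 0.22
print(round(posterior_effective, 3))    # 0.682
```

Under that assumption, the posterior probability that the treatment is effective rises from 30% to roughly 68%; a different assumed likelihood for the "not effective" hypothesis would give a different posterior.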
Thanks for reading.