Bayes' Theorem
Bayes' Theorem describes how to update the probability of a hypothesis when new evidence is observed. It is foundational in statistics, machine learning, and medical testing.
The Formula
P(A|B) = [P(B|A) × P(A)] / P(B)
Where P(B) can be expanded using the law of total probability:
P(B) = P(B|A)P(A) + P(B|not A)P(not A)
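The expansion can be checked with a quick numeric sketch; the prevalence and test rates below are illustrative assumptions, not values from the text:

```python
# Illustrative numbers (assumptions for the sketch, not from the text):
p_a = 0.01              # P(A): prior, e.g. 1% prevalence
p_b_given_a = 0.95      # P(B|A): true positive rate
p_b_given_not_a = 0.05  # P(B|not A): false positive rate

# Law of total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
print(round(p_b, 3))  # 0.059
```

Note that the evidence term P(B) is dominated by the false-positive branch here, because "not A" is far more common than A under this prior.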
Variables Explained
- P(A) (Prior Probability): The initial probability of event A occurring before new evidence is considered.
- P(B|A) (Likelihood): The probability of observing evidence B given that A is true; in diagnostic testing this is the true positive rate (sensitivity).
- P(B) (Marginal Probability): The total probability of observing evidence B under all possible scenarios.
- P(B|not A) (False Positive Rate): The probability of observing evidence B even when A is false.
- P(A|B) (Posterior Probability): The updated probability of event A given that evidence B has been observed.
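Putting the variables together, the posterior can be computed with a small helper. The screening-test numbers below (1% prevalence, 99% true positive rate, 5% false positive rate) are illustrative assumptions chosen to show the classic base-rate effect:

```python
def posterior(prior, true_pos_rate, false_pos_rate):
    """P(A|B) via Bayes' Theorem, expanding P(B) by total probability."""
    # P(B) = P(B|A)P(A) + P(B|not A)P(not A)
    evidence = true_pos_rate * prior + false_pos_rate * (1 - prior)
    # P(A|B) = P(B|A)P(A) / P(B)
    return true_pos_rate * prior / evidence

# Assumed screening scenario: 1% prevalence, 99% sensitivity,
# 5% false positive rate
p = posterior(prior=0.01, true_pos_rate=0.99, false_pos_rate=0.05)
print(round(p, 3))  # 0.167
```

Even with a highly sensitive test, the posterior is only about 17%: when the prior P(A) is small, most positive results come from the much larger "not A" population, which is why the false positive rate matters as much as the true positive rate.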