# Probability Tips

  • The Probability Density Function (PDF) describes the relative likelihood of a random variable taking on a given value
  • The joint distribution gives the probability that the random variable X takes on the value x and Y takes on the value y, i.e. P(x,y) = P(x).P(y) if X and Y are independent
  • Conditional probability describes the probability that X takes on the value x given that Y is known to take the value y, i.e. P(X|Y) (see the sketch after this list)
  • The Theorem of Total Probability builds on the above
  • Bayes Theorem builds on the above as well
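
A minimal sketch of the joint and conditional relationships above, using a small made-up joint table for two binary variables X and Y (the numbers are illustrative assumptions, not taken from the text):

```python
# Minimal sketch: joint and conditional probability for two binary
# variables X and Y, using made-up illustrative numbers.

# Joint distribution P(X=x, Y=y) as a dict keyed by (x, y).
joint = {
    (0, 0): 0.30, (0, 1): 0.20,
    (1, 0): 0.10, (1, 1): 0.40,
}

# Marginals: sum out the other variable (the discrete total-probability idea).
p_x = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yi), p in joint.items() if yi == y) for y in (0, 1)}

# Conditional probability: P(X=x | Y=y) = P(X=x, Y=y) / P(Y=y).
def p_x_given_y(x, y):
    return joint[(x, y)] / p_y[y]

print(p_x)                # marginal of X
print(p_x_given_y(1, 1))  # P(X=1 | Y=1)

# Independence check: X and Y are independent iff P(x, y) == P(x) * P(y)
# for every (x, y); with these made-up numbers they are not independent.
independent = all(abs(joint[(x, y)] - p_x[x] * p_y[y]) < 1e-12
                  for x in (0, 1) for y in (0, 1))
print(independent)
```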

# Bayes Theorem simplest case

$$\mathbf{P(A|B) = \frac{P(A \cap B)}{P(B)}}$$

Read as: the probability of A given B is equal to the probability of A and B divided by the probability of B.

You can combine the above with the following derivation:

$$P(B|A) = \frac{P(A \cap B)}{P(A)}$$
$$P(A \cap B) = P(A) \cdot P(B|A)$$

And turn it into

$$\mathbf{P(A|B) = \frac{P(A) \cdot P(B|A)}{P(B)}}$$

which can be used to calculate the posterior probability P(A|B), i.e. the updated probability of A after observing B, from the prior P(A) and the likelihood P(B|A).
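
A quick numerical sketch of the rearranged formula, with made-up prior, likelihood, and evidence values (purely illustrative assumptions):

```python
# Bayes' theorem, rearranged form: P(A|B) = P(A) * P(B|A) / P(B).
# All numbers below are made-up, purely illustrative values.

p_a = 0.3          # prior P(A)
p_b_given_a = 0.8  # likelihood P(B|A)
p_b = 0.5          # evidence P(B), assumed known here

p_a_given_b = p_a * p_b_given_a / p_b  # posterior P(A|B)
print(p_a_given_b)  # 0.48
```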

# Theorem of Total Probability

$$p(a) = \int_{b} p(a|b) \cdot p(b) \, db \text{ for continuous probabilities}$$
$$p(a) = \sum_{b} p(a|b) \cdot p(b) \text{ for discrete probabilities}$$

You can also rewrite the discrete case as

$$\mathbf{p(a) = p(a|b) \cdot p(b) + p(a|\lnot b) \cdot p(\lnot b)}$$

The above covers the case where b takes only two values (b and ¬b), but the same idea extends to any number of values.
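
A minimal sketch of the discrete, two-valued case, again with made-up numbers:

```python
# Theorem of total probability, discrete two-valued case:
# p(a) = p(a|b) * p(b) + p(a|not b) * p(not b).
# All numbers are made-up, purely illustrative values.

p_b = 0.4
p_not_b = 1 - p_b          # 0.6
p_a_given_b = 0.9
p_a_given_not_b = 0.2

p_a = p_a_given_b * p_b + p_a_given_not_b * p_not_b
print(p_a)  # 0.9*0.4 + 0.2*0.6 = 0.48
```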

# Combining Bayes Theorem and Total Probability

$$\mathbf{P(A|B) = \frac{P(A) \cdot P(B|A)}{P(B|A) \cdot P(A) + P(B|\lnot A) \cdot P(\lnot A)}}$$
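
A minimal sketch putting the pieces together, with the evidence P(B) expanded via total probability (all numbers are made-up, illustrative assumptions):

```python
# Bayes' theorem with the evidence P(B) expanded via total probability:
# P(A|B) = P(A) * P(B|A) / (P(B|A)*P(A) + P(B|not A)*P(not A)).
# All numbers are made-up, purely illustrative values.

p_a = 0.01              # prior P(A)
p_not_a = 1 - p_a
p_b_given_a = 0.95      # P(B|A)
p_b_given_not_a = 0.05  # P(B|not A)

# Denominator: P(B) from the theorem of total probability.
p_b = p_b_given_a * p_a + p_b_given_not_a * p_not_a

# Posterior: P(A|B).
p_a_given_b = p_a * p_b_given_a / p_b
print(p_a_given_b)  # about 0.161
```

Note how, with these illustrative numbers, a small prior P(A) keeps the posterior modest even though P(B|A) is high.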