Topic 1: Math Review
Probability Rules
Probability of an event (assuming equally likely outcomes):
\[
P(A) = \frac{\text{Number of favorable outcomes}}{\text{Total number of outcomes}}
\]
Complementary Rule:
\[
P(\neg A) = 1 - P(A)
\]
Addition Rule (Non-Mutually Exclusive Events):
\[
P(A \cup B) = P(A) + P(B) - P(A \cap B)
\]
Addition Rule (Mutually Exclusive Events):
\[
P(A \cup B) = P(A) + P(B)
\]
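As a quick sanity check, the addition rule can be verified by enumerating a small sample space. The die-roll events below are made up for illustration:

```python
from fractions import Fraction

# One roll of a fair six-sided die (equally likely outcomes).
outcomes = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # even roll
B = {4, 5, 6}   # roll greater than 3

def prob(event):
    return Fraction(len(event & outcomes), len(outcomes))

lhs = prob(A | B)                      # P(A ∪ B) by direct enumeration
rhs = prob(A) + prob(B) - prob(A & B)  # addition rule
assert lhs == rhs == Fraction(2, 3)
```

The subtraction of \(P(A \cap B)\) matters here because \(A\) and \(B\) overlap on \(\{4, 6\}\); for mutually exclusive events that term is zero.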
Law of Total Probability (Sum Rule), where \(E_1, \ldots, E_n\) partition the sample space:
\[
P(B) = \sum_{i=1}^{n} P(B \cap E_i)
\]
Multiplication Rule (Independent Events):
\[
P(A \cap B) = P(A) \times P(B)
\]
Multiplication Rule (Dependent Events):
\[
P(A \cap B) = P(A) \times P(B|A)
\]
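The dependent-events form can be checked by enumeration; the card-drawing scenario below is illustrative:

```python
from fractions import Fraction
from itertools import permutations

# Ordered draws of two cards from a standard 52-card deck, without replacement.
deck = [(rank, suit) for rank in range(13) for suit in range(4)]
draws = list(permutations(deck, 2))         # 52 · 51 equally likely ordered pairs
aces = sum(1 for d in draws if d[0][0] == 0 and d[1][0] == 0)  # rank 0 = ace
lhs = Fraction(aces, len(draws))            # P(both aces) by direct enumeration
rhs = Fraction(4, 52) * Fraction(3, 51)     # P(A) · P(B | A)
assert lhs == rhs == Fraction(1, 221)
```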
Conditional Probability:
\[
P(B|A) = \frac{P(A \cap B)}{P(A)}
\]
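This definition can be verified on a small example; the two-dice events here are made up for illustration:

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))  # 36 equally likely (die1, die2) pairs
A = {r for r in rolls if r[0] % 2 == 0}       # first die is even
B = {r for r in rolls if sum(r) == 8}         # total is 8

def p(event):
    return Fraction(len(event), len(rolls))

cond = p(A & B) / p(A)                        # P(B | A) = P(A ∩ B) / P(A)
assert cond == Fraction(1, 6)
```

Conditioning on \(A\) shrinks the sample space to the 18 rolls with an even first die, of which 3 sum to 8.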
Law of Total Probability:
\[
P(B) = \sum_{i=1}^{n} P(B \mid E_i) P(E_i)
\]
Chain Rule of Conditional Probability:
\[
P(E_1 \cap E_2 \cap \ldots \cap E_n) = \prod_{i=1}^n P(E_i \mid E_{i+1} \cap \ldots \cap E_n)
\]
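The chain rule can be sketched with three dependent draws; the deck example below is illustrative:

```python
from fractions import Fraction
from itertools import permutations

deck = [(rank, suit) for rank in range(13) for suit in range(4)]
triples = permutations(deck, 3)               # 52 · 51 · 50 ordered draws
hits = sum(1 for t in triples if all(card[0] == 0 for card in t))  # rank 0 = ace
direct = Fraction(hits, 52 * 51 * 50)         # P(three aces) by enumeration
# chain rule: P(E1) · P(E2 | E1) · P(E3 | E1 ∩ E2)
chained = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)
assert direct == chained
```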
Conditional Probability Distributions (for events \(E_1, \ldots, E_n\) that partition the sample space):
\[
\begin{align*}
\sum_{i=1}^n P(E_i \mid B) &= \sum_{i=1}^n \frac{P(E_i \cap B)}{P(B)} & \text{by definition of conditional probability} \\
&= \frac{1}{P(B)} \sum_{i=1}^n P(E_i \cap B) & \text{factoring out the constant } \tfrac{1}{P(B)} \\
&= \frac{1}{P(B)} P(B) & \text{by the law of total probability} \\
&= 1
\end{align*}
\]
\[
P(\neg E \mid B) = 1 - P(E \mid B), \\ P(A \cup B \mid C) = P(A \mid C) + P(B \mid C) - P(A \cap B \mid C)
\]
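The derivation above says that conditional probabilities over a partition still sum to 1. A small numeric check, with an illustrative two-dice setup:

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes
B = {r for r in rolls if r[0] % 2 == 0}        # conditioning event: first die even

def p(event):
    return Fraction(len(event), len(rolls))

# E_v = "second die shows v" partitions the sample space, so the
# conditional probabilities P(E_v | B) must sum to 1.
total = sum(p({r for r in B if r[1] == v}) / p(B) for v in range(1, 7))
assert total == 1
```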
Two events \(A\) and \(B\) are independent iff
\[
P(A|B) = P(A)
\]
Bayes' Theorem:
\[
P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}
\]
Law of Total Probability (expanded form, giving the denominator of Bayes' Theorem):
\[
P(B) = P(B|A_1) \cdot P(A_1) + P(B|A_2) \cdot P(A_2) + \cdots + P(B|A_n) \cdot P(A_n)
\]
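Bayes' Theorem and the total-probability denominator fit together as below. The diagnostic-test numbers are hypothetical, chosen only for illustration:

```python
# Hypothetical diagnostic test: 1% prevalence, 90% sensitivity, 5% false-positive rate.
p_d = 0.01                 # prior P(disease)
p_pos_given_d = 0.90       # P(positive | disease)
p_pos_given_not_d = 0.05   # P(positive | no disease)

# Denominator via the law of total probability:
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)
# Bayes' Theorem:
posterior = p_pos_given_d * p_d / p_pos
assert abs(posterior - 0.009 / 0.0585) < 1e-12   # ≈ 0.154
```

Despite the 90% sensitivity, the low prior drags the posterior down to roughly 15%, which is why the prior term in the numerator matters.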
Expectation (Discrete Random Variables):
\[
E(X) = \sum_{i} x_i \cdot P(x_i), \quad E(f(X)) = \sum_{i} f(x_i) \cdot P(x_i)
\]
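Both forms can be checked on a fair die (an illustrative choice):

```python
from fractions import Fraction

faces = range(1, 7)
p = Fraction(1, 6)                       # fair die: P(x_i) = 1/6 for every face
EX = sum(x * p for x in faces)           # E(X)
EX2 = sum(x * x * p for x in faces)      # E(f(X)) with f(x) = x²
assert EX == Fraction(7, 2)              # 3.5
assert EX2 == Fraction(91, 6)
```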
Expectation (Continuous Random Variables):
\[
E(X) = \int_{-\infty}^{\infty} x \cdot f_X(x) \, dx
\]
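A minimal numerical sketch of the continuous case, using the midpoint rule for \(X \sim \mathrm{Uniform}(0, 1)\) (an illustrative choice, where \(f_X(x) = 1\) on \([0, 1]\) and \(E(X) = 0.5\)):

```python
# Midpoint-rule approximation of E(X) = ∫ x · f_X(x) dx for Uniform(0, 1).
n = 10_000
dx = 1.0 / n
ex = sum((i + 0.5) * dx * 1.0 * dx for i in range(n))  # midpoint x · f_X(x) · dx
assert abs(ex - 0.5) < 1e-9
```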
Variance:
\[
\text{Var}(X) = E\left[(X - E(X))^2\right] = E(X^2) - [E(X)]^2
\]
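The two variance expressions agree, as a fair-die check (illustrative) confirms:

```python
from fractions import Fraction

faces = range(1, 7)
p = Fraction(1, 6)
EX = sum(x * p for x in faces)
EX2 = sum(x * x * p for x in faces)
var_shortcut = EX2 - EX**2                        # E(X²) − [E(X)]²
var_def = sum((x - EX)**2 * p for x in faces)     # E[(X − E(X))²]
assert var_shortcut == var_def == Fraction(35, 12)
```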
Conditional Probability for Multiple Variables:
\[
P(x \mid y, z) = \frac{P(y, z \mid x) P(x)}{P(y, z)}, \quad P(a \mid b, c, d) = \frac{P(b, c \mid a, d) P(a \mid d)}{P(b, c \mid d)}
\]
The ordering of variables is irrelevant as long as they stay on the same side of the \(|\) sign:
\[
\begin{align*}P(x, y \mid z) &= P(y, x \mid z) \\P(x \mid y, z) &= P(x \mid z, y)\\P(x \mid y, z) &\neq P(x, y \mid z)\end{align*}
\]
Conditional Independence:
\[
P(X = x, Y = y \mid Z = z) = P(X = x \mid Z = z) P(Y = y \mid Z = z)
\]
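Conditional independence does not imply marginal independence. A small sketch, using a made-up mixture model (pick a fair or biased coin, then flip it twice):

```python
from fractions import Fraction

# Model: choose coin Z uniformly at random, then flip it twice.
# Given Z, the flips X and Y are independent; marginally they are not.
p_heads = {"fair": Fraction(1, 2), "biased": Fraction(9, 10)}
p_z = Fraction(1, 2)

def joint(x, y, z):
    """P(X=x, Y=y, Z=z), with x, y True for heads."""
    ph = p_heads[z]
    return (ph if x else 1 - ph) * (ph if y else 1 - ph) * p_z

for z, ph in p_heads.items():
    p_xy_given_z = joint(True, True, z) / p_z
    assert p_xy_given_z == ph * ph      # P(X, Y | Z) = P(X | Z) · P(Y | Z)

# Without conditioning on Z, the flips are dependent:
p_hh = sum(joint(True, True, z) for z in p_heads)
p_h = sum(p_heads[z] * p_z for z in p_heads)
assert p_hh != p_h * p_h                # 53/100 vs 49/100
```

Intuitively, seeing one head is evidence the biased coin was chosen, which raises the probability of a second head.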
The integral of a probability density function (PDF) over all possible values of a continuous random variable \(x\), which must equal \(1\):
\[
\int_{-\infty}^{\infty} p(x) \, dx = 1
\]
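A numerical sketch of the normalization condition, using the standard normal PDF (an illustrative choice) and a midpoint-rule sum over a wide interval:

```python
import math

# Midpoint-rule check that the standard normal PDF integrates to ≈ 1.
def pdf(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

n, lo, hi = 20_000, -10.0, 10.0        # tails beyond ±10 are negligible
dx = (hi - lo) / n
total = sum(pdf(lo + (i + 0.5) * dx) * dx for i in range(n))
assert abs(total - 1.0) < 1e-6
```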
Math Review Cheatsheet