H2 Maths Notes (JC 1-2): 6.2) Discrete Random Variables
Study cadence: re-derive the expectation and variance formulas from first principles once per week so you remember why each term appears. Keep a small table template in your notes for probability mass functions (PMFs) so you can slot in values quickly during exams.
Core Concepts
- A discrete random variable \( X \) takes countable values \( x_1, x_2, \dots \) with probabilities \( P(X = x_i) \).
- Expectation: \( E(X) = \sum x_i P(X = x_i) \).
- Second moment: \( E(X^2) = \sum x_i^2 P(X = x_i) \).
- Variance: \( \operatorname{Var}(X) = E(X^2) - [E(X)]^2 \).
- Standard deviation: square root of \( \operatorname{Var}(X) \).
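Outside the exam hall, these summations translate directly into a few lines of code for self-checking. A minimal Python sketch, assuming the PMF is supplied as (value, probability) pairs:

```python
# Minimal sketch: expectation and variance from a tabulated PMF.
# The PMF is assumed to be a list of (value, probability) pairs.

def expectation(pmf):
    """E(X) = sum of x_i * P(X = x_i)."""
    return sum(x * p for x, p in pmf)

def variance(pmf):
    """Var(X) = E(X^2) - [E(X)]^2."""
    e_x = expectation(pmf)
    e_x2 = sum(x**2 * p for x, p in pmf)
    return e_x2 - e_x**2

# Illustrative PMF: P(X=0)=0.2, P(X=1)=0.5, P(X=2)=0.3
pmf = [(0, 0.2), (1, 0.5), (2, 0.3)]
print(round(expectation(pmf), 2))  # 1.1
print(round(variance(pmf), 2))     # 1.7 - 1.21 = 0.49
```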
Binomial Model
Use \( X \sim \operatorname{Bin}(n, p) \) when you have \( n \) independent Bernoulli trials with success probability \( p \).
- Probability mass: \( P(X = r) = \binom{n}{r} p^r (1 - p)^{n - r} \).
- Expectation: \( E(X) = n p \).
- Variance: \( \operatorname{Var}(X) = n p (1 - p) \).
- Conditions: fixed \( n \), constant \( p \), independent trials, success/failure outcomes.
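The probability mass formula is easy to transcribe for quick self-checks; the sketch below is a direct Python translation using `math.comb` for \( \binom{n}{r} \) (illustrative only, not part of syllabus working):

```python
from math import comb

def binom_pmf(n, p, r):
    """P(X = r) for X ~ Bin(n, p): nCr * p^r * (1 - p)^(n - r)."""
    return comb(n, r) * p**r * (1 - p)**(n - r)

def binom_mean_var(n, p):
    """E(X) = np and Var(X) = np(1 - p)."""
    return n * p, n * p * (1 - p)
```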
Example -- Hiring round
A firm interviews \( n = 12 \) candidates. Each has a 0.3 probability of accepting an offer. Let \( X \) be the number who accept.
- \( P(X = 4) = \binom{12}{4} 0.3^4 0.7^8 \approx 0.231 \).
- \( P(X \geq 2) = 1 - [P(X = 0) + P(X = 1)] \approx 0.915 \).
- \( E(X) = 12 \times 0.3 = 3.6 \).
- \( \operatorname{Var}(X) = 12 \times 0.3 \times 0.7 = 2.52 \).
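As a sanity check on these figures, a short Python sketch reproduces the three quoted answers (the rounding targets match the working above):

```python
from math import comb

n, p = 12, 0.3
p4 = comb(n, 4) * p**4 * (1 - p)**8     # P(X = 4)
p0 = (1 - p)**n                          # P(X = 0)
p1 = n * p * (1 - p)**(n - 1)            # P(X = 1)

print(round(p4, 3))                      # 0.231
print(round(1 - (p0 + p1), 3))           # 0.915, complement for P(X >= 2)
print(round(n * p, 2))                   # E(X) = 3.6
print(round(n * p * (1 - p), 2))         # Var(X) = 2.52
```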
Custom PMFs
When the PMF is tabulated (common in H2 questions):
| \( x \) | \( 0 \) | \( 1 \) | \( 2 \) | \( 3 \) |
| --- | --- | --- | --- | --- |
| \( P(X = x) \) | \( 0.1 \) | \( 0.3 \) | \( 0.4 \) | \( k \) |
- Use the total probability condition to find \( k = 0.2 \).
- \( E(X) = 0 \times 0.1 + 1 \times 0.3 + 2 \times 0.4 + 3 \times 0.2 = 1.7 \).
- \( E(X^2) = 0^2 \times 0.1 + 1^2 \times 0.3 + 2^2 \times 0.4 + 3^2 \times 0.2 = 3.7 \).
- \( \operatorname{Var}(X) = 3.7 - 1.7^2 = 0.81 \).
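A quick script check of this table, solving for \( k \) from the total probability condition and recomputing the moments (illustrative; floating-point output may differ fractionally from the exact values):

```python
# The tabulated PMF above: values 0-3, with P(X = 3) = k unknown.
values = [0, 1, 2, 3]
probs  = [0.1, 0.3, 0.4, None]

k = 1 - sum(p for p in probs if p is not None)   # probabilities sum to 1, so k = 0.2
probs[3] = k

e_x  = sum(x * p for x, p in zip(values, probs))      # 1.7
e_x2 = sum(x**2 * p for x, p in zip(values, probs))   # 3.7
var  = e_x2 - e_x**2                                  # 0.81

print(round(k, 2), round(e_x, 2), round(e_x2, 2), round(var, 2))
```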
Transformations
- For \( Y = aX + b \): \( E(Y) = a E(X) + b \) and \( \operatorname{Var}(Y) = a^2 \operatorname{Var}(X) \).
- Linear combinations: for independent \( X \) and \( Y \), \( \operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) \).
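One way to convince yourself of the additivity of variance for independent variables is to enumerate the joint PMF of two small tabulated variables; the sketch below uses made-up PMFs purely for illustration:

```python
from itertools import product

# Two small independent PMFs (illustrative values only).
px = [(0, 0.5), (1, 0.5)]
py = [(0, 0.2), (1, 0.3), (2, 0.5)]

def var(pmf):
    m = sum(x * p for x, p in pmf)
    return sum((x - m)**2 * p for x, p in pmf)

# Joint PMF of X + Y under independence: P(X = x, Y = y) = P(X = x) * P(Y = y).
sum_pmf = {}
for (x, p), (y, q) in product(px, py):
    sum_pmf[x + y] = sum_pmf.get(x + y, 0) + p * q

print(round(var(px) + var(py), 2))             # 0.25 + 0.61 = 0.86
print(round(var(list(sum_pmf.items())), 2))    # Var(X + Y) = 0.86, matching
```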
Example -- Bonus payouts
Let \( X \) be binomial \( \operatorname{Bin}(8, 0.4) \). Each success triggers a 200 dollar bonus plus a fixed 150 dollar stipend. Total payout \( T = 200 X + 150 \).
- \( E(T) = 200 \times 8 \times 0.4 + 150 = 790 \) dollars.
- \( \operatorname{Var}(T) = 200^2 \times 8 \times 0.4 \times 0.6 = 76800 \) dollars squared.
- Standard deviation \( \approx 277 \) dollars.
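The payout figures follow by applying the transformation rules to the binomial mean and variance; a minimal sketch with the values above:

```python
# Bonus payout check: T = 200X + 150 with X ~ Bin(8, 0.4).
n, p = 8, 0.4
a, b = 200, 150

e_x, var_x = n * p, n * p * (1 - p)   # 3.2 and 1.92
e_t   = a * e_x + b                   # 200 * 3.2 + 150 = 790
var_t = a**2 * var_x                  # 40000 * 1.92 = 76800
sd_t  = var_t ** 0.5                  # about 277.13

print(round(e_t, 2), round(var_t, 2), round(sd_t, 2))
```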
Calculator Workflows
- Casio: `BPD` (binomial probability distribution) for individual terms, `BCD` (binomial cumulative distribution) for cumulative sums. Store \( n \) and \( p \) in variables for quick reuse.
- TI: `binompdf(n, p, r)` and `binomcdf(n, p, r)` functions in the DISTR menu. For PMF tables, program a short list-based routine to multiply `L1` (values) and `L2` (probabilities).
- Cross-check expectation and variance using calculator built-ins but still show the manual summation in working.
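For a software cross-check outside the exam, `scipy.stats.binom` plays the same role as `binompdf`/`binomcdf` (this assumes SciPy is installed; the numbers reuse the hiring-round example):

```python
from scipy.stats import binom

n, p = 12, 0.3

# binompdf(n, p, r)  <->  binom.pmf(r, n, p)
print(binom.pmf(4, n, p))        # P(X = 4), approx 0.231

# binomcdf(n, p, r)  <->  binom.cdf(r, n, p)
print(1 - binom.cdf(1, n, p))    # P(X >= 2), approx 0.915

# Built-in mean and variance for cross-checking manual working.
print(binom.mean(n, p), binom.var(n, p))   # 3.6 and 2.52
```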
Exam Watch Points
- State whether a binomial model is appropriate; include independence and constant probability statements.
- When a question mentions "at least one" or "no more than", use complements to reduce calculator input.
- For custom PMFs, ensure probabilities sum to 1 before computing expectation.
- Payout or cost problems often require a linear transformation; write the transformation before substituting.
Quick Revision Checklist
- [ ] Identify when binomial modelling fits and articulate the assumptions.
- [ ] Compute expectation and variance efficiently, including from tables.
- [ ] Execute linear transformations and interpret the resulting mean and variance.
- [ ] Use calculator functions accurately while documenting the method in words.
- [ ] Spot when to switch to Poisson/Normal approximations (preview for later topics).
Next steps: practise binomial hypothesis tests (bridging into Topic 6.5) and prepare to approximate discrete models with the normal distribution in Topic 6.3.