A random variable is a quantity that can take on several
different values, depending on chance. Examples include:
- The number of football games that the Texas Longhorns will win this fall.
- The number of electoral votes that the Republican candidate will get in the next presidential election.
- Tomorrow's high temperature in Austin.
- The $x$ coordinate of the spot where my next dart hits the dartboard.
The first two examples are discrete random variables, meaning
that you make a list of all the possible outcomes, and then assign a
number, called the probability, to each one. The last two
examples are continuous random variables, meaning that the
possible outcomes form a continuous range.
If $X$ is a continuous random variable, then the probability of any one
outcome is zero. Instead, we consider the probability of a range of outcomes.
The probability density function (pdf) $f_X(x)$ gives the probability
per unit length. In the following video, we show how to use such a function,
and we learn about three standard examples, called the uniform,
exponential, and normal distributions.
The probability of $X$ landing somewhere between $a$ and $b$
is
$$P(a \le X \le b) = \int_a^b f_X(x) dx.$$
More generally, the probability that $X$ lands somewhere in a region $R$
is
$$P(X \in R) = \int_R f_X(x) dx.$$
Of course, there is a 100% chance of $X$ taking on some
value, so
$$\int_{-\infty}^\infty f_X(x) dx = 1.$$
If $X$ is restricted
to a smaller range than $(-\infty, \infty)$, we just integrate over
that range.
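These integrals can be checked numerically. The sketch below uses SciPy's `quad` with the standard normal pdf as an illustrative choice of $f_X$ (the specific pdf is an assumption for this example; any valid pdf works the same way).

```python
import numpy as np
from scipy.integrate import quad

# Illustrative pdf (an assumption for this sketch): the standard normal
def f_X(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

# P(a <= X <= b) is the integral of the pdf from a to b
a, b = -1.0, 1.0
p, _ = quad(f_X, a, b)                 # about 0.6827 for the standard normal

# The total probability over (-inf, inf) should be 1
total, _ = quad(f_X, -np.inf, np.inf)
print(p, total)
```

`quad` returns the value of the integral together with an error estimate, and it handles infinite limits directly, which is convenient for checking the normalization condition.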
Example 1: Let $T \ge 0$ be the lifetime (in years) of a
newly bought light bulb. Suppose that $f_T(t) = C e^{-t/3}$ for $t \ge 0$
(and zero for $t < 0$), where $C$ is a constant. What is $C$?
Solution: Since $T$ can't be negative, we only integrate from 0 to
$\infty$ instead of from $-\infty$ to $\infty$.
$$1 \ = \ \int_0^\infty C e^{-t/3} dt = 3C,$$
so $C=1/3$.
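A quick numerical sanity check, assuming SciPy is available: with $C = 1/3$ the pdf should integrate to 1 over $[0, \infty)$.

```python
import numpy as np
from scipy.integrate import quad

# With C = 1/3, the pdf f_T(t) = (1/3) e^{-t/3} should integrate to 1 on [0, inf)
C = 1 / 3
total, _ = quad(lambda t: C * np.exp(-t / 3), 0, np.inf)
print(total)  # should be 1
```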
If $X$ is a random variable, the expectation or mean value
of $X$ is the average value we get when we run the experiment over and over
again. In the following video, we explain how to use integrals to compute
expectations, variances, and standard deviations.
Since values between $x$ and $x + dx$ occur a fraction $f_X(x)\,dx$
of the time, the expectation of $X$ is
$$E(X) = \int_{-\infty}^\infty x f_X(x) dx.$$
This average value is often denoted with the Greek letter $\mu$.
Example 2: What is the average lifetime of a light bulb if its
probability density function is $f_T(t) = \frac{1}{3} e^{-t/3}$ for $t \ge 0$
(and zero if $t < 0$)?
Solution: We compute the integral
$$\mu\ = \ E(T) \ = \ \int_0^\infty t f_T(t) dt
\ = \ \int_0^\infty \frac{t e^{-t/3}}{3} dt
\ = \ 3.$$
(Do you remember how to evaluate that integral? Integrate by parts!)
These light bulbs last an average of 3 years.
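The integration by parts can be confirmed numerically with the same `quad`-based approach (assuming SciPy is available):

```python
import numpy as np
from scipy.integrate import quad

# E(T) = integral of t * f_T(t) dt with f_T(t) = (1/3) e^{-t/3}
mean, _ = quad(lambda t: t * np.exp(-t / 3) / 3, 0, np.inf)
print(mean)  # should be 3
```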
If the mean value of $X$ is $\mu$, then the variance of
$X$ is the average value of $(X-\mu)^2$. This is a measure of how wide
the probability distribution is; it has the units of $X$ squared,
and is denoted $Var(X)$ or $\sigma^2$.
$$\sigma^2 = Var(X) = E((X-\mu)^2) = \int_{-\infty}^\infty (x-\mu)^2 f_X(x) dx.$$
In practice, we often use a simpler formula:
$$\sigma^2 = Var(X) = E(X^2) - \mu^2 = \int_{-\infty}^\infty x^2 f_X(x) dx - \mu^2.$$
$\sigma = \sqrt{Var(X)}$ is
called the standard deviation of $X$.
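The two variance formulas can be compared numerically for the light bulb distribution of Example 2, where $\mu = 3$ (this is a sketch assuming SciPy is available; the exact variance of this distribution is 9):

```python
import numpy as np
from scipy.integrate import quad

# Light-bulb pdf from Example 2, with mean mu = 3
f = lambda t: np.exp(-t / 3) / 3
mu = 3.0

# Definition of variance: E((T - mu)^2)
var_def, _ = quad(lambda t: (t - mu)**2 * f(t), 0, np.inf)

# Simpler formula: E(T^2) - mu^2
second_moment, _ = quad(lambda t: t**2 * f(t), 0, np.inf)
var_short = second_moment - mu**2

print(var_def, var_short)  # both should equal 9
```

Both computations agree, illustrating that $E((X-\mu)^2) = E(X^2) - \mu^2$.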
Example 3: A random variable $X$ takes values between
$0$ and $1$ with
pdf $f_X(x) = 1$ when $0 \le x \le 1$ (and 0 otherwise). This is called
the uniform distribution. Find the mean, variance,
and standard deviation of $X$.
Solution: We first check that $f_X(x)$ is a legitimate pdf:
$\int_0^1 f_X(x) dx = \int_0^1 dx = 1$, as it should be.
Next we compute the
mean:
$$\mu = E(X) = \int_0^1 x dx = \frac{1}{2}.$$
Then we compute
the variance:
$$ Var(X) = \int_0^1 x^2 dx - \mu^2 = \frac{1}{3} -
\frac{1}{4} = \frac{1}{12}.$$
Finally, we compute the standard
deviation:
$$\sigma = \sqrt{Var(X)} = \frac{1}{2\sqrt{3}} =
\frac{\sqrt{3}}{6}.$$
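All three quantities for the uniform distribution can be verified with the same numerical approach (a sketch assuming SciPy is available):

```python
import numpy as np
from scipy.integrate import quad

# Uniform pdf on [0, 1]: f_X(x) = 1 there, 0 elsewhere
mu, _ = quad(lambda x: x * 1.0, 0, 1)        # mean: should be 1/2
m2, _ = quad(lambda x: x**2 * 1.0, 0, 1)     # E(X^2): should be 1/3
var = m2 - mu**2                             # variance: should be 1/12
sigma = np.sqrt(var)                         # standard deviation: 1/(2*sqrt(3))
print(mu, var, sigma)
```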