M408M Learning Module Pages
Double integrals in probability

Often we are interested in two random variables X and Y, and we want to understand how they are related. For instance, X might be tomorrow's high temperature and Y might be tomorrow's barometric pressure. Or X might be the concentration of a virus in a patient's blood and Y might be the concentration of an antibody. Or X and Y might be the horizontal and vertical coordinates of the spot where a dart hits a board. When we have two random variables X and Y, the pair (X,Y) takes values in the plane, and we speak of the probability per unit area of (X,Y) landing in a certain region. This is called the joint probability density function, and is written f_{X,Y}(x,y).
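To see this numerically, here is a minimal sketch in Python using a made-up joint density f(x,y) = x + y on the unit square (zero elsewhere; this example is an assumption, not from the text). A midpoint-rule double sum checks that the density integrates to 1, and then computes the probability of a region, P(X < 1/2).

```python
# Midpoint-rule approximation of a double integral over the unit square.
def double_integral(g, n=400):
    """Approximate the integral of g(x, y) over [0,1] x [0,1]."""
    h = 1.0 / n
    return sum(g((i + 0.5) * h, (j + 0.5) * h)
               for i in range(n) for j in range(n)) * h * h

# Hypothetical joint density: f(x,y) = x + y on the unit square, zero elsewhere.
def f(x, y):
    return x + y

total = double_integral(f)       # total probability; should come out to 1
prob = double_integral(lambda x, y: f(x, y) if x < 0.5 else 0.0)  # P(X < 1/2)
print(total, prob)  # ≈ 1.0 and 0.375
```

By hand, P(X < 1/2) = \int_0^{1/2}\int_0^1 (x+y)\, dy\, dx = \int_0^{1/2} (x + \tfrac12)\, dx = 3/8, matching the numerical value.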
If we know the joint distribution of X and Y, we can compute the distribution of just X or just Y. These are called marginal distributions, because for discrete random variables they are often written in the margins of a table. We get f_X(x) by integrating f_{X,Y}(x,y) over y, and f_Y(y) by integrating f_{X,Y}(x,y) over x:
\begin{eqnarray*} f_X(x) & = & \int_{-\infty}^\infty f_{X,Y}(x,y)\, dy \\ f_Y(y) & = & \int_{-\infty}^\infty f_{X,Y}(x,y)\, dx \end{eqnarray*}

If \mu_X and \mu_Y are the average values of X and Y, then the covariance of X and Y is
Cov(X,Y) = E((X-\mu_X)(Y-\mu_Y)) = \iint (x-\mu_X)(y-\mu_Y) f_{X,Y}(x,y)\, dA.
Just like the variance of one variable, this is more easily computed as
Cov(X,Y) = E(XY) - \mu_X\mu_Y = \iint xy\, f_{X,Y}(x,y)\, dA - \mu_X\mu_Y.

The correlation between X and Y is
Cor(X,Y) = \frac{Cov(X,Y)}{\sigma_X\sigma_Y}.
This is a number, often written r, between -1 and 1.

If r is close to 1, then all the likely values of (X,Y) lie close to a straight line of positive slope. In that case we say that X and Y are positively correlated. Increases in X are then associated with increases in Y, and vice versa. This is statistical evidence that X and Y are somehow related: maybe X causes Y, or maybe Y causes X, or maybe something else causes both of them. If r is close to -1, then the likely values of (X,Y) lie close to a line of negative slope, so increasing X tends to decrease Y, and vice versa; X and Y are said to be negatively correlated. Again, this is evidence that X and Y are related. If r is close to 0, then X and Y are said to be uncorrelated.
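The quantities above can be sketched numerically. The density f(x,y) = x + y on the unit square is again a made-up example (an assumption, not from the text); every expected value E[g(X,Y)] is approximated by a midpoint-rule double sum of g(x,y) f(x,y) over the square.

```python
# Means, covariance, and correlation for a hypothetical joint density
# f(x,y) = x + y on the unit square (zero elsewhere), via midpoint sums.
from math import sqrt

def f(x, y):
    return x + y

n = 200
h = 1.0 / n
mids = [(i + 0.5) * h for i in range(n)]  # midpoints of the grid cells

def expect(g):
    """E[g(X,Y)]: double integral of g(x,y) * f(x,y) over the unit square."""
    return sum(g(x, y) * f(x, y) for x in mids for y in mids) * h * h

mu_X = expect(lambda x, y: x)                  # exact value: 7/12
mu_Y = expect(lambda x, y: y)                  # 7/12 by symmetry
var_X = expect(lambda x, y: x * x) - mu_X**2   # exact value: 11/144
var_Y = expect(lambda x, y: y * y) - mu_Y**2
cov = expect(lambda x, y: x * y) - mu_X * mu_Y  # E(XY) - mu_X mu_Y = -1/144
r = cov / (sqrt(var_X) * sqrt(var_Y))           # Cor(X,Y) = -1/11
print(mu_X, cov, r)
```

Here r = -1/11 ≈ -0.09, which is close to 0: for this density X and Y are only very slightly negatively correlated, so knowing X tells you little about Y.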