M362K, Smith, Fall 03

- Monday, Dec. 8: 2 - 3 pm
- Wednesday, Dec. 10: 3 - 4 pm
- Thursday, Dec. 11: 4 - 5 pm
- Friday, Dec. 12: 12:30 - 1:30 pm

- Everything covered on previous exams, plus topics covered since then: sums of independent random variables (including Bernoulli and normal), conditional distributions, expected value of a function of two random variables, expected value of a sum of random variables, covariance, variance of a sum of random variables, central limit theorem.
*Topics covered since the last exam will account for approximately one-third of the points on the final exam*, to make total exam coverage even throughout the semester.

- Questions on the exam will be of the same *general* nature and level of difficulty as the questions on previous exams. (Do not expect them to be just the same with only minor changes, however.)
- The exam will be about twice as long as a midsemester exam, but you will have three hours to work on it.

1. Carefully study the midsemester exams and returned homework to understand things you may have done wrong, including inadequate explanation, improper use of notation, etc. Consult the solutions on reserve in the PMA library as needed.

2. Here are some suggested practice questions and problems.

- To get the most out of the self-test problems in the book, give them a serious try *before* looking at the solutions in the back of the book.
- Remember that some of the "solutions" in the book include less detail (especially reasons) than I expect of you.
- These problems are intended to give you additional practice with the concepts, vocabulary, notation, etc., and with problem solving involving this material. Do *not* assume that exam problems will be just like these (although some might be!).

I. REVIEW QUESTIONS ON MATERIAL SINCE THE LAST EXAM

1. E(aX + bY) = ___________________

2. If X and Y are random variables with joint pdf f_{X,Y}(x,y),
then for a function g(x,y),

E(g(X,Y)) =_________________.

3. If X and Y are random variables with joint pmf p_{X,Y}(x,y),
then for a function g(x,y),

E(g(X,Y)) =_________________.

4. Var (X + Y) = Var (X) + Var(Y) provided ______________________________.

5. If X and Y are random variables, then:

a. The definition of the covariance of X and Y is

Cov(X,Y) = ______________________________.

b. Another formula for Cov(X,Y) which is often easier to use is

Cov (X,Y) = ______________________________.

c. Intuitively, Cov(X,Y) measures ___________________________.

d. If X and Y are independent, then Cov(X,Y) = __________.

e. With no assumptions about X and Y, Var (X + Y) = __________________________________.

6. For discrete random variables X and Y the conditional probability mass function of X given Y is _________________________. (Give your answer in two forms -- one involving the pmf and one not using the pmf.)

7. For continuous random variables X and Y the conditional probability density function of X given Y is _________________________.

8. The sum of n independent identically distributed Bernoulli random variables
with parameter p is a ___________________ random variable with parameters
____ and ______.

9. If X and Y are independent continuous random variables with pdf's f_{X}
and f_{Y} , then the pdf of their sum can be calculated by f_{X+Y}(u)
= ____________________________________.

10. If X and Y are independent discrete random variables with pmf's p_{X}
and p_{Y} , then the pmf of their sum can be calculated by
p_{X+Y}(u) = ____________________________________.
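If you want to check a convolution answer numerically, a short Python sketch like the following (not part of the course; the fair-dice example is my own illustration) computes the pmf of the sum of two independent fair dice by discrete convolution and verifies it against direct enumeration:

```python
from itertools import product

# pmf of a fair six-sided die (illustrative example, not from the handout)
p = {k: 1 / 6 for k in range(1, 7)}

# discrete convolution of the two pmf's: sum over k of p_X(k) * p_Y(u - k)
p_sum = {u: sum(p[k] * p.get(u - k, 0.0) for k in p) for u in range(2, 13)}

# cross-check against direct enumeration of the 36 equally likely outcomes
direct = {u: 0.0 for u in range(2, 13)}
for a, b in product(range(1, 7), repeat=2):
    direct[a + b] += 1 / 36

assert all(abs(p_sum[u] - direct[u]) < 1e-12 for u in p_sum)
print(p_sum[7])
```

The same two-line convolution check works for any pair of independent discrete random variables with finitely many values.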

11. If X and Y are independent normal random variables with means µ_{X}
and µ_{Y}, respectively, and standard deviations sigma_{X}
and sigma_{Y}, respectively, then their sum is _______________________
with parameters ______ and ____________.

12. If X and Y are independent random variables, then E(XY) = _____________.

13. If X_{1}, X_{2}, ... , X_{n} are independent,
identically distributed random variables with mean µ and standard
deviation sigma, and if n is large enough, then their sum X_{1}+
X_{2}+ ... + X_{n} is approximately _______________ with
mean ____________ and standard deviation _____________.
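A quick simulation (Python; the Uniform(0,1) summands and the sizes n = 50 and 20000 trials are my own illustrative choices) lets you compare your fill-in answers for question 13 against empirical values:

```python
import random
import statistics

random.seed(0)
n, trials = 50, 20000   # summands per trial, and number of trials (illustrative)

# each X_i is Uniform(0,1), so mu = 1/2 and sigma = 1/sqrt(12)
sums = [sum(random.random() for _ in range(n)) for _ in range(trials)]

print(statistics.mean(sums))   # compare with your answer for the mean
print(statistics.stdev(sums))  # compare with your answer for the standard deviation
```

A histogram of `sums` would also show the approximate shape predicted by the theorem.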

14. If you forget what the mean and variance of a binomial random variable
are, how can you combine some things we've studied recently to easily figure
out what they are?


REVIEW PROBLEMS ON MATERIAL SINCE THE LAST EXAM

I. From the textbook:

- p. 301, # 12
- p. 383, #33
- p. 392, #19
- p. 428, #13a and b, 14
- p. 431, #7, 8, 9

II. More:

1. X and Y are independent exponential random variables, each with the same parameter lambda. Let U be the minimum of X and Y.

a. What is the joint pdf of X and Y? Why?

b. Find the cdf of U. [Hint: U ≤ u means either X ≤ u or Y ≤ u or both.]

c. Find the pdf of U.
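After you have the cdf from part (b), a simulation like this (Python; lambda = 1.5 and the test point u = 0.4 are arbitrary values I chose for the check) gives an empirical value of F_U(u) to compare against:

```python
import random

random.seed(1)
lam, trials, u = 1.5, 100000, 0.4   # arbitrary lambda and test point

# simulate U = min(X, Y) for independent exponentials with the same parameter
mins = [min(random.expovariate(lam), random.expovariate(lam)) for _ in range(trials)]

# empirical estimate of P(U <= u)
empirical = sum(m <= u for m in mins) / trials
print(empirical)   # compare with F_U(0.4) from your part (b) answer
```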

2. In ethnic group A, 60% of voters prefer candidate C to candidate D. In ethnic group B, 40% of voters prefer C to D. You take a poll of 200 voters from group A and an independent poll of 300 voters from group B. Let P be the proportion of voters in your poll of group A who prefer C. Let Q be the proportion of voters in your poll of group B who prefer C. Then P and Q are random variables. What is the standard deviation of P - Q, the difference in the proportions in the two polls?
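A simulation sketch (Python; the helper `poll_proportion` is just for this check) estimates the standard deviation of P - Q, so you can verify your hand computation:

```python
import random
import statistics

random.seed(2)
trials = 5000
nA, pA = 200, 0.60   # poll size and preference rate, group A
nB, pB = 300, 0.40   # poll size and preference rate, group B

def poll_proportion(n, p):
    # proportion of n independent voters who prefer C
    return sum(random.random() < p for _ in range(n)) / n

diffs = [poll_proportion(nA, pA) - poll_proportion(nB, pB) for _ in range(trials)]
print(statistics.stdev(diffs))   # compare with your hand computation
```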

3. Define Covariance. Use your definition to prove that

Cov(aX +bY , Z) = aCov(X , Z) + bCov(Y , Z)
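The sample covariance satisfies the same identity, so a numerical sketch like this (Python; the constants a, b and the way Z depends on X and Y are arbitrary choices of mine) can sanity-check it before you write the proof:

```python
import random

random.seed(4)
n, a, b = 10000, 2.0, -3.0   # arbitrary sample size and constants

xs = [random.gauss(0, 1) for _ in range(n)]
ys = [random.gauss(0, 1) for _ in range(n)]
zs = [x + y + random.gauss(0, 1) for x, y in zip(xs, ys)]

def cov(u, v):
    # sample covariance: average of (u_i - mean_u)(v_i - mean_v)
    mu, mv = sum(u) / n, sum(v) / n
    return sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / n

lhs = cov([a * x + b * y for x, y in zip(xs, ys)], zs)
rhs = a * cov(xs, zs) + b * cov(ys, zs)
print(abs(lhs - rhs))   # tiny: the identity holds exactly, up to rounding
```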

4. Var(X) = 1, Var(Y) = 2, and Var(X + Y) = 3. Find Cov (X, Y).

5. Var(X) = 1, Var(Y) = 2, and Cov(X, Y) = 3. Find

a. Var(X + Y)

b. Var(X - Y)

6. True or false. If the statement is (always) true, prove it mathematically. If it is false, give a counterexample (that is, an example where the statement is false).

a. Var (X + Y) = Var(X) + Var(Y)

b. If X and Y are independent, then Cov(X, Y) = 0.

c. If Cov(X,Y) = 0, then X and Y are independent.

7. For each step in the proof of Proposition 3.1 (p. 328), give the reason why the step is valid. If necessary, insert any intermediate steps needed so that each step has a single reason.

8. X is a certain random variable and Y = X

a. Find the cumulative distribution function F_Y of Y.

b. Now suppose in addition that X is exponential with parameter 2. Find the probability density function f_Y of Y.

9. X is a random variable and Y = cX

a. Find a formula for the cdf F_Y of Y.

b. Find a formula for the pdf f_Y of Y.

c. Now suppose X is normal with mean µ and standard deviation sigma. Use part (b) to get a formula for the pdf of Y.

d. Use algebra to put your answer to part (c) in a form that shows that Y is also normal.
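After part (d), you can check the parameters you get for Y against a simulation (Python; mu = 2, sigma = 0.5, and c = 3 are arbitrary illustrative values):

```python
import random
import statistics

random.seed(3)
mu, sigma, c = 2.0, 0.5, 3.0   # arbitrary illustrative values

# simulate Y = cX with X normal(mu, sigma)
ys = [c * random.gauss(mu, sigma) for _ in range(100000)]
print(statistics.mean(ys), statistics.stdev(ys))  # compare with your part (d) parameters
```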