M362K, Smith, Fall 03

Information for Final Exam

Date: Saturday, December 13, 9 a.m. - 12 noon

Place: CPE 2.210 (Not the usual classroom, and not the same room as the previous exams -- it is across Dean Keeton.)

Exam Week Office Hours:
Exam will cover:

Exam format:

Normal tables: You will be provided with a copy of the table on p. 203 giving probabilities for the normal distribution.

Suggestions for studying:

1. Carefully study the midsemester exams and returned homework to understand things you may have done wrong, including inadequate explanation, improper use of notation, etc. Consult the solutions on reserve in the PMA library as needed.

2. Here are some suggested practice questions and problems.

Please Note:

I. REVIEW QUESTIONS ON MATERIAL SINCE THE LAST EXAM


1. E(aX + bY) = ___________________

2. If X and Y are random variables with joint pdf fX,Y(x,y),  then  for a function g(x,y),
E(g(X,Y)) =_________________.

3. If X and Y are random variables with joint pmf  pX,Y(x,y),  then  for a function g(x,y),
E(g(X,Y)) =_________________. 
 
4. Var (X + Y) = Var (X) + Var(Y)  provided ______________________________.

5. If X and Y are random variables, then:

 a. The definition of the covariance of X and Y is

   Cov(X,Y)  = ______________________________.

 b. Another formula for Cov(X,Y) which is often easier to use is

   Cov (X,Y)  = ______________________________.

 c. Intuitively, Cov(X,Y) measures ___________________________.

 d. If X and Y are independent, then Cov(X,Y) = __________.

 e. With  no assumptions about X and Y, Var (X + Y) = __________________________________.

6. For discrete random variables X and Y the conditional probability mass function of X given Y is _________________________. (Give your answer in two forms -- one involving the pmf and one not using the pmf.)

7. For continuous random variables X and Y the conditional probability density function of X given Y is _________________________.  

8. The sum of n independent identically distributed Bernoulli random variables with parameter p is a ___________________ random variable with parameters ____ and ______.

9. If X and Y are independent continuous random variables with pdf's fX and fY , then the pdf of their sum can be calculated by  fX+Y(u) = ____________________________________.

10. If X and Y are independent discrete random variables with pmf's pX and pY , then the pmf of their sum can be calculated by   pX+Y(u) = ____________________________________.

11. If X and Y are independent normal random variables with means µX and µY, respectively, and standard deviations sigmaX and sigmaY, respectively, then their sum is _______________________ with parameters ______ and ____________.

12. If X and Y are independent random variables, then E(XY) = _____________.

13. If X1, X2, ... , Xn are independent, identically distributed random variables with mean µ and standard deviation sigma, and if n is large enough, then their sum  X1+ X2+ ... + Xn is approximately _______________ with mean ____________ and standard deviation _____________.

14. If you forget what the mean and variance of a binomial random variable are, how can you combine some things we've studied recently to easily figure out what they are?
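
A rough numerical check of your answers to questions 8, 13, and 14 is to simulate. Here is a minimal Python/numpy sketch; the choices of n, p, and the number of trials are arbitrary.

    import numpy as np

    rng = np.random.default_rng(0)
    n, p, trials = 50, 0.3, 100_000

    # Each row is one experiment consisting of n independent Bernoulli(p) draws.
    bernoullis = rng.random((trials, n)) < p
    sums = bernoullis.sum(axis=1)            # one sum per experiment

    print("empirical mean of the sum:", sums.mean())
    print("empirical sd of the sum:  ", sums.std())
    # Compare these with the mean and standard deviation you filled in above,
    # and compare a histogram of the sums with the shape you expect.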


II. REVIEW PROBLEMS ON MATERIAL SINCE THE LAST EXAM

I. From the textbook:

II. More:

1. X and Y are independent exponential random variables, each with the same parameter lambda. Let U be the minimum of X and Y.
    a. What is the joint pdf of X and Y? Why?
    b. Find the cdf of U. [Hint: U ≤ u means either X ≤ u or Y ≤ u or both.]
    c. Find the pdf of U.
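
A rough numerical check for problem 1: simulate U = min(X, Y) and estimate its cdf at a few points. Below is a minimal Python/numpy sketch; the value of lambda is an arbitrary choice.

    import numpy as np

    rng = np.random.default_rng(0)
    lam, trials = 1.5, 200_000                      # lambda = 1.5 chosen arbitrarily

    x = rng.exponential(scale=1/lam, size=trials)   # numpy's scale parameter is 1/lambda
    y = rng.exponential(scale=1/lam, size=trials)
    u = np.minimum(x, y)

    for t in [0.25, 0.5, 1.0, 2.0]:
        print(f"estimated P(U <= {t}): {(u <= t).mean():.4f}")
    # Compare these estimates with the cdf you derive in part (b).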

2. In ethnic group A, 60% of voters prefer candidate C to candidate D. In ethnic group B, 40% of voters prefer C to D. You take a poll of 200 voters from group A and an independent poll of 300 voters from group B. Let P be the proportion of voters in your poll of group A who prefer C. Let Q be the proportion of voters in your poll of group B who prefer C. Then P and Q are random variables. What is the standard deviation of P - Q, the difference in the proportions in the two polls?
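
A rough numerical check for problem 2: simulate the two polls many times and look at the spread of P - Q. A minimal Python/numpy sketch:

    import numpy as np

    rng = np.random.default_rng(0)
    trials = 200_000

    p_hat = rng.binomial(200, 0.6, size=trials) / 200   # poll of group A
    q_hat = rng.binomial(300, 0.4, size=trials) / 300   # poll of group B

    print("empirical sd of P - Q:", (p_hat - q_hat).std())
    # Compare this with the standard deviation you compute by hand.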

3. Define Covariance. Use your definition to prove that

    Cov(aX + bY, Z) = a Cov(X, Z) + b Cov(Y, Z)

4. Var(X) = 1, Var(Y) = 2, and Var(X + Y) = 3. Find Cov (X, Y).

5. Var(X) = 1, Var(Y) = 2, and Cov(X, Y) = 3. Find
    a. Var(X + Y)
    b. Var(X - Y)

6. True or false. If the statement is (always) true, prove it mathematically. If it is false, give a counterexample (that is, an example where the statement is false).
    a. Var (X + Y) = Var(X) + Var(Y)
    b. If X and Y are independent, then Cov(X, Y) = 0.
    c. If Cov(X,Y) = 0, then X and Y are independent.
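
For problems 3-6, a small Python/numpy helper for estimating variances and covariances from simulated samples can be handy for testing a claim or a candidate counterexample numerically before trying to prove or disprove it. The pair of random variables below is only a placeholder; substitute your own.

    import numpy as np

    def summarize(x, y):
        # Print sample variances and covariance for paired samples x and y.
        print("Var(X)   is about", x.var())
        print("Var(Y)   is about", y.var())
        print("Cov(X,Y) is about", np.cov(x, y)[0, 1])
        print("Var(X+Y) is about", (x + y).var())

    rng = np.random.default_rng(0)
    x = rng.normal(size=100_000)    # placeholder: substitute your own X
    y = rng.normal(size=100_000)    # placeholder: substitute your own Y
    summarize(x, y)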

7. For each step in the proof of Proposition 3.1 (p. 328), give the reason why the step is valid. If necessary, insert any intermediate steps needed so that each step has a single reason.

8. X is a certain random variable and Y = X^3.
    a. Find the cumulative distribution function FY(y) of Y in terms of the cumulative distribution function FX(x) of X. (Be sure to give its value for all possible values of y.)
    b. Now suppose in addition that X is exponential with parameter 2. Find the probability density function fY(y) of Y. (Be sure to give its value for all possible values of y.)
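
A rough numerical check for problem 8(b): simulate X, form Y = X^3, and estimate P(Y <= t) at a few points to compare with your formula for the cdf. A minimal Python/numpy sketch, reading "parameter 2" as rate lambda = 2:

    import numpy as np

    rng = np.random.default_rng(0)
    # Rate lambda = 2 corresponds to numpy's scale = 1/2.
    x = rng.exponential(scale=0.5, size=200_000)
    y = x ** 3

    for t in [0.01, 0.1, 0.5, 1.0, 3.0]:
        print(f"estimated P(Y <= {t}): {(y <= t).mean():.4f}")
    # Compare with the cdf F_Y from part (a); differentiating F_Y gives the pdf
    # asked for in part (b).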

9. X is a random variable and Y = cX, where c is a constant.
    a. Find a formula for the cdf FY(y) of Y in terms of the cdf FX(x) of X.
    b. Find a formula for the pdf fY(y) of Y in terms of the pdf fX(x) of X. [Use part (a)]
    c. Now suppose X is normal with mean µ and standard deviation sigma. Use part (b) to get a formula for the pdf of Y.
    d. Use algebra to put your answer to part (c) in a form that shows that Y is also normal.
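
A rough numerical check for problem 9: simulate Y = cX for a normal X and compare the empirical mean and standard deviation of Y with your formulas from parts (c) and (d). A minimal Python/numpy sketch with arbitrary choices of mu, sigma, and c:

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma, c = 1.0, 2.0, -3.0     # arbitrary illustrative values

    x = rng.normal(loc=mu, scale=sigma, size=200_000)
    y = c * x

    print("empirical mean of Y:", y.mean())
    print("empirical sd of Y:  ", y.std())
    # A histogram of y should show the familiar bell shape, consistent with part (d).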