Course name: Statistics and Econometrics with Lab (I)
Course type: Required
Instructor: 张胜凯
College: College of Social Sciences
Department: Department of Economics
Exam date (Y/M/D): 2014/10/22
Time limit (minutes): 180 minutes
Exam questions:
Econ 2014 Statistics and Econometrics I
First Midterm Exam (10/22/2014)
Problem 1. (15 points (5,5,5))
Are the following functions legitimate probability mass functions? Why or why not?
(a) f_X(x) = (x + 2)/10 , x = -1, 0, 1, 2
(b) f_X(x) = x/2 , x = -1, 0, 1, 2
(c) f_X(x) = 1/x , x = -1, 2, 3, ...
Sol: Recall Axioms of Probability:
(1) P(∅) = 0, P(S) = 1, where ∅ denotes the empty set.
(2) P(A) ≧ 0 for all A ⊆ S.
(3) For any mutually exclusive events A, B ⊆ S, P(A ∪ B) = P(A) + P(B).
(a)┌──┬──┐
│ X │P(X)│
├──┼──┤
│ -1 │1/10│
├──┼──┤
│ 0 │2/10│
├──┼──┤
│ 1 │3/10│
├──┼──┤
│ 2 │4/10│
└──┴──┘
(1) P(X = -1) + P(X = 0) + P(X = 1) + P(X = 2) = 1/10 + 2/10 + 3/10 + 4/10 = 1
(2) P(X = -1), P(X = 0), P(X = 1), P(X = 2) ≧ 0
=> f_X(x) satisfies the axioms of probability, so it is a legitimate pmf.
(b)┌──┬──┐
│ X │P(X)│
├──┼──┤
│ -1 │-1/2│
├──┼──┤
│ 0 │ 0 │
├──┼──┤
│ 1 │ 1/2│
├──┼──┤
│ 2 │ 1 │
└──┴──┘
(1) P(X = -1) + P(X = 0) + P(X = 1) + P(X = 2) = -1/2 + 0 + 1/2 + 1 = 1
(2) P(X = -1) = -1/2 < 0
=> f_X(x) violates the axioms of probability, so it is not a legitimate pmf.
(c)┌──┬──┐
│ X │P(X)│
├──┼──┤
│ -1 │ -1 │
├──┼──┤
│ 2 │ 1/2│
├──┼──┤
│ 3 │ 1/3│
├──┼──┤
│ n │ 1/n│
└──┴──┘
(1) P(X = -1) + P(X = 2) + P(X = 3) + ... = -1 + 1/2 + 1/3 + ... diverges, so the probabilities do not sum to 1.
(2) P(X = -1) = -1 < 0
=> f_X(x) violates the axioms of probability, so it is not a legitimate pmf.
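(As an optional numerical cross-check, here is a minimal Python sketch of the two pmf requirements used above; the helper name check_pmf is illustrative only and not part of the exam key.)

    from fractions import Fraction

    def check_pmf(probs):
        """A pmf must be non-negative everywhere and sum to exactly 1."""
        return all(p >= 0 for p in probs) and sum(probs) == 1

    # (a) f_X(x) = (x + 2)/10 on x = -1, 0, 1, 2  -> legitimate pmf
    print(check_pmf([Fraction(x + 2, 10) for x in (-1, 0, 1, 2)]))  # True
    # (b) f_X(x) = x/2 on x = -1, 0, 1, 2  -> contains a negative value
    print(check_pmf([Fraction(x, 2) for x in (-1, 0, 1, 2)]))       # False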
Problem 2. (20 points (5,5,4,3,3))
Given the joint pmf of the random variables X and Y in the table below:
┌───┬───┬───┐
│ │ x = 1│ x = 2│
├───┼───┼───┤
│ y = 0│ 0.1 │ 0.2 │
├───┼───┼───┤
│ y = 1│ 0.1 │ 0.1 │
├───┼───┼───┤
│ y = 2│ 0.3 │ 0.2 │
└───┴───┴───┘
(a) Find the conditional expectation function E[Y|X].
(b) Find the best linear prediction E*[Y|X].
(c) Are X and Y stochastically independent? Why or why not.
(d) Is Y mean independent of X? Why or why not.
(e) Are X and Y uncorrelated? Why or why not.
Sol: (a) P(Y = 0 | X = 1) = P(Y=0,X=1)/P(X=1) = 0.1/0.5 = 1/5
         P(Y = 1 | X = 1) = P(Y=1,X=1)/P(X=1) = 0.1/0.5 = 1/5
         P(Y = 2 | X = 1) = 3/5
         P(Y = 0 | X = 2) = 2/5
         P(Y = 1 | X = 2) = 1/5
         P(Y = 2 | X = 2) = 2/5
         E[Y|X = 1] = 0‧(1/5) + 1‧(1/5) + 2‧(3/5) = 7/5
         E[Y|X = 2] = 0‧(2/5) + 1‧(1/5) + 2‧(2/5) = 1
(b) E*[Y|X] = α + βx
E[X] = 1‧0.5 + 2‧0.5 = 1.5, Var(X) = 2.5 - (1.5)^2 = 0.25
         E[Y] = 0‧0.3 + 1‧0.2 + 2‧0.5 = 1.2
         Cov(X,Y) = E[XY] - E[X]E[Y]
                  = (1‧1‧0.1 + 1‧2‧0.3 + 2‧1‧0.1 + 2‧2‧0.2) - 1.5‧1.2
                  = 1.7 - 1.8 = -0.1
         β = Cov(X,Y)/Var(X) = -0.1/0.25 = -0.4 , α = E[Y] - βE[X] = 1.2 - (-0.4)‧1.5 = 1.8
E*[Y|X] = 1.8 - 0.4x
(c) P(X = 1, Y = 0) = 0.1 ≠ 0.15 = P(X = 1)‧P(Y = 0)
∴ X and Y are not stochastically independent.
(d) E[Y] = 1.2 ≠ E[Y|X = 1], E[Y] ≠ E[Y|X = 2]
∴ Y is not mean independent of X.
(e) Cov(X,Y) = -0.1 ≠ 0 ∴ X and Y are not uncorrelated.
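(Optional cross-check, a sketch rather than part of the key: the snippet below stores the joint table as a Python dictionary and recomputes the conditional expectations, the best linear predictor coefficients, and Cov(X,Y).)

    # Joint pmf of Problem 2: keys are (x, y) pairs, values are probabilities.
    joint = {(1, 0): 0.1, (1, 1): 0.1, (1, 2): 0.3,
             (2, 0): 0.2, (2, 1): 0.1, (2, 2): 0.2}

    def E(g):
        """Expectation of g(X, Y) under the joint pmf."""
        return sum(p * g(x, y) for (x, y), p in joint.items())

    EX, EY = E(lambda x, y: x), E(lambda x, y: y)          # 1.5, 1.2
    var_x = E(lambda x, y: x ** 2) - EX ** 2               # 0.25
    cov_xy = E(lambda x, y: x * y) - EX * EY               # -0.1

    for x0 in (1, 2):
        px = sum(p for (x, y), p in joint.items() if x == x0)
        ey = sum(p * y for (x, y), p in joint.items() if x == x0) / px
        print(f"E[Y|X={x0}] = {ey}")                       # 1.4 and 1.0 (up to rounding)

    beta = cov_xy / var_x                                  # -0.4
    alpha = EY - beta * EX                                 #  1.8
    print(alpha, beta)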
Problem 3. (15 points)
X and Y are two random variables which take only the values 1 and 2. From the
following joint probability table, calculate E[X], Var(X), Cov(X,Y), and E[X^3].
┌───┬───┬───┐
│ │ Y = 1│ Y = 2│
├───┼───┼───┤
│ X = 1│ 0.2 │ 0.2 │
├───┼───┼───┤
│ X = 2│ 0.1 │ 0.5 │
└───┴───┴───┘
Sol: E[X] = 1‧0.4 + 2‧0.6 = 1.6
Var(X) = E[X^2] - (E[X])^2 = 2.8 - (1.6)^2 = 0.24
     E[XY] = 1‧1‧0.2 + 1‧2‧0.2 + 2‧1‧0.1 + 2‧2‧0.5 = 2.8, E[Y] = 1‧0.3 + 2‧0.7 = 1.7
     Cov(X,Y) = E[XY] - E[X]E[Y] = 2.8 - 1.6‧1.7 = 0.08
E[X^3] = 1^3‧0.4 + 2^3‧0.6 = 5.2
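(Again an optional sketch, not part of the key: the same dictionary-based expectation check verifies the four requested moments.)

    # Joint pmf of Problem 3: keys are (x, y) pairs.
    joint = {(1, 1): 0.2, (1, 2): 0.2, (2, 1): 0.1, (2, 2): 0.5}
    E = lambda g: sum(p * g(x, y) for (x, y), p in joint.items())

    EX   = E(lambda x, y: x)                               # 1.6
    VarX = E(lambda x, y: x ** 2) - EX ** 2                # 0.24
    Cov  = E(lambda x, y: x * y) - EX * E(lambda x, y: y)  # 0.08 (up to rounding)
    EX3  = E(lambda x, y: x ** 3)                          # 5.2
    print(EX, VarX, Cov, EX3)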
Problem 4. (20 points (5,5,5,5))
The joint probability function of X and Y is as follows:
         P_XY(x,y) = (2x + y)/16 , if x = 1, 2 and y = 0, 2
                   = 0           , otherwise.
(a) Develop a joint probability table.
(b) Find E[X] and E[Y].
(c) Find Var(X) and Var(Y).
(d) Find Cov(X,Y).
Sol: (a)┌───┬───┬───┬───┐
        │      │ y = 0│ y = 2│ f(x) │
        ├───┼───┼───┼───┤
        │ x = 1│ 1/8  │ 2/8  │ 3/8  │
        ├───┼───┼───┼───┤
        │ x = 2│ 2/8  │ 3/8  │ 5/8  │
        ├───┼───┼───┼───┤
        │ f(y) │ 3/8  │ 5/8  │  1   │
        └───┴───┴───┴───┘
(It is fine if you did not include the marginal column f(x) and row f(y); the question only asks for the joint table.)
(b) E[X] = 3/8 + 10/8 = 13/8, E[Y] = 10/8 = 5/4
(c) E[X^2] = 3/8 + 20/8 = 23/8, E[Y^2] = 5/2,
Var(X) = 23/8 - (13/8)^2 = 15/64, Var(Y) = 5/2 - (5/4)^2 = 15/16
(d) E[XY] = 2˙(2/8) + 4˙(3/8) = 16/8 = 2,
Cov(X,Y) = E[XY] - E[X]E[Y] = -1/32
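(Optional sketch, not part of the key: with exact fractions, the table and all four answers can be regenerated directly from the formula for P_XY.)

    from fractions import Fraction

    # Joint pmf P_XY(x, y) = (2x + y)/16 on x in {1, 2}, y in {0, 2}.
    joint = {(x, y): Fraction(2 * x + y, 16) for x in (1, 2) for y in (0, 2)}
    E = lambda g: sum(p * g(x, y) for (x, y), p in joint.items())

    EX, EY = E(lambda x, y: x), E(lambda x, y: y)   # 13/8, 5/4
    VarX = E(lambda x, y: x ** 2) - EX ** 2         # 15/64
    VarY = E(lambda x, y: y ** 2) - EY ** 2         # 15/16
    Cov  = E(lambda x, y: x * y) - EX * EY          # -1/32
    print(EX, EY, VarX, VarY, Cov)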
Problem 5. (10 points)
Suppose Z = XY, where X and Y are independent random variables. Show that
Var(Z) = Var(X)Var(Y) + [(E[X])^2]Var(Y) + [(E[Y])^2]Var(X).
(Hint: if X and Y are independent, then E[XY] = E[X]E[Y] and
E[X^2 Y^2] = E[X^2]E[Y^2].)
Sol: Part A:
Var(Z) = Var(XY) = E[(XY)^2] - (E[XY])^2 = E[X^2 Y^2] -(E[X]E[Y])^2
= E[X^2]E[Y^2] - (E[X]E[Y])^2 (∵ X and Y are independent.)
Part B:
Var(X)Var(Y) + (E[X])^2˙Var(Y) + (E[Y])^2˙Var(X)
= {[E[X^2]-(E[X])^2][E[Y^2]-(E[Y])^2]}
+ [(E[X])^2][E[Y^2]-(E[Y])^2] + (E[Y])^2˙[E[X^2]-(E[X])^2]
= E[X^2]E[Y^2] - E[X^2](E[Y])^2 - (E[X])^2˙E[Y^2] + (E[X])^2˙(E[Y])^2
+ (E[X])^2˙E[Y^2] - (E[X])^2˙(E[Y])^2 + E[X^2](E[Y])^2
- (E[X])^2˙(E[Y])^2 = E[X^2]E[Y^2] - (E[X])^2˙(E[Y])^2
∵ Part A = Part B
∴ Var(Z) = Var(X)Var(Y) + [(E[X])^2]Var(Y) + [(E[Y])^2]Var(X)
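(The identity can also be illustrated numerically; the Monte Carlo sketch below is an addition, and the choice of Uniform(0,2) and Normal(1,3) for the independent X and Y is arbitrary.)

    import random

    random.seed(0)
    N = 200_000
    xs = [random.uniform(0, 2) for _ in range(N)]   # X ~ Uniform(0, 2)
    ys = [random.gauss(1, 3) for _ in range(N)]     # Y ~ Normal(1, 3)

    mean = lambda v: sum(v) / len(v)
    var  = lambda v: mean([t * t for t in v]) - mean(v) ** 2

    lhs = var([x * y for x, y in zip(xs, ys)])       # Var(XY) from the sample
    rhs = var(xs) * var(ys) + mean(xs) ** 2 * var(ys) + mean(ys) ** 2 * var(xs)
    print(lhs, rhs)   # the two values should agree closely for large N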
Problem 6. (20 points (10,10))
Let X_i ~ Bernoulli(p), i = 1, 2, 3, where the X_i's are i.i.d. random variables.
Moreover, let Y = (X_1 + 2X_2 + 3X_3)/6.
(a) Find E[X_1] and Var(X_2) in terms of p.
(b) Find E[Y] and Var(Y) in terms of p.
(Hint: if X ~ Bernoulli(p), then f_X(x) = (p^x)(1-p)^(1-x), x = 0, 1.)
Sol: (a) E[X_1] = p and Var(X_2) = p(1-p), since the X_i's are i.i.d. Bernoulli(p).
     (b) E[Y] = E[(X_1 + 2X_2 + 3X_3)/6] = (1/6)E[X_1 + 2X_2 + 3X_3]
              = (1/6)(E[X_1] + 2E[X_2] + 3E[X_3]) = (1/6)(p + 2p + 3p) = p
         Var(Y) = Var((X_1 + 2X_2 + 3X_3)/6) = (1/6)^2˙Var(X_1 + 2X_2 + 3X_3)
                = (1/36)[Var(X_1) + 2^2˙Var(X_2) + 3^2˙Var(X_3)]   (by independence)
                = (1/36)[p(1-p) + 4p(1-p) + 9p(1-p)] = (14/36)p(1-p) = (7/18)p(1-p)
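(A simulation sketch, added for illustration only; the value p = 0.3 is an arbitrary choice.)

    import random

    random.seed(1)
    p, N = 0.3, 500_000
    ys = []
    for _ in range(N):
        x1, x2, x3 = (1 if random.random() < p else 0 for _ in range(3))
        ys.append((x1 + 2 * x2 + 3 * x3) / 6)

    mean_y = sum(ys) / N
    var_y = sum(y * y for y in ys) / N - mean_y ** 2
    print(mean_y, p)                         # both close to 0.3
    print(var_y, (7 / 18) * p * (1 - p))     # both close to 0.0817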