1) The document proves that if random variables X and Y are jointly distributed and independent, then their joint distribution function factors into the product of the marginal distribution functions (F_{X,Y}(x,y) = F_X(x)F_Y(y)) and the expected value of their product equals the product of their expected values (E(XY) = E(X)E(Y)).
2) It also shows that if X and Y are independent, then their covariance equals 0 (Cov(X,Y) = 0).
3) However, the covariance can equal 0 even when X and Y are not independent, as shown with a counterexample.
1. Please show work, thank you. Suppose X and Y are jointly distributed and independent. Prove the following are true: F_{X,Y}(x,y) = F_X(x)F_Y(y); E(XY) = E(X)E(Y); f_{X,Y}(x,y) = f_X(x)f_Y(y); Cov(X,Y) = 0. Is it possible for Cov(X,Y) = 0 when X and Y are not independent? If so, give an example.
Solution
The factorization of the distribution function follows directly from the definition of independence: F_{X,Y}(x,y) = P(X <= x, Y <= y) = P(X <= x)P(Y <= y) = F_X(x)F_Y(y), and likewise for the joint pmf/pdf, f_{X,Y}(x,y) = f_X(x)f_Y(y).

To obtain E(XY) (illustrated here for a 2x2 joint probability table), in each cell of the table we multiply the joint probability by its corresponding X and Y values and sum:

E(XY) = x_1y_1 p(x_1,y_1) + x_1y_2 p(x_1,y_2) + x_2y_1 p(x_2,y_1) + x_2y_2 p(x_2,y_2).

At this point, the assumption of statistical independence of X and Y is used. If X and Y are statistically independent, then p(x_i,y_j) = P(x_i)P(y_j), hence

E(XY) = x_1y_1 P(x_1)P(y_1) + x_1y_2 P(x_1)P(y_2) + x_2y_1 P(x_2)P(y_1) + x_2y_2 P(x_2)P(y_2).

Factoring out x_1P(x_1), which is common to the first two terms, and x_2P(x_2), which is common to the third and fourth terms, we have

E(XY) = x_1P(x_1)[y_1P(y_1) + y_2P(y_2)] + x_2P(x_2)[y_1P(y_1) + y_2P(y_2)].

Factoring out the common term [y_1P(y_1) + y_2P(y_2)],

E(XY) = [x_1P(x_1) + x_2P(x_2)][y_1P(y_1) + y_2P(y_2)] = E(X)E(Y).

Since Cov(X,Y) = E(XY) - E(X)E(Y), independence immediately gives Cov(X,Y) = E(X)E(Y) - E(X)E(Y) = 0.

The converse fails: Cov(X,Y) = 0 is possible when X and Y are not independent. For example, let X take the values -1, 0, 1, each with probability 1/3, and let Y = X^2. Then E(X) = 0 and E(XY) = E(X^3) = 0, so Cov(X,Y) = 0; yet Y is completely determined by X, so X and Y are not independent.
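As a numeric sanity check of the derivation, here is a minimal Python sketch. The marginals (support points 0/1 and 2/5, probabilities 0.3/0.7 and 0.4/0.6) are made-up values for illustration; under independence the joint table is built as the product of the marginals, and E(XY) comes out equal to E(X)E(Y):

```python
from itertools import product

px = {0: 0.3, 1: 0.7}   # hypothetical marginal pmf of X
py = {2: 0.4, 5: 0.6}   # hypothetical marginal pmf of Y

# Under independence, each joint probability is the product of the marginals.
joint = {(x, y): px[x] * py[y] for x, y in product(px, py)}

e_x = sum(x * p for x, p in px.items())
e_y = sum(y * p for y, p in py.items())
e_xy = sum(x * y * p for (x, y), p in joint.items())

print(f"E(X)E(Y) = {e_x * e_y:.4f}")          # 0.7 * 3.8 = 2.66
print(f"E(XY)    = {e_xy:.4f}")               # matches: 2.66
print(f"Cov(X,Y) = {e_xy - e_x * e_y:.4f}")   # 0.0
```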
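A similar sketch checks the counterexample above (X uniform on {-1, 0, 1}, Y = X^2): the covariance is 0, yet the joint probabilities do not factor into the product of the marginals, so X and Y are not independent:

```python
px = {-1: 1/3, 0: 1/3, 1: 1/3}                  # pmf of X
joint = {(x, x * x): p for x, p in px.items()}  # Y = X^2 is determined by X

e_x = sum(x * p for x, p in px.items())
e_y = sum(y * p for (_, y), p in joint.items())
e_xy = sum(x * y * p for (x, y), p in joint.items())

print(f"Cov(X,Y) = {e_xy - e_x * e_y:.4f}")     # 0.0

# Yet X and Y are not independent: P(X=1, Y=0) = 0, while
# P(X=1) * P(Y=0) = (1/3) * (1/3) = 1/9.
p_x1_y0 = joint.get((1, 0), 0.0)
p_y0 = sum(p for (_, y), p in joint.items() if y == 0)
print(f"P(X=1,Y=0) = {p_x1_y0:.4f}  vs  P(X=1)P(Y=0) = {px[1] * p_y0:.4f}")
```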