Ex. How many heads would you expect if you flipped a fair coin twice? X = number of heads, taking the values {0, 1, 2} with probabilities 1/4, 1/2, 1/4, so E(X) = 0(1/4) + 1(1/2) + 2(1/4) = 1.

Ex. Roll a fair die and let X = the number that comes up; then, for instance, P(X=3) = P({3}) = 1/6.

Ex. A lottery: you pay $1 to pick 3 numbers out of 12, winning $100 if all three match. Let X = your earnings. X = 100 - 1 = 99 with P(X=99) = 1/C(12,3) = 1/220, and X = -1 with P(X=-1) = 1 - 1/220 = 219/220. So E(X) = 99(1/220) - 1(219/220) = 100(1/220) - 1 ≈ -0.55: on average you lose about 55 cents.

Let X be a random variable assuming the values x1, x2, x3, ... with corresponding probabilities p(x1), p(x2), p(x3), .... For any function g, the mean or expected value of g(X) is defined by E(g(X)) = sum g(xk) p(xk).
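As a quick sketch of this definition in code, the lottery example's expectation can be computed directly from its probability mass function (the values and probabilities below are the ones from the example; `math.comb` supplies the binomial coefficient C(12,3)):

```python
from math import comb

# Lottery example: win $100 (net $99) with prob 1/C(12,3), else lose the $1.
p_win = 1 / comb(12, 3)               # 1/220
pmf = {99: p_win, -1: 1 - p_win}      # value -> probability

# E(X) = sum over values of x * p(x)
expected = sum(x * p for x, p in pmf.items())
print(expected)                        # about -0.545 dollars
```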
Ex. Roll a fair die. Let X = number of dots on the side that comes up. Calculate E(X^2). E(X^2) = sum_{i=1}^{6} i^2 p(i) = 1^2 p(1) + 2^2 p(2) + ... + 6^2 p(6). (Do at home.)
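The "do at home" computation can be checked with exact fractions; this is a minimal sketch assuming the fair die above, where p(i) = 1/6:

```python
from fractions import Fraction

# Fair die: p(i) = 1/6 for i = 1..6.  E(X^2) = sum of i^2 * p(i).
p = Fraction(1, 6)
e_x2 = sum(i**2 * p for i in range(1, 7))
print(e_x2)   # 91/6
```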
Ex. An indicator variable for the event A is defined as the random variable that takes on the value 1 if A occurs and 0 otherwise.

Ex. Consider a random variable X that takes on the value -2 with prob. 1/3, -1 with prob. 1/6, 1 with prob. 1/6, and 2 with prob. 1/3, so that E(X) = 0, and a second random variable Y with E(Y) = 0. Both X and Y have the same expected value, but are quite different in other respects. One such respect is in their spread. We would like a measure of spread.
Definition: If X is a random variable with mean E(X), then the variance of X, denoted by Var(X), is defined by Var(X) = E((X-E(X))^2). A small variance indicates that X is typically close to its mean.

Expanding the square, Var(X) = E(X^2 - 2X E(X) + (E(X))^2) = E(X^2) - (E(X))^2.

Later we'll see an even easier way to calculate these moments, by using the fact that a binomial X is the sum of n i.i.d. simpler (Bernoulli) random variables.
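Both forms of the variance can be evaluated on the four-point distribution from the spread example above, as a sketch that they agree:

```python
from fractions import Fraction as F

# Distribution from the spread example: -2, 2 w.p. 1/3; -1, 1 w.p. 1/6.
pmf = {-2: F(1, 3), -1: F(1, 6), 1: F(1, 6), 2: F(1, 3)}

mean = sum(x * p for x, p in pmf.items())                  # E(X) = 0
var_def = sum((x - mean)**2 * p for x, p in pmf.items())   # E((X-E(X))^2)
var_short = sum(x * x * p for x, p in pmf.items()) - mean**2  # E(X^2)-(E(X))^2
print(mean, var_def, var_short)   # prints: 0 3 3
```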
Proposition: If X and Y have a joint probability mass function pXY(x,y), then E(g(X,Y)) = sum_x sum_y g(x,y) pXY(x,y). If X and Y have a joint probability density function fXY(x,y), then E(g(X,Y)) = ∫∫ g(x,y) fXY(x,y) dx dy.

Expectation of sums of random variables:
Ex. Let X and Y be continuous random variables with joint pdf fXY(x,y). Assume that E(X) and E(Y) are finite. Calculate E(X+Y).
E(X+Y) = ∫∫ (x+y) fXY(x,y) dx dy = ∫ x fX(x) dx + ∫ y fY(y) dy = E(X) + E(Y).
The same result holds in the discrete case.

Proposition: In general, if E(Xi) is finite for all i = 1, ..., n, then E(X1 + ... + Xn) = E(X1) + ... + E(Xn). Proof: Use the example above and prove by induction.

Let X1, ..., Xn be independent and identically distributed random variables having distribution function FX and expected value µ. Such a sequence of random variables is said to constitute a sample from the distribution FX. The quantity X̄ = (1/n)(X1 + ... + Xn) is called the sample mean. Calculate E(X̄). We know that E(X̄) = (1/n) sum E(Xi) = µ.
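A small simulation sketches why the sample mean estimates µ; here the Xi are assumed to be fair die rolls, so µ = 3.5:

```python
import random

# Sample mean of n i.i.d. fair die rolls; its expectation is mu = 3.5.
random.seed(0)
n = 100_000
rolls = [random.randint(1, 6) for _ in range(n)]
sample_mean = sum(rolls) / n
print(sample_mean)   # close to 3.5
```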
When the mean of a distribution is unknown, the sample mean is often used in statistics to estimate it.

Ex. A group of N people throw their hats into the center of a room. The hats are mixed, and each person randomly selects one. Find the expected number of people that select their own hat. Let X = the number of people who select their own hat; writing X as a sum of indicator variables Xi (Xi = 1 if person i gets his own hat), E(Xi) = 1/N, so E(X) = N(1/N) = 1.
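The hat-matching answer E(X) = 1 (remarkably, independent of N) can be sketched by simulating random permutations and counting fixed points; N = 10 below is an arbitrary choice:

```python
import random

# Estimate E(number of people who get their own hat) for N hats.
random.seed(1)
N, trials = 10, 20_000
total = 0
for _ in range(trials):
    hats = list(range(N))
    random.shuffle(hats)                       # everyone grabs a random hat
    total += sum(i == h for i, h in enumerate(hats))   # count own-hat matches
print(total / trials)   # close to 1, regardless of N
```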
Fact: The moment generating function of the sum of independent random variables equals the product of the individual moment generating functions.

Definition: The covariance between X and Y, denoted
by Cov(X,Y), is defined by Cov(X,Y) = E((X - E(X))(Y - E(Y))). Similarly as with the variance, we can expand this to get Cov(X,Y) = E(XY) - E(X)E(Y).

Ex. Define X and Y so that P(X=0) = P(X=1) = P(X=-1) = 1/3 and Y = 1 if X = 0, Y = 0 otherwise. X and Y are clearly dependent. But XY = 0 always, so E(XY) = 0 = E(X)E(Y), and hence Cov(X,Y) = 0: zero covariance does not imply independence.

Properties of covariance:
(i) Cov(X,Y) = Cov(Y,X)
(ii) Cov(X,X) = Var(X)
(iii) Cov(aX,Y) = a Cov(X,Y)
(iv) Cov(sum_i Xi, sum_j Yj) = sum_i sum_j Cov(Xi,Yj)
Proof: (i)-(iii): Verify yourselves. (iv): Let mu_i = E(Xi) and nu_j = E(Yj). Then E(sum_i Xi) = sum_i mu_i and E(sum_j Yj) = sum_j nu_j, and expanding the product (sum_i (Xi - mu_i))(sum_j (Yj - nu_j)) gives the result.

Proposition: Var(sum_{i=1}^{n} Xi) = sum_i Var(Xi) + 2 sum_{i<j} Cov(Xi,Xj). In particular, for independent X and Y, V(X+Y) = V(X) + V(Y).
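The zero-covariance-but-dependent example can be checked directly; the code assumes the reconstruction Y = 1 exactly when X = 0, which makes XY identically zero:

```python
from fractions import Fraction as F

# X uniform on {-1, 0, 1}; Y = 1 exactly when X = 0 (so XY is always 0).
joint = {}                       # (x, y) -> probability
for x in (-1, 0, 1):
    y = 1 if x == 0 else 0
    joint[(x, y)] = F(1, 3)

e_x  = sum(x * p for (x, y), p in joint.items())       # 0
e_y  = sum(y * p for (x, y), p in joint.items())       # 1/3
e_xy = sum(x * y * p for (x, y), p in joint.items())   # 0
cov = e_xy - e_x * e_y
print(cov)   # 0, even though Y is a deterministic function of X
```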
Ex. (The hat problem, continued.) Number the people from 1 to N. Let Xi = 1 if person i selects his own hat and 0 otherwise; then X = X1 + ... + XN. We showed last time that E(X) = 1. Calculate V(X). Recall that since each person is equally likely to select any of the N hats, P(Xi = 1) = 1/N, so E(Xi^2) = E(Xi) = 1/N and Var(Xi) = 1/N - 1/N^2.

Proposition (the "Cauchy-Schwarz" inequality): (E(XY))^2 <= E(X^2) E(Y^2). Proof: It suffices to consider the case where E(X^2) and E(Y^2) are finite. The basic idea is to look at the expectations E[(aX+bY)^2] and E[
(aX-bY)^2], both of which are nonnegative. We use the usual rules for expectation to expand these squares and choose a and b suitably.

Definition: If X and Y are discrete random variables, the conditional expectation of X, given Y = y, is defined for all y such that P(Y=y) > 0 by E(X | Y = y) = sum_x x P(X = x | Y = y).
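The conditional expectation definition can be sketched on a small joint pmf; the table below is hypothetical, chosen only so the probabilities sum to 1:

```python
from fractions import Fraction as F

# A hypothetical joint pmf p(x, y) for x in {0, 1, 2}, y in {0, 1}.
joint = {(0, 0): F(1, 8), (1, 0): F(1, 4), (2, 0): F(1, 8),
         (0, 1): F(1, 4), (1, 1): F(1, 8), (2, 1): F(1, 8)}

def cond_expectation(y):
    # E(X | Y=y) = sum_x x * P(X=x | Y=y), defined when P(Y=y) > 0.
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)
    return sum(x * p / p_y for (x, yy), p in joint.items() if yy == y)

print(cond_expectation(0))   # -> 1 for this pmf
```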