MATHEMATICAL EXPECTATION

Presentation Transcript

CHAPTER 4: MATHEMATICAL EXPECTATION

4.2 Mathematical Expectation

Definition 4.1: If X is a random variable, then the expected value of X is defined as

    E(X) = Σ_x x f(x).

Note: the expected value of X = the mean of X = the first moment of X = μ.

Definition 4.2: If w is a function of X, and the probability function for X is f(x), then the expected value of w(X) is

    E[w(X)] = Σ_x w(x) f(x).

Definition 4.3: The variance of a random variable X is denoted by Var(X) and is defined by

    Var(X) = E[(X - μ)²] = E(X²) - μ².

Example 4.1: Suppose the probability distribution for X is given in the following table. Calculate a) E(X), b) E(X²), c) Var(X).

    x          -1      0      1
    P(X = x)   1/18   16/18   1/18

Solution:
a) E(X) = (-1)(1/18) + (0)(16/18) + (1)(1/18) = 0
b) E(X²) = (-1)²(1/18) + (0)²(16/18) + (1)²(1/18) = 2/18 = 1/9
c) Var(X) = E(X²) - [E(X)]² = 1/9 - 0 = 1/9
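
As a quick numeric check of Example 4.1, here is a minimal Python sketch using exact fractions; the values and probabilities are taken directly from the table above.

    from fractions import Fraction

    # Probability distribution from Example 4.1: x -> P(X = x)
    pmf = {-1: Fraction(1, 18), 0: Fraction(16, 18), 1: Fraction(1, 18)}

    # E(X) = sum of x * f(x) over all x
    ex = sum(x * p for x, p in pmf.items())

    # E(X^2) = sum of x^2 * f(x) over all x
    ex2 = sum(x**2 * p for x, p in pmf.items())

    # Var(X) = E(X^2) - [E(X)]^2
    var = ex2 - ex**2

    print(ex, ex2, var)   # 0, 1/9, 1/9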

4.3 Properties of Mathematical Expectation

1) If c is a constant, then E(c) = c. (Proof)

2) If c is a constant and u is a function of X, then E(c u(X)) = c E(u(X)). (Proof)

3) If c and d are constants and u and v are functions of X, then E(c u(X) + d v(X)) = c E(u(X)) + d E(v(X)). (Proof)

The third property can be extended as follows:

    E(c1 u1(X) + c2 u2(X) + ... + cn un(X)) = c1 E(u1(X)) + c2 E(u2(X)) + ... + cn E(un(X)).

Note: if the random variable is continuous, then replace the summation with an integral symbol.
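
Property 3 (linearity) can be illustrated numerically. The following Python sketch uses an assumed toy distribution and assumed functions u, v and constants c, d, none of which come from the slides.

    from fractions import Fraction

    # Assumed example distribution (illustration only): x -> f(x)
    pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

    def expect(g):
        """E[g(X)] = sum of g(x) * f(x) over all x."""
        return sum(g(x) * p for x, p in pmf.items())

    def u(x):            # assumed u(X) = X^2
        return x**2

    def v(x):            # assumed v(X) = X + 1
        return x + 1

    c, d = 3, -2
    lhs = expect(lambda x: c * u(x) + d * v(x))   # E(c*u(X) + d*v(X))
    rhs = c * expect(u) + d * expect(v)           # c*E(u(X)) + d*E(v(X))
    print(lhs, rhs, lhs == rhs)                   # the two sides agree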

Example 4.2: If X is a random variable with probability function f(x), determine a) E(X), b) E(X²), c) E(X(5 - X)), d) Var(X).

Solution: a)  b)  c)  d)

Quiz 4.1: Suppose the probability function for the random variable X is f(x). Find a) E(X), b) E(X - 2), c) , d) E[X(X + 4)], e) Var(X).

4.4 Properties of Variance

1) If c is a constant, then Var(c) = 0. (Proof)

2) If c is a constant and X is a random variable, then Var(cX) = c² Var(X). (Proof)

3) If c and d are constants, then Var(cX + d) = c² Var(X). (Proof)
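
A minimal numeric check of property 3, Var(cX + d) = c² Var(X), on an assumed toy distribution; the same identity is what part c) of Example 4.3 below relies on.

    from fractions import Fraction

    # Assumed example distribution (illustration only): x -> f(x)
    pmf = {-1: Fraction(1, 6), 0: Fraction(1, 2), 2: Fraction(1, 3)}

    def expect(g):
        """E[g(X)] = sum of g(x) * f(x)."""
        return sum(g(x) * p for x, p in pmf.items())

    def variance(g):
        """Var(g(X)) = E[g(X)^2] - (E[g(X)])^2."""
        return expect(lambda x: g(x) ** 2) - expect(g) ** 2

    c, d = 4, -2
    lhs = variance(lambda x: c * x + d)   # Var(4X - 2)
    rhs = c**2 * variance(lambda x: x)    # 16 * Var(X)
    print(lhs, rhs, lhs == rhs)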

Example 4.3: The probability density function for the random variable X is f(x). Calculate a) E(X), b) Var(X), c) Var(4X - 2).

Solution: a)  b)  c)

Quiz 4.2: The probability density function for the random variable W is f(w). Determine a) E(W), b) , c) Var(W), d) Var(W + 10), e) .

4.5 Moment and Variance for One Variable

Definition 4.4: If a function of X is x^k, where k is a positive integer, we obtain the k-th moment about the origin of X, defined as

    μ'_k = E(X^k) = Σ_x x^k f(x).

Note: the first moment, that is the moment for k = 1, is just the mean of X, and the relationship is as follows:

    μ'_1 = E(X) = μ.

The second moment is μ'_2 = E(X²); therefore the variance of X can be obtained from the following equation:

    Var(X) = μ'_2 - μ² = E(X²) - [E(X)]².

Definition 4.5: The k-th moment about the mean of a random variable X, denoted by μ_k, is

    μ_k = E[(X - μ)^k].

Definition 4.6: The variance of X is the second moment about the mean, that is

    Var(X) = μ_2 = E[(X - μ)²].
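
The following Python sketch, based on an assumed toy distribution, computes moments about the origin and about the mean and confirms that the second moment about the mean equals μ'_2 - μ².

    from fractions import Fraction

    # Assumed example distribution (illustration only): x -> f(x)
    pmf = {1: Fraction(1, 4), 2: Fraction(1, 2), 3: Fraction(1, 4)}

    def moment_origin(k):
        """k-th moment about the origin: E(X^k)."""
        return sum(x**k * p for x, p in pmf.items())

    def moment_mean(k):
        """k-th moment about the mean: E[(X - mu)^k]."""
        mu = moment_origin(1)
        return sum((x - mu) ** k * p for x, p in pmf.items())

    mu = moment_origin(1)                       # first moment = mean
    var_from_origin = moment_origin(2) - mu**2  # E(X^2) - mu^2
    var_from_mean = moment_mean(2)              # second moment about the mean
    print(mu, var_from_origin, var_from_mean)   # the two variance forms agree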

4.6 Chebyshev's Inequality

Theorem 4.1: The probability that any random variable X will assume a value within k standard deviations of the mean is at least 1 - 1/k², that is,

    P(μ - kσ < X < μ + kσ) ≥ 1 - 1/k².

Note: Chebyshev's inequality is usually used when the probability function is unknown or too complicated, and it is therefore difficult to find the exact probability value. The fraction of the area between any two values symmetric about the mean is related to the standard deviation. For example, for k = 2 the theorem states that the random variable X has a probability of at least 1 - 1/2² = 3/4 of falling within 2 standard deviations of the mean.
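
A small Python sketch comparing the Chebyshev lower bound 1 - 1/k² with the observed fraction within k standard deviations for a simulated, assumed data set (purely illustrative).

    import random
    import statistics

    random.seed(0)
    # Assumed sample (illustration only): draws from an arbitrary distribution.
    data = [random.gauss(0, 1) for _ in range(10_000)]

    mu = statistics.fmean(data)
    sigma = statistics.pstdev(data)

    for k in (1.5, 2, 3):
        bound = 1 - 1 / k**2                       # Chebyshev lower bound
        within = sum(abs(x - mu) < k * sigma for x in data) / len(data)
        print(f"k={k}: bound {bound:.3f} <= observed {within:.3f}")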

Example 4.4: A random variable X has a given mean and variance and an unknown probability distribution function. Find a)  b)

Solution: a)  b)

Quiz 4.3: A machine produces rulers with a mean length of 100 cm and a standard deviation of 0.5 cm. If rulers longer than 101 cm or shorter than 99 cm are considered defective, what percentage of defective rulers is produced by this machine?
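
As a sketch of how Theorem 4.1 is typically applied to a question like Quiz 4.3 (not an official solution): 99 cm and 101 cm each lie kσ = 1 cm from the mean, so k = 1/0.5 = 2 and the defective fraction is at most 1/k². In Python:

    # Sketch for Quiz 4.3 using Chebyshev's inequality (values from the quiz).
    mean, sd = 100.0, 0.5          # cm
    tolerance = 1.0                # |length - mean| >= 1 cm counts as defective

    k = tolerance / sd             # number of standard deviations: 2
    upper_bound = 1 / k**2         # P(|X - mean| >= k*sd) <= 1/k^2
    print(f"At most {upper_bound:.0%} of rulers are defective.")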

Quiz 4.4: If X is a random variable with a given mean and variance, find the lower bound for the required probability using Chebyshev's Inequality (Theorem 4.1).

4.7 Mathematical Expectation for Two Random Variables

Definition 4.6: If X and Y are discrete random variables with joint probability function f(x, y), then the marginal probability functions of X and Y are

    g(x) = Σ_y f(x, y)   and   h(y) = Σ_x f(x, y).

Given a joint probability distribution for discrete random variables X and Y, the expected values can be calculated using the following formulas:

    E(X) = Σ_x x g(x)   and   E(Y) = Σ_y y h(y),

where g(x) and h(y) are the marginal probability functions for the random variables X and Y.
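
A short Python sketch of these formulas, using an assumed toy joint table (not one from the slides): the marginals g and h are obtained by summing the joint probability function over the other variable, and E(X) and E(Y) follow.

    from fractions import Fraction
    from collections import defaultdict

    # Assumed joint probability function (illustration only): (x, y) -> f(x, y)
    joint = {
        (0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
        (1, 0): Fraction(1, 8), (1, 1): Fraction(3, 8),
    }

    # Marginals: g(x) = sum over y of f(x, y), h(y) = sum over x of f(x, y)
    g = defaultdict(Fraction)
    h = defaultdict(Fraction)
    for (x, y), p in joint.items():
        g[x] += p
        h[y] += p

    ex = sum(x * p for x, p in g.items())   # E(X) = sum of x * g(x)
    ey = sum(y * p for y, p in h.items())   # E(Y) = sum of y * h(y)
    print(dict(g), dict(h), ex, ey)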

4.8 Expected Value of a Function of Two Discrete Random Variables

Theorem 4.2: If X and Y are discrete random variables with joint probability function f(x, y), and W = u(X, Y) is a function of X and Y, then the expected value of W is

    E(W) = Σ_x Σ_y u(x, y) f(x, y).

Theorem 4.3: If X and Y are discrete random variables with joint probability function f(x, y), and if u(X, Y) = XY, then the expected value of u(X, Y) is

    E(XY) = Σ_x Σ_y x y f(x, y). (Proof)

Quiz 4.5: Prove Theorem 4.4.

Theorem 4.4: If X and Y are discrete random variables with joint probability function f(x, y), and if u(X, Y) = aX + bY, where a and b are constants, then the expected value of u(X, Y) is

    E(aX + bY) = a E(X) + b E(Y).

Example 4.5: Given the following joint distribution for X and Y, find a) E(X), b) E(Y), c) E(XY), d) E(X + Y), e) E(X²).

    f(x, y)     X = 1    X = 2
    Y = 1        1/8      3/8
    Y = 2        4/8       0

Solution:
a) E(X) can be computed directly from the joint table; the alternative way is to first find the marginal probability distribution for X. Note: both methods should give you the same answer.
b) As we did in part (a), first find the marginal probability distribution for Y.
c)  d)  e)
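
The expectations in Example 4.5 can be checked numerically. The following Python sketch takes the joint table exactly as given above and evaluates each expectation as a double sum, in the spirit of Theorem 4.2.

    from fractions import Fraction

    # Joint probability function from Example 4.5: (x, y) -> f(x, y)
    joint = {
        (1, 1): Fraction(1, 8), (2, 1): Fraction(3, 8),
        (1, 2): Fraction(4, 8), (2, 2): Fraction(0, 8),
    }

    def expect(u):
        """E[u(X, Y)] = double sum of u(x, y) * f(x, y) (Theorem 4.2)."""
        return sum(u(x, y) * p for (x, y), p in joint.items())

    ex = expect(lambda x, y: x)           # a) E(X)
    ey = expect(lambda x, y: y)           # b) E(Y)
    exy = expect(lambda x, y: x * y)      # c) E(XY)
    ex_plus_y = expect(lambda x, y: x + y)  # d) E(X + Y)
    ex2 = expect(lambda x, y: x**2)       # e) E(X^2)

    print(ex, ey, exy, ex_plus_y, ex2)
    print(ex_plus_y == ex + ey)           # consistent with Theorem 4.4 (a = b = 1)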

Quiz 4.6: The joint probability function for X and Y is given in the following table. Find a) E(X), b) E(Y), c) E(XY), d) E(X + Y), e) E(X²).

    f(x, y)     X = 0    X = 1    X = 2
    Y = 0        1/35     6/35     3/35
    Y = 1        6/35    12/35     2/