10. Joint Moments and Joint Characteristic Functions

Following section 6, in this section we shall introduce various parameters to compactly represent the information contained in the joint p.d.f of two r.vs. Given two r.vs $X$ and $Y$ and a function $g(x,y)$, define the r.v

$$Z = g(X, Y). \tag{10-1}$$

Using (6-2), we can define the mean of $Z$ to be

$$\mu_Z = E(Z) = \int_{-\infty}^{\infty} z\, f_Z(z)\, dz. \tag{10-2}$$

However, the situation here is similar to that in (6-13), and it is possible to express the mean of $Z = g(X,Y)$ in terms of $f_{XY}(x,y)$ without computing $f_Z(z)$.

To see this, recall from (5-26) and (7-10) that

$$P(z < Z \le z + \Delta z) = f_Z(z)\,\Delta z = P\big(z < g(X,Y) \le z + \Delta z\big) = \iint_{(x,y)\in D_{\Delta z}} f_{XY}(x,y)\,dx\,dy, \tag{10-3}$$

where $D_{\Delta z}$ is the region in the $xy$ plane satisfying the above inequality. From (10-3), we get

$$z\, f_Z(z)\,\Delta z = \iint_{(x,y)\in D_{\Delta z}} g(x,y)\, f_{XY}(x,y)\,dx\,dy. \tag{10-4}$$

As $\Delta z$ covers the entire $z$ axis, the corresponding regions $D_{\Delta z}$ are nonoverlapping, and they cover the entire $xy$ plane.

By integrating (10-4), we obtain the useful formula

$$E(Z) = \int_{-\infty}^{\infty} z\, f_Z(z)\,dz = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\, f_{XY}(x,y)\,dx\,dy, \tag{10-5}$$

or

$$E[g(X,Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\, f_{XY}(x,y)\,dx\,dy. \tag{10-6}$$

If $X$ and $Y$ are discrete-type r.vs, then

$$E[g(X,Y)] = \sum_i \sum_j g(x_i, y_j)\, P(X = x_i,\, Y = y_j). \tag{10-7}$$

Since expectation is a linear operator, we also get

$$E\Big[\sum_k a_k\, g_k(X,Y)\Big] = \sum_k a_k\, E[g_k(X,Y)]. \tag{10-8}$$

If $X$ and $Y$ are independent r.vs, it is easy to see that $Z = g(X)$ and $W = h(Y)$ are always independent of each other. In that case, using (10-7), we get the interesting result

$$E[g(X)h(Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x)h(y)\, f_X(x) f_Y(y)\,dx\,dy = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx \int_{-\infty}^{\infty} h(y) f_Y(y)\,dy = E[g(X)]\, E[h(Y)]. \tag{10-9}$$

However, (10-9) is in general not true if $X$ and $Y$ are not independent.
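As a quick numerical illustration of (10-6) and (10-9), the sketch below estimates $E[g(X)h(Y)]$ two ways for independent uniform r.vs; the choices $g(x) = x^2$ and $h(y) = \cos y$ are illustrative assumptions, not taken from the text.

```python
import numpy as np

# Monte Carlo check of (10-6) and (10-9) for independent X, Y ~ U(0,1).
# g(x) = x**2 and h(y) = cos(y) are arbitrary illustrative choices.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(0, 1, n)
y = rng.uniform(0, 1, n)

lhs = np.mean(x**2 * np.cos(y))           # E[g(X)h(Y)] estimated via (10-6)
rhs = np.mean(x**2) * np.mean(np.cos(y))  # E[g(X)] E[h(Y)] as in (10-9)
print(lhs, rhs)  # both ≈ (1/3) sin(1) ≈ 0.2805, since X and Y are independent
```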

In the case of one random variable (see (10-6)), we defined the parameters mean and variance to represent its average behavior. How does one parametrically represent similar cross-behavior between two random variables? Towards this, we can generalize the variance definition given in (6-16) as shown below.

Covariance: Given any two r.vs $X$ and $Y$, define

$$\mathrm{Cov}(X,Y) = E\big[(X - \mu_X)(Y - \mu_Y)\big]. \tag{10-10}$$

By expanding and simplifying the right side of (10-10), we also get

$$\mathrm{Cov}(X,Y) = E(XY) - \mu_X \mu_Y = E(XY) - E(X)E(Y). \tag{10-11}$$

It is easy to see that

$$\big|\mathrm{Cov}(X,Y)\big| \le \sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}. \tag{10-12}$$

To see (10-12), let $U = aX + Y$, so that

$$\mathrm{Var}(U) = E\big[\big(a(X - \mu_X) + (Y - \mu_Y)\big)^2\big] = a^2\,\mathrm{Var}(X) + 2a\,\mathrm{Cov}(X,Y) + \mathrm{Var}(Y) \ge 0. \tag{10-13}$$

The right side of (10-13) represents a quadratic in the variable $a$ that has no distinct real roots (Fig. 10.1). Thus the roots are imaginary (or double), and hence the discriminant

$$\big[\mathrm{Cov}(X,Y)\big]^2 - \mathrm{Var}(X)\,\mathrm{Var}(Y)$$

must be non-positive, and that gives (10-12).

[Fig. 10.1: $\mathrm{Var}(U)$ as a function of $a$ — a parabola lying on or above the $a$ axis.]

Using (10-12), we may define the normalized parameter

$$\rho_{XY} = \frac{\mathrm{Cov}(X,Y)}{\sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}} = \frac{\mathrm{Cov}(X,Y)}{\sigma_X \sigma_Y}, \qquad -1 \le \rho_{XY} \le 1, \tag{10-14}$$

or

$$\mathrm{Cov}(X,Y) = \rho_{XY}\,\sigma_X \sigma_Y, \tag{10-15}$$

and it represents the correlation coefficient between $X$ and $Y$.

Uncorrelated r.vs: If $\rho_{XY} = 0$, then $X$ and $Y$ are said to be uncorrelated r.vs. From (10-11), if $X$ and $Y$ are uncorrelated, then

$$E(XY) = E(X)E(Y). \tag{10-16}$$

Orthogonality: $X$ and $Y$ are said to be orthogonal if

$$E(XY) = 0. \tag{10-17}$$

From (10-16)-(10-17), if either $X$ or $Y$ has zero mean, then orthogonality implies uncorrelatedness and vice-versa.

Suppose $X$ and $Y$ are independent r.vs. Then from (10-9) with $g(X) = X$ and $h(Y) = Y$, we get

$$E(XY) = E(X)E(Y),$$

and together with (10-16), we conclude that the random variables are uncorrelated, thus justifying the original definition in (10-10). Thus independence implies uncorrelatedness.
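A minimal sampling sketch of (10-11), (10-12) and (10-14): estimating the covariance and correlation coefficient from data; the linear model $Y = 2X + \text{noise}$ is an illustrative assumption.

```python
import numpy as np

# Estimating Cov(X,Y) via (10-11) and rho_XY via (10-14) from samples.
rng = np.random.default_rng(1)
n = 100_000
x = rng.standard_normal(n)
y = 2 * x + rng.standard_normal(n)   # illustrative dependent pair

cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)   # (10-11)
rho_xy = cov_xy / (np.std(x) * np.std(y))           # (10-14)
print(cov_xy, rho_xy)  # ≈ 2 and ≈ 2/sqrt(5) ≈ 0.894; |rho_XY| <= 1 per (10-12)
```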

Naturally, if two random variables are statistically independent, then there cannot be any correlation between them ($\rho_{XY} = 0$). However, the converse is in general not true. As the next example shows, random variables can be uncorrelated without being independent.

Example 10.1: Let $X \sim U(0,1)$ and $Y \sim U(0,1)$. Suppose $X$ and $Y$ are independent. Define $Z = X + Y$, $W = X - Y$. Show that $Z$ and $W$ are dependent, but uncorrelated r.vs.

Solution: $z = x + y$, $w = x - y$ gives the only solution set to be

$$x = \frac{z + w}{2}, \qquad y = \frac{z - w}{2}.$$

Moreover $0 \le z \le 2$, $-1 \le w \le 1$, $z + w \le 2$, $z - w \le 2$, $|w| \le z$, and $|J(z,w)| = 1/2$. Thus (see the shaded region in Fig. 10.2)

$$f_{ZW}(z,w) = \begin{cases} 1/2, & 0 \le z \le 2,\ -1 \le w \le 1,\ z + w \le 2,\ z - w \le 2,\ |w| \le z,\\ 0, & \text{otherwise}, \end{cases} \tag{10-18}$$

[Fig. 10.2: the support of $f_{ZW}(z,w)$ in the $(z,w)$ plane — the square with vertices $(0,0)$, $(1,1)$, $(2,0)$ and $(1,-1)$.]

and hence

$$f_Z(z) = \int_{-\infty}^{\infty} f_{ZW}(z,w)\,dw = \begin{cases} \displaystyle\int_{-z}^{z} \tfrac{1}{2}\,dw = z, & 0 \le z \le 1,\\[4pt] \displaystyle\int_{z-2}^{2-z} \tfrac{1}{2}\,dw = 2 - z, & 1 \le z \le 2, \end{cases} \tag{10-19}$$

or by direct computation ($Z = X + Y$),

$$f_Z(z) = f_X(z) * f_Y(z) = \begin{cases} z, & 0 \le z \le 1,\\ 2 - z, & 1 \le z \le 2,\\ 0, & \text{otherwise}. \end{cases} \tag{10-20}$$

Similarly,

$$f_W(w) = \int_{-\infty}^{\infty} f_{ZW}(z,w)\,dz = \int_{|w|}^{2 - |w|} \tfrac{1}{2}\,dz = \begin{cases} 1 - |w|, & -1 \le w \le 1,\\ 0, & \text{otherwise}. \end{cases}$$

Clearly

$$f_{ZW}(z,w) \ne f_Z(z)\, f_W(w). \tag{10-21}$$

Thus $Z$ and $W$ are not independent. However,

$$E(ZW) = E\big[(X+Y)(X-Y)\big] = E(X^2) - E(Y^2) = 0$$

and

$$E(W) = E(X) - E(Y) = 0,$$

and hence

$$\mathrm{Cov}(Z,W) = E(ZW) - E(Z)E(W) = 0, \tag{10-22}$$

implying that $Z$ and $W$ are uncorrelated random variables.
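The conclusion of Example 10.1 is also easy to see numerically. The sketch below checks that $\mathrm{Cov}(Z,W) \approx 0$ while a higher-order cross-moment betrays the dependence; using $\mathrm{Cov}(Z^2, W^2)$ as the witness of dependence is our own illustrative choice (its exact value here works out to $-1/60$).

```python
import numpy as np

# Example 10.1 numerically: Z = X + Y, W = X - Y for independent U(0,1) samples.
rng = np.random.default_rng(2)
n = 1_000_000
x = rng.uniform(0, 1, n)
y = rng.uniform(0, 1, n)
z, w = x + y, x - y

print(np.mean(z * w) - np.mean(z) * np.mean(w))   # ≈ 0: Cov(Z,W), cf. (10-22)
# Dependence still shows in higher moments: Cov(Z^2, W^2) = -1/60 ≈ -0.0167
print(np.mean(z**2 * w**2) - np.mean(z**2) * np.mean(w**2))
```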

Example 10.2: Let $Z = aX + bY$. Determine the variance of $Z$ in terms of $\sigma_X^2$, $\sigma_Y^2$ and $\rho_{XY}$.

Solution: $\mu_Z = E(Z) = E(aX + bY) = a\mu_X + b\mu_Y$, and using (10-15),

$$\mathrm{Var}(Z) = \sigma_Z^2 = E\big[(Z - \mu_Z)^2\big] = E\big[\big(a(X - \mu_X) + b(Y - \mu_Y)\big)^2\big] = a^2\sigma_X^2 + 2ab\,\rho_{XY}\,\sigma_X\sigma_Y + b^2\sigma_Y^2. \tag{10-23}$$

In particular if $X$ and $Y$ are independent, then $\rho_{XY} = 0$, and (10-23) reduces to

$$\sigma_Z^2 = a^2\sigma_X^2 + b^2\sigma_Y^2. \tag{10-24}$$

Thus the variance of the sum of independent r.vs ($a = b = 1$) is the sum of their variances.
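A sanity check of (10-23) on simulated data; the coefficients $a = 2$, $b = 3$ and the Gaussian model with $\sigma_X = 1$, $\sigma_Y = 2$, $\rho_{XY} = 0.5$ are illustrative assumptions.

```python
import numpy as np

# Verify Var(aX + bY) against (10-23) for correlated Gaussian samples.
rng = np.random.default_rng(3)
a, b = 2.0, 3.0
sx, sy, rho = 1.0, 2.0, 0.5
cov = [[sx**2, rho * sx * sy], [rho * sx * sy, sy**2]]
xy = rng.multivariate_normal([0.0, 0.0], cov, size=500_000)
z = a * xy[:, 0] + b * xy[:, 1]

predicted = a**2 * sx**2 + 2 * a * b * rho * sx * sy + b**2 * sy**2  # (10-23)
print(np.var(z), predicted)  # both ≈ 4 + 12 + 36 = 52
```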

Moments:

$$E[X^k Y^m] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x^k y^m\, f_{XY}(x,y)\,dx\,dy \tag{10-25}$$

represents the joint moment of order $(k,m)$ for $X$ and $Y$. Following the one random variable case, we can define the joint characteristic function between two random variables, which will turn out to be useful for moment calculations.

Joint characteristic functions: The joint characteristic function between $X$ and $Y$ is defined as

$$\Phi_{XY}(u,v) = E\big(e^{j(Xu + Yv)}\big) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{j(xu + yv)}\, f_{XY}(x,y)\,dx\,dy. \tag{10-26}$$

Note that $|\Phi_{XY}(u,v)| \le \Phi_{XY}(0,0) = 1$.

It is easy to show that

$$E(XY) = -\left.\frac{\partial^2 \Phi_{XY}(u,v)}{\partial u\,\partial v}\right|_{u=0,\,v=0}. \tag{10-27}$$

If $X$ and $Y$ are independent r.vs, then from (10-26), we obtain

$$\Phi_{XY}(u,v) = E\big(e^{juX}\big)\,E\big(e^{jvY}\big) = \Phi_X(u)\,\Phi_Y(v). \tag{10-28}$$

Also

$$\Phi_X(u) = \Phi_{XY}(u, 0), \qquad \Phi_Y(v) = \Phi_{XY}(0, v). \tag{10-29}$$

More on Gaussian r.vs: From Lecture 7, $X$ and $Y$ are said to be jointly Gaussian as $N(\mu_X, \mu_Y, \sigma_X^2, \sigma_Y^2, \rho)$ if their joint p.d.f has the form in (7-23). In that case, by direct substitution and simplification, we obtain the joint characteristic function of two jointly Gaussian r.vs to be

$$\Phi_{XY}(u,v) = E\big(e^{j(Xu + Yv)}\big) = e^{\,j(\mu_X u + \mu_Y v)\, -\, \frac{1}{2}\left(\sigma_X^2 u^2 + 2\rho\,\sigma_X\sigma_Y\, uv + \sigma_Y^2 v^2\right)}. \tag{10-30}$$

Equation (10-30) can be used to make various conclusions.
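One such conclusion can be checked symbolically: applying the moment formula (10-27) to (10-30) should recover $E(XY) = \mu_X\mu_Y + \rho\,\sigma_X\sigma_Y$, consistent with (10-11) and $\mathrm{Cov}(X,Y) = \rho\,\sigma_X\sigma_Y$. A sketch using sympy (our tooling choice, not the text's):

```python
import sympy as sp

# Apply (10-27) to the jointly Gaussian characteristic function (10-30).
u, v = sp.symbols('u v', real=True)
mx, my, rho = sp.symbols('mu_x mu_y rho', real=True)
sx, sy = sp.symbols('sigma_x sigma_y', positive=True)

Phi = sp.exp(sp.I*(mx*u + my*v)
             - (sx**2*u**2 + 2*rho*sx*sy*u*v + sy**2*v**2)/2)
EXY = -sp.diff(Phi, u, v).subs({u: 0, v: 0})   # (10-27)
print(sp.simplify(EXY))  # mu_x*mu_y + rho*sigma_x*sigma_y
```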

Letting $v = 0$ in (10-30), we get

$$\Phi_X(u) = \Phi_{XY}(u, 0) = e^{\,j\mu_X u - \frac{1}{2}\sigma_X^2 u^2}, \tag{10-31}$$

and it agrees with (6-47).

From (7-23), by direct computation using (10-11), it is easy to show that for two jointly Gaussian random variables

$$\mathrm{Cov}(X,Y) = \rho\,\sigma_X\sigma_Y.$$

Hence from (10-14), $\rho$ in $N(\mu_X, \mu_Y, \sigma_X^2, \sigma_Y^2, \rho)$ represents the actual correlation coefficient of the two jointly Gaussian r.vs in (7-23). Notice that $\rho = 0$ implies

$$f_{XY}(x,y) = f_X(x)\, f_Y(y).$$

Thus if $X$ and $Y$ are jointly Gaussian, uncorrelatedness does imply independence between the two random variables.

The Gaussian case is the only exception where the two concepts imply each other.

Example 10.3: Let $X$ and $Y$ be jointly Gaussian r.vs with parameters $N(\mu_X, \mu_Y, \sigma_X^2, \sigma_Y^2, \rho)$. Define $Z = aX + bY$. Determine $f_Z(z)$.

Solution: In this case we can make use of the characteristic function to solve this problem:

$$\Phi_Z(u) = E\big(e^{juZ}\big) = E\big(e^{ju(aX + bY)}\big) = E\big(e^{jauX + jbuY}\big) = \Phi_{XY}(au, bu). \tag{10-32}$$

From (10-30), with $u$ and $v$ replaced by $au$ and $bu$ respectively, we get

$$\Phi_Z(u) = e^{\,j(a\mu_X + b\mu_Y)u\, -\, \frac{1}{2}\left(a^2\sigma_X^2 + 2\rho ab\,\sigma_X\sigma_Y + b^2\sigma_Y^2\right)u^2} = e^{\,j\mu_Z u - \frac{1}{2}\sigma_Z^2 u^2}, \tag{10-33}$$

where

$$\mu_Z = a\mu_X + b\mu_Y \tag{10-34}$$

and

$$\sigma_Z^2 = a^2\sigma_X^2 + 2\rho ab\,\sigma_X\sigma_Y + b^2\sigma_Y^2. \tag{10-35}$$

Notice that (10-33) has the same form as (10-31), and hence we conclude that $Z = aX + bY$ is also Gaussian with mean and variance as in (10-34)-(10-35), which also agrees with (10-23). From the previous example, we conclude that any linear combination of jointly Gaussian r.vs generates a Gaussian r.v.
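Example 10.3 is easy to corroborate by simulation: the standardized third and fourth moments of $Z$ come out near 0 and 3, as they must for a Gaussian, and the mean and variance match (10-34)-(10-35). All parameter values below are illustrative assumptions.

```python
import numpy as np

# Z = aX + bY for jointly Gaussian X, Y should be Gaussian with (10-34)-(10-35).
rng = np.random.default_rng(4)
mx, my, sx, sy, rho = 1.0, -1.0, 1.0, 2.0, -0.3
a, b = 1.5, 0.5
cov = [[sx**2, rho*sx*sy], [rho*sx*sy, sy**2]]
xy = rng.multivariate_normal([mx, my], cov, size=500_000)
z = a*xy[:, 0] + b*xy[:, 1]

print(np.mean(z), a*mx + b*my)                                 # (10-34)
print(np.var(z), a**2*sx**2 + 2*rho*a*b*sx*sy + b**2*sy**2)    # (10-35)
s = (z - z.mean()) / z.std()
print(np.mean(s**3), np.mean(s**4))  # ≈ 0 and ≈ 3, as for a Gaussian
```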

In other words, linearity preserves Gaussianity. We can use the characteristic function relation to conclude an even more general result.

Example 10.4: Suppose $X$ and $Y$ are jointly Gaussian r.vs as in the previous example. Define two linear combinations

$$Z = aX + bY, \qquad W = cX + dY. \tag{10-36}$$

What can we say about their joint distribution?

Solution: The characteristic function of $Z$ and $W$ is given by

$$\Phi_{ZW}(u,v) = E\big(e^{j(Zu + Wv)}\big) = E\big(e^{j(aX + bY)u + j(cX + dY)v}\big) = E\big(e^{j(au + cv)X + j(bu + dv)Y}\big) = \Phi_{XY}(au + cv,\ bu + dv). \tag{10-37}$$

As before, substituting (10-30) into (10-37) with $u$ and $v$ replaced by $au + cv$ and $bu + dv$ respectively, we get

$$\Phi_{ZW}(u,v) = e^{\,j(\mu_Z u + \mu_W v)\, -\, \frac{1}{2}\left(\sigma_Z^2 u^2 + 2\rho_{ZW}\,\sigma_Z\sigma_W\, uv + \sigma_W^2 v^2\right)}, \tag{10-38}$$

where

$$\mu_Z = a\mu_X + b\mu_Y, \tag{10-39}$$

$$\mu_W = c\mu_X + d\mu_Y, \tag{10-40}$$

$$\sigma_Z^2 = a^2\sigma_X^2 + 2\rho ab\,\sigma_X\sigma_Y + b^2\sigma_Y^2, \tag{10-41}$$

$$\sigma_W^2 = c^2\sigma_X^2 + 2\rho cd\,\sigma_X\sigma_Y + d^2\sigma_Y^2, \tag{10-42}$$

and

$$\rho_{ZW} = \frac{ac\,\sigma_X^2 + (ad + bc)\,\rho\,\sigma_X\sigma_Y + bd\,\sigma_Y^2}{\sigma_Z\,\sigma_W}. \tag{10-43}$$

From (10-38), we conclude that $Z$ and $W$ are also jointly distributed Gaussian r.vs with means, variances and correlation coefficient as in (10-39)-(10-43).
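The correlation coefficient formula (10-43) can be checked the same way; the coefficients $a$, $b$, $c$, $d$ and the other parameter values below are illustrative assumptions.

```python
import numpy as np

# Numerical check of rho_ZW in (10-43) for Z = aX + bY, W = cX + dY.
rng = np.random.default_rng(5)
sx, sy, rho = 1.0, 2.0, 0.4
a, b, c, d = 1.0, 2.0, 3.0, -1.0
cov = [[sx**2, rho*sx*sy], [rho*sx*sy, sy**2]]
xy = rng.multivariate_normal([0.0, 0.0], cov, size=500_000)
z = a*xy[:, 0] + b*xy[:, 1]
w = c*xy[:, 0] + d*xy[:, 1]

sz = np.sqrt(a**2*sx**2 + 2*rho*a*b*sx*sy + b**2*sy**2)                # (10-41)
sw = np.sqrt(c**2*sx**2 + 2*rho*c*d*sx*sy + d**2*sy**2)                # (10-42)
predicted = (a*c*sx**2 + (a*d + b*c)*rho*sx*sy + b*d*sy**2) / (sz*sw)  # (10-43)
print(np.corrcoef(z, w)[0, 1], predicted)
```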

To summarize, any two linear combinations of jointly Gaussian random variables (independent or dependent) are also jointly Gaussian r.vs.

[Fig. 10.3: a linear operator driven by a Gaussian input produces a Gaussian output $f_{ZW}(z,w)$.]

Of course, we could have reached the same conclusion by deriving the joint p.d.f using the technique developed in section 9 (refer (7-29)).

Gaussian random variables are also interesting because of the following result.

Central Limit Theorem: Suppose $X_1, X_2, \ldots, X_n$ are a set of zero-mean, independent, identically distributed (i.i.d.) random variables with some common distribution.

Consider their scaled sum

$$Y = \frac{X_1 + X_2 + \cdots + X_n}{\sqrt{n}}. \tag{10-44}$$

Then asymptotically (as $n \to \infty$)

$$Y \to N(0, \sigma^2). \tag{10-45}$$

Proof: Although the theorem is true under even more general conditions, we shall prove it here under the independence assumption. Let $\sigma^2$ represent their common variance. Since

$$E(X_i) = 0, \tag{10-46}$$

we have

$$\mathrm{Var}(X_i) = E(X_i^2) = \sigma^2. \tag{10-47}$$

Consider

$$\Phi_Y(u) = E\big(e^{juY}\big) = E\big(e^{ju(X_1 + X_2 + \cdots + X_n)/\sqrt{n}}\big) = \prod_{i=1}^{n} E\big(e^{juX_i/\sqrt{n}}\big) = \left[\Phi_X\!\left(\tfrac{u}{\sqrt{n}}\right)\right]^{n}, \tag{10-48}$$

where we have made use of the independence of the r.vs $X_1, X_2, \ldots, X_n$. But

$$E\big(e^{juX_i/\sqrt{n}}\big) = 1 + \frac{ju}{\sqrt{n}}E(X_i) + \frac{(ju)^2}{2!\,n}E(X_i^2) + \frac{(ju)^3}{3!\,n^{3/2}}E(X_i^3) + \cdots = 1 - \frac{u^2\sigma^2}{2n} + o(n^{-3/2}), \tag{10-49}$$

where we have made use of (10-46)-(10-47). Substituting (10-49) into (10-48), we obtain

$$\Phi_Y(u) = \left(1 - \frac{u^2\sigma^2}{2n} + o(n^{-3/2})\right)^{n}, \tag{10-50}$$

and as $n \to \infty$,

$$\lim_{n\to\infty} \Phi_Y(u) = e^{-u^2\sigma^2/2}, \tag{10-51}$$

since

$$\lim_{n\to\infty}\left(1 - \frac{x}{n}\right)^{n} = e^{-x}. \tag{10-52}$$

[Note that the $o(n^{-3/2})$ terms in (10-50) decay faster than $1/n$.]

But (10-51) represents the characteristic function of a zero-mean normal r.v with variance $\sigma^2$, and (10-45) follows.

The central limit theorem states that a large sum of independent random variables, each with finite variance, tends to behave like a normal random variable. Thus the individual p.d.fs become unimportant in analyzing the collective sum behavior. If we model a noise phenomenon as the sum of a large number of independent random variables (e.g., electron motion in resistor components), then this theorem allows us to conclude that the noise behaves like a Gaussian r.v.
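A small simulation makes the theorem concrete; the zero-mean uniform distribution below is an illustrative assumption (any common distribution with finite variance behaves the same way).

```python
import numpy as np

# CLT experiment: scaled sums (10-44) of i.i.d. U(-1/2, 1/2) variables
# (sigma^2 = 1/12) should approach N(0, 1/12) as in (10-45).
rng = np.random.default_rng(6)
n, trials = 256, 20_000
x = rng.uniform(-0.5, 0.5, size=(trials, n))
y = x.sum(axis=1) / np.sqrt(n)       # (10-44)

print(np.mean(y), np.var(y))         # ≈ 0 and ≈ 1/12 ≈ 0.0833
s = (y - y.mean()) / y.std()
print(np.mean(s**3), np.mean(s**4))  # ≈ 0 and ≈ 3: Gaussian-shaped
```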

It may be remarked that the finite variance assumption is necessary for the theorem to hold good. To demonstrate its importance, consider the r.vs to be Cauchy distributed, and let

$$Y = \frac{X_1 + X_2 + \cdots + X_n}{\sqrt{n}}, \tag{10-53}$$

where each $X_i \sim C(\alpha)$. Then since

$$\Phi_{X_i}(u) = e^{-\alpha|u|}, \tag{10-54}$$

substituting this into (10-48), we get

$$\Phi_Y(u) = \left[\Phi_X\!\left(\tfrac{u}{\sqrt{n}}\right)\right]^{n} = e^{-\alpha\sqrt{n}\,|u|} \sim C\big(\alpha\sqrt{n}\big), \tag{10-55}$$

which shows that $Y$ is still Cauchy with parameter $\alpha\sqrt{n}$. In other words, the central limit theorem does not hold good for a set of Cauchy r.vs, as their variances are undefined.
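The failure is visible in simulation too. For a standard Cauchy ($\alpha = 1$), the median of $|Y|$ tracks the growing Cauchy parameter $\alpha\sqrt{n}$ from (10-55) rather than settling down; using the median of $|Y|$ as the diagnostic is our own choice (for a $C(\alpha)$ r.v., the median of the absolute value is exactly $\alpha$).

```python
import numpy as np

# Cauchy counterexample (10-53)-(10-55): the scaled sum stays Cauchy.
rng = np.random.default_rng(7)
n, trials = 100, 100_000
y = rng.standard_cauchy(size=(trials, n)).sum(axis=1) / np.sqrt(n)

# For Y ~ C(alpha), median(|Y|) = alpha; here alpha = sqrt(n) = 10.
print(np.median(np.abs(y)), np.sqrt(n))
```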

Joint characteristic functions are useful in determining the p.d.f of linear combinations of r.vs. For example, with $X$ and $Y$ as independent Poisson r.vs with parameters $\lambda_1$ and $\lambda_2$ respectively, let

$$Z = X + Y. \tag{10-56}$$

Then

$$\Phi_Z(u) = \Phi_X(u)\,\Phi_Y(u). \tag{10-57}$$

But from (6-33),

$$\Phi_X(u) = e^{\lambda_1(e^{ju} - 1)}, \qquad \Phi_Y(u) = e^{\lambda_2(e^{ju} - 1)}, \tag{10-58}$$

so that

$$\Phi_Z(u) = e^{(\lambda_1 + \lambda_2)(e^{ju} - 1)} \sim P(\lambda_1 + \lambda_2); \tag{10-59}$$

i.e., the sum of independent Poisson r.vs is also a Poisson random variable.
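A quick empirical check of (10-56)-(10-59); $\lambda_1 = 2$ and $\lambda_2 = 3$ are illustrative values.

```python
import numpy as np
from math import exp, factorial

# Sum of independent Poissons: compare the empirical p.m.f. of Z = X + Y
# with the P(lambda_1 + lambda_2) = P(5) p.m.f. predicted by (10-59).
rng = np.random.default_rng(8)
n = 1_000_000
z = rng.poisson(2.0, n) + rng.poisson(3.0, n)

for k in range(8):
    print(k, np.mean(z == k), exp(-5.0) * 5.0**k / factorial(k))
```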
ed..21nXXXYn(10-53)).( CXi,)(||uXeui(10-54)n||/1()(/) ~ (),nunYXiuuneCn(10-55).nPILLAI24Joint characteristic functions are useful in determining the p.d.f of linear combinations of r.vs. For example, with Xand Yas independent Poisson r.vs with parameters and respectively, letThenBut from (6-33)so that i.e., sum of independent Poisson r.vs is also a Poisson random variable..YXZ(10-56)).()()(uuuYXZ(10-57))1()1(21)( ,)(jujueYeXeueu(10-58))( )(21)1)((21Pe