Topic 5: Energy & Power Spectra, and Correlation

In Lecture 1 we reviewed the notion of average signal power in a periodic signal and related it to the $a_n$ and $b_n$ coefficients of a Fourier series, giving a method of calculating power in the domain of discrete frequencies. In this lecture we want to revisit power for the continuous time domain, with a view to expressing it in terms of the frequency spectrum. First though we should review the derivation of average power using the complex Fourier series.

5.1 Review of Discrete Parseval for the Complex Fourier Series

(You did this as part of the 1st tute sheet.) Recall that the average power in a periodic signal with period $T = 2\pi/\omega$ is

$$P_{\text{ave}} = \frac{1}{T}\int_{-T/2}^{T/2} |f(t)|^2\,dt = \frac{1}{T}\int_{-T/2}^{T/2} f(t)\,f^*(t)\,dt .$$

Now replace $f(t)$ with its complex Fourier series

$$f(t) = \sum_{n=-\infty}^{\infty} C_n e^{in\omega t} .$$

It follows that

$$P_{\text{ave}} = \frac{1}{T}\int_{-T/2}^{T/2} \sum_{n=-\infty}^{\infty} C_n e^{in\omega t} \sum_{m=-\infty}^{\infty} C_m^* e^{-im\omega t}\,dt
= \sum_{n=-\infty}^{\infty} |C_n|^2 \quad\text{(because of orthogonality)}
= |C_0|^2 + 2\sum_{n=1}^{\infty} |C_n|^2 ,$$

using $C_{-n} = C_n^*$.
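This discrete form of Parseval is easy to sanity-check numerically. The following is a minimal sketch, assuming an arbitrary choice of periodic test signal and sample count: it computes the average power of one period directly in time and compares it with $\sum_n |C_n|^2$, where the $C_n$ are estimated from an FFT of the sampled period.

```python
import numpy as np

# One period of an arbitrary periodic test signal: a DC offset plus two harmonics.
N = 4096                      # samples per period (an arbitrary choice)
T = 2.0                       # period
t = np.arange(N) * T / N
f = 0.5 + 1.3*np.cos(2*np.pi*t/T) + 0.7*np.sin(3*2*np.pi*t/T)

# Average power computed directly in the time domain: (1/T) * integral of |f|^2 dt
P_time = np.mean(np.abs(f)**2)

# Complex Fourier coefficients C_n ~ FFT/N, then the sum of |C_n|^2
C = np.fft.fft(f) / N
P_freq = np.sum(np.abs(C)**2)

print(P_time, P_freq)         # the two values agree closely
```

The agreement here is limited only by how finely the period is sampled.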
5.1.1 A quick check

It is worth checking this using the relationships found in Lecture 1:

$$C_m = \tfrac{1}{2}(a_m - i b_m) \;\text{ for } m > 0, \qquad
C_0 = \tfrac{1}{2}a_0 \;\text{ for } m = 0, \qquad
C_m = \tfrac{1}{2}(a_{|m|} + i b_{|m|}) \;\text{ for } m < 0 .$$

For $n \neq 0$ the quantities are $|C_n|^2 = \tfrac{1}{4}(a_n^2 + b_n^2)$, so that

$$P_{\text{ave}} = \frac{a_0^2}{4} + \frac{1}{2}\sum_{n=1}^{\infty}\left(a_n^2 + b_n^2\right) ,$$

in agreement with the expression in Lecture 1.

5.2 Energy signals vs Power signals

When considering signals in the continuous time domain, it is necessary to distinguish between "finite energy signals", or "energy signals" for short, and "finite power signals". First let us be absolutely clear that all signals $f(t)$ are such that $|f(t)|^2$ is a power.

An energy signal is one where the total energy is finite:

$$E_{\text{Tot}} = \int_{-\infty}^{\infty} |f(t)|^2\,dt , \qquad 0 < E_{\text{Tot}} < \infty .$$

It is said that $f(t)$ is "square integrable". As $E_{\text{Tot}}$ is finite, dividing by the infinite duration indicates that energy signals have zero average power.

To summarize, before knowing what all these terms mean, an energy signal:
- always has a Fourier transform $F(\omega)$;
- always has an energy spectral density (ESD) given by $E_{ff}(\omega) = |F(\omega)|^2$;
- always has an autocorrelation $R_{ff}(\tau) = \int_{-\infty}^{\infty} f^*(t)\,f(t+\tau)\,dt$;
- always has an ESD which is the FT of the autocorrelation: $R_{ff}(\tau) \rightleftharpoons E_{ff}(\omega)$;
- always has total energy $E_{\text{Tot}} = R_{ff}(0) = \frac{1}{2\pi}\int_{-\infty}^{\infty} E_{ff}(\omega)\,d\omega$;
- always has an ESD which transfers through a system as $E_{gg}(\omega) = |H(\omega)|^2 E_{ff}(\omega)$.
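The remark above that an energy signal has zero average power can be seen numerically. Here is a minimal sketch, assuming the energy signal $f(t) = u(t)e^{-t}$ that reappears in the worked example later: the windowed average power $\frac{1}{2T}\int_{-T}^{T}|f(t)|^2\,dt$ shrinks as the window grows, while the total energy stays fixed at $1/2$.

```python
import numpy as np

def avg_power(T, n=200_000):
    """Windowed average power (1/2T) * integral over [-T, T] of |f(t)|^2 for f(t) = u(t) e^{-t}."""
    t = np.linspace(-T, T, n)
    f = np.exp(-np.clip(t, 0.0, None)) * (t >= 0.0)   # u(t) e^{-t}, clipped to avoid overflow for t < 0
    dt = t[1] - t[0]
    return np.sum(f**2) * dt / (2*T)

for T in (1.0, 10.0, 100.0, 1000.0):
    print(T, avg_power(T))        # shrinks toward zero as T grows; the total energy stays at 1/2
```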
A power signal is one where the total energy is infinite, and we consider instead the average power

$$P_{\text{Ave}} = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} |f(t)|^2\,dt , \qquad 0 < P_{\text{Ave}} < \infty .$$

A power signal:
- may have a Fourier transform;
- may have a power spectral density (PSD) $S_{ff}(\omega)$;
- always has an autocorrelation $R_{ff}(\tau) = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} f(t)\,f(t+\tau)\,dt$;
- always has a PSD which is the FT of the autocorrelation: $R_{ff}(\tau) \rightleftharpoons S_{ff}(\omega)$;
- always has integrated average power $P_{\text{Ave}} = R_{ff}(0)$;
- always has a PSD which transfers through a system as $S_{gg}(\omega) = |H(\omega)|^2 S_{ff}(\omega)$.

The distinction is all to do with avoiding infinities, but it results in the autocorrelation having different dimensions. Instinct tells you this is going to be a bit messy. We discuss finite energy signals first.
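Before moving on, the limit that defines average power can be watched converging numerically for the canonical power signal, a sinusoid. This is a minimal sketch with arbitrary choices of amplitude, frequency and window sizes: it evaluates $\frac{1}{2T}\int_{-T}^{T}A^2\sin^2(\omega_0 t)\,dt$ for growing $T$ and sees it settle at $A^2/2$.

```python
import numpy as np

A, w0 = 2.0, 2*np.pi*5.0              # amplitude and angular frequency (arbitrary)

def windowed_power(T, n=400_000):
    """Average power of A sin(w0 t) over the window [-T, T]."""
    t = np.linspace(-T, T, n)
    f = A*np.sin(w0*t)
    dt = t[1] - t[0]
    return np.sum(f**2)*dt / (2*T)

for T in (0.3, 3.0, 30.0, 300.0):
    print(T, windowed_power(T))       # converges to A**2/2 = 2.0
```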
5.3 Parseval's theorem revisited

Let us assume an energy signal, and recall a general result from Lecture 3: if $h(t) = f(t)*g(t)$ then $H(\omega) = F(\omega)G(\omega)$, where $F(\omega)$ and $G(\omega)$ are the Fourier transforms of $f(t)$ and $g(t)$. Writing the Fourier transform and the convolution integral out fully gives

$$\frac{1}{2\pi}\int_{-\infty}^{\infty} F(\omega)G(\omega)\,e^{i\omega t}\,d\omega
= \int_{-\infty}^{\infty} f(p)\,g(t-p)\,dp ,$$

where $p$ is a dummy variable used for integration. Note that $t$ is not involved in the integrations above; it is just a free variable on both the left and right of the above equation, and we can give it any value we wish to. Choosing $t = 0$, it must be the case that

$$\frac{1}{2\pi}\int_{-\infty}^{\infty} F(\omega)G(\omega)\,d\omega
= \int_{-\infty}^{\infty} f(p)\,g(-p)\,dp .$$

Now suppose $g(t) = f^*(-t)$. We know that

$$G(\omega) = \int_{-\infty}^{\infty} f^*(-t)\,e^{-i\omega t}\,dt
= \int_{-\infty}^{\infty} f^*(t)\,e^{+i\omega t}\,dt
= \left[\int_{-\infty}^{\infty} f(t)\,e^{-i\omega t}\,dt\right]^* = F^*(\omega) .$$

This is, of course, a quite general result which could have been stuck in Lecture 2, and which is worth highlighting: the Fourier transform of a complex conjugate is

$$\int_{-\infty}^{\infty} f^*(t)\,e^{-i\omega t}\,dt = F^*(-\omega) .$$

Take care with the $-\omega$.

Back to the argument. In the earlier expression we had $g(-p) = f^*(p)$ and $G(\omega) = F^*(\omega)$, so that

$$\frac{1}{2\pi}\int_{-\infty}^{\infty} F(\omega)\,F^*(\omega)\,d\omega
= \int_{-\infty}^{\infty} f(p)\,f^*(p)\,dp .$$

Now $p$ is just any parameter, so it is possible to tidy the expression by replacing it with $t$. Then we arrive at the following important result.

Parseval's Theorem: the total energy in a signal is

$$E_{\text{Tot}} = \int_{-\infty}^{\infty} |f(t)|^2\,dt
= \frac{1}{2\pi}\int_{-\infty}^{\infty} |F(\omega)|^2\,d\omega
= \int_{-\infty}^{\infty} |F(\omega)|^2\,d\!\left(\frac{\omega}{2\pi}\right) .$$

NB: the $d(\omega/2\pi)$ is an element of frequency measured in Hz, and that frequency is nothing to do with the signal being called $f(t)$.

5.4 The Energy Spectral Density

If the integral of $|F(\omega)|^2$ over frequency in Hz gives the total energy, it must be that $|F(\omega)|^2$ is the energy per Hz. That is:

The ENERGY Spectral Density of a signal $f(t) \rightleftharpoons F(\omega)$ is defined as

$$E_{ff}(\omega) = |F(\omega)|^2 .$$
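Parseval's theorem can be checked numerically with an FFT. The following is a minimal sketch, assuming a Gaussian pulse as an arbitrary choice of energy signal and approximating the continuous Fourier transform by a suitably scaled FFT; it compares $\int|f(t)|^2\,dt$ with $\frac{1}{2\pi}\int|F(\omega)|^2\,d\omega$.

```python
import numpy as np

# A finite-energy test pulse: a Gaussian sampled densely over a window containing essentially all of it.
N, dt = 2**14, 1e-3
t = (np.arange(N) - N//2) * dt
f = np.exp(-t**2 / (2 * 0.05**2))

# Time-domain energy: integral of |f(t)|^2 dt
E_time = np.sum(np.abs(f)**2) * dt

# Approximate continuous Fourier transform F(w_k) ~ dt * FFT, on the frequency grid w_k
F = dt * np.fft.fft(f)
dw = 2*np.pi / (N * dt)

# Frequency-domain energy: (1/2pi) * integral of |F(w)|^2 dw
E_freq = np.sum(np.abs(F)**2) * dw / (2*np.pi)

print(E_time, E_freq)     # the two estimates match to high accuracy
```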
5.5 Example

[Q] Determine the energy in the signal $f(t) = u(t)e^{-t}$ (i) in the time domain, and (ii) by determining the energy spectral density and integrating over frequency.

[A] Part (i): the total energy in the time domain is

$$E_{\text{Tot}} = \int_{-\infty}^{\infty} |f(t)|^2\,dt
= \int_{0}^{\infty} e^{-2t}\,dt
= \left[\frac{e^{-2t}}{-2}\right]_{0}^{\infty} = \frac{1}{2} .$$

Part (ii): in the frequency domain

$$F(\omega) = \int_{-\infty}^{\infty} u(t)\,e^{-t}\,e^{-i\omega t}\,dt
= \int_{0}^{\infty} e^{-t(1+i\omega)}\,dt
= \left[\frac{e^{-t(1+i\omega)}}{-(1+i\omega)}\right]_{0}^{\infty}
= \frac{1}{1+i\omega} .$$

Hence the energy spectral density is

$$E_{ff}(\omega) = |F(\omega)|^2 = \frac{1}{1+\omega^2} .$$

Integration over all frequency (not forgetting the $1/2\pi$, remember!) gives the total energy

$$E_{\text{Tot}} = \frac{1}{2\pi}\int_{-\infty}^{\infty} \frac{d\omega}{1+\omega^2} .$$

Substitute $\omega = \tan\theta$, so that $d\omega = \sec^2\theta\,d\theta$ and $1+\tan^2\theta = \sec^2\theta$:

$$E_{\text{Tot}} = \frac{1}{2\pi}\int_{-\pi/2}^{\pi/2} \frac{\sec^2\theta}{\sec^2\theta}\,d\theta
= \frac{1}{2\pi}\,\pi = \frac{1}{2} ,$$

which is nice.
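A short numerical cross-check of this example, approximating both integrals on finite grids (grid extents and spacings are arbitrary choices), gives values close to $1/2$.

```python
import numpy as np

# Part (i): time-domain energy of f(t) = u(t) e^{-t}
t = np.linspace(0.0, 40.0, 400_000)              # e^{-2t} is negligible beyond t ~ 40
E_time = np.sum(np.exp(-2*t)) * (t[1] - t[0])

# Part (ii): (1/2pi) * integral of the ESD 1/(1 + w^2) over a wide frequency range
w = np.linspace(-1000.0, 1000.0, 2_000_000)
E_freq = np.sum(1.0/(1.0 + w**2)) * (w[1] - w[0]) / (2*np.pi)

print(E_time, E_freq)                            # both come out close to 0.5
```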
5.6 Correlation

Correlation is a tool for analysing whether processes considered random a priori are in fact related. In signal processing, the cross-correlation $R_{fg}$ is used to assess how similar two different signals $f(t)$ and $g(t)$ are. $R_{fg}$ is found by multiplying one signal, $f(t)$ say, with time-shifted values of the other, $g(t+\tau)$, then summing up the products. In the example in Figure 5.1 the cross-correlation will be low if the shift is $\tau = 0$, and high if $\tau = 2$ or $\tau = 5$.

Figure 5.1: The signal $f(t)$ would have a higher cross-correlation with parts of $g(t)$ that look similar.

One can also ask how similar a signal is to itself. Self-similarity is described by the auto-correlation $R_{ff}$, again a sum of products of the signal $f(t)$ and a copy of the signal at a shifted time, $f(t+\tau)$. An auto-correlation with a high magnitude means that the value of the signal $f(t)$ at one instant has a strong bearing on the value at the next instant.

Correlation can be used for both deterministic and random signals. We will explore random processes in Lecture 6.

The cross- and auto-correlations can be derived for both finite energy and finite power signals, but they have different dimensions (energy and power respectively) and differ in other more subtle ways. We continue by looking at the auto- and cross-correlations of finite energy signals.
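As a concrete illustration of the idea, here is a minimal sketch in which the pulse shape, noise level and offset are arbitrary choices: a short pulse is hidden inside a longer noisy signal at a known offset, and the cross-correlation peaks at that lag.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.01

# f: a short "model" pulse; g: a longer noisy signal containing the pulse starting at t = 2 s
pulse = np.exp(-0.5*((np.arange(0, 1, dt) - 0.5)/0.1)**2)
g = 0.05*rng.standard_normal(600)
g[200:200+len(pulse)] += pulse                 # pulse inserted starting at sample 200 (t = 2 s)
f = np.zeros_like(g)
f[:len(pulse)] += pulse                        # the same pulse placed at the start of f

# Cross-correlation R_fg(tau) ~ sum_t f(t) g(t+tau) * dt, over all lags
R = np.correlate(g, f, mode='full') * dt
lags = (np.arange(len(R)) - (len(f) - 1)) * dt

print(lags[np.argmax(R)])                      # close to 2.0, the offset of the pulse in g
```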
5.7 The Auto-correlation of a finite energy signal

The auto-correlation of a finite energy signal is defined as follows. We shall deal with real signals, so that the conjugate can be omitted.

The auto-correlation of a signal $f(t)$ of finite energy is defined as

$$R_{ff}(\tau) = \int_{-\infty}^{\infty} f^*(t)\,f(t+\tau)\,dt
\;\overset{\text{(for real signals)}}{=}\;
\int_{-\infty}^{\infty} f(t)\,f(t+\tau)\,dt .$$

The result is an energy.

There are two ways of envisaging the process, as shown in Figure 5.2. One is to shift a copy of the signal, multiply vertically (so to speak), and then sum. For positive $\tau$ this is a shift to the "left". This is most useful when calculating analytically.

Figure 5.2: $f(t)$ and $f(t+\tau)$ for a positive shift $\tau$.

5.7.1 Basic properties of auto-correlation

1. Symmetry. The auto-correlation function is an even function of $\tau$:

$$R_{ff}(\tau) = R_{ff}(-\tau) .$$

Proof: substitute $p = t + \tau$ into the definition, and you will get

$$R_{ff}(\tau) = \int_{-\infty}^{\infty} f(p-\tau)\,f(p)\,dp .$$

But $p$ is just a dummy variable. Replace it by $t$ and you recover the expression for $R_{ff}(-\tau)$. (In fact, in some texts you will see the autocorrelation defined with a minus sign in front of the $\tau$.)
2. For a non-zero signal, $R_{ff}(0) > 0$.

Proof: for any non-zero signal there is at least one instant $t_1$ for which $f(t_1) \neq 0$, and $f(t_1)f(t_1) > 0$. Hence $\int_{-\infty}^{\infty} f(t)f(t)\,dt > 0$.

3. The value at $\tau = 0$ is largest: $R_{ff}(0) \geq R_{ff}(\tau)$.

Proof: consider any pair of real numbers $a_1$ and $a_2$. As $(a_1 - a_2)^2 \geq 0$, we know that $a_1^2 + a_2^2 \geq 2 a_1 a_2$. Now take the pairs of numbers at random from the function $f(t)$. Our result shows that there is no rearrangement, random or ordered, of the function values into a function $\phi(t)$ that would make $\int f(t)\phi(t)\,dt > \int f(t)^2\,dt$. Using $\phi(t) = f(t+\tau)$ is an ordered rearrangement, and so for any $\tau$

$$\int_{-\infty}^{\infty} f(t)^2\,dt \;\geq\; \int_{-\infty}^{\infty} f(t)\,f(t+\tau)\,dt .$$

5.8 Applications

5.8.1 Synchronising to heartbeats in an ECG

(DIY: search and read.)

5.8.2 The search for Extra-Terrestrial Intelligence

Figure 5.3: Chatty aliens.

For several decades, the SETI organization have been looking for extra-terrestrial intelligence by examining the auto-correlation of signals from radio telescopes. One project scans the sky around nearby (within 200 light years) sun-like stars, chopping up the bandwidth between 1-3 GHz into 2 billion channels, each 1 Hz wide. (It is assumed that an attempt to communicate would use a single-frequency, highly tuned signal.) They determine the autocorrelation of each channel's signal. If the channel is noise, one would observe a very low autocorrelation for all non-zero $\tau$. (See white noise in Lecture 6.) But if there is, say, a repeated message, one would observe a periodic rise in the autocorrelation.

Figure 5.4: $R_{ff}$ at $\tau = 0$ is always large, but will drop to zero if the signal is noise. If the messages align, the autocorrelation will rise.
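The SETI idea can be mimicked in a few lines. This toy sketch uses arbitrary noise levels and an arbitrary message period: it compares the normalised autocorrelation of a noise-only channel with that of a channel carrying a weak periodic "message". The former is small at every non-zero lag; the latter shows repeated peaks at multiples of the message period.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20_000

noise = rng.standard_normal(N)                        # a channel containing only noise
message = 0.3*np.sin(2*np.pi*np.arange(N)/50.0)       # weak periodic "message", period 50 samples
channel = message + rng.standard_normal(N)

def autocorr(x, max_lag=200):
    """Sample autocorrelation, normalised so that R(0) = 1."""
    x = x - x.mean()
    n = len(x)
    r0 = np.dot(x, x)
    return np.array([np.dot(x[:n-k], x[k:]) for k in range(max_lag)]) / r0

R_noise = autocorr(noise)
R_chan = autocorr(channel)

print(np.max(np.abs(R_noise[1:])))    # small at every non-zero lag
print(R_chan[50], R_chan[100])        # noticeably larger values at multiples of the period
```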
5.9 The Wiener-Khinchin Theorem

Let us take the Fourier transform of the cross-correlation $\int_{-\infty}^{\infty} f^*(t)\,g(t+\tau)\,dt$, then switch the order of integration:

$$\mathrm{FT}\left[\int_{-\infty}^{\infty} f^*(t)\,g(t+\tau)\,dt\right]
= \int_{\tau=-\infty}^{\infty}\int_{t=-\infty}^{\infty} f^*(t)\,g(t+\tau)\,dt\; e^{-i\omega\tau}\,d\tau
= \int_{t=-\infty}^{\infty} f^*(t)\int_{\tau=-\infty}^{\infty} g(t+\tau)\,e^{-i\omega\tau}\,d\tau\,dt .$$

Notice that $f^*(t)$ is a constant for the integration with respect to $\tau$ (that is how it floated through the integral sign). Substitute $p = t+\tau$ into the inner integral, and the integrals become separable:

$$\mathrm{FT}\left[\int_{-\infty}^{\infty} f^*(t)\,g(t+\tau)\,dt\right]
= \int_{t=-\infty}^{\infty} f^*(t)\int_{p=-\infty}^{\infty} g(p)\,e^{-i\omega p}\,e^{+i\omega t}\,dp\,dt
= \int_{-\infty}^{\infty} f^*(t)\,e^{+i\omega t}\,dt \int_{-\infty}^{\infty} g(p)\,e^{-i\omega p}\,dp
= F^*(\omega)\,G(\omega) .$$

If we specialize this to the auto-correlation, $g(t)$ gets replaced by $f(t)$. Then, for a finite energy signal, the Wiener-Khinchin Theorem says that the FT of the auto-correlation is the Energy Spectral Density:

$$\mathrm{FT}[R_{ff}(\tau)] = F^*(\omega)\,F(\omega) = |F(\omega)|^2 = E_{ff}(\omega) .$$

(Norbert Wiener, 1894-1964, and Aleksandr Khinchin, 1894-1959.)

(This method of proof is valid only for finite energy signals, and rather trivializes the Wiener-Khinchin theorem. The fundamental derivation lies in the theory of stochastic processes.)
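The theorem can be verified numerically for a sampled energy signal. In this minimal sketch the exponential pulse and grid sizes are arbitrary choices: the autocorrelation is computed by direct correlation, its discrete Fourier transform is taken with $\tau = 0$ reordered to the first position, and the result is compared with $|F(\omega)|^2$ computed directly; zero-padding the signal to length $2N-1$ makes the two agree to machine precision.

```python
import numpy as np

dt = 0.01
t = np.arange(0.0, 10.0, dt)
f = np.exp(-t)                                   # the energy signal u(t) e^{-t}, sampled for t >= 0

N = len(f)
M = 2*N - 1                                      # padding length so the linear correlation is exact

# Left-hand side: FT of the autocorrelation R_ff(tau)
R = np.correlate(f, f, mode='full') * dt         # lags from -(N-1)dt to +(N-1)dt
R0_first = np.concatenate([R[N-1:], R[:N-1]])    # reorder so tau = 0 sits at index 0
lhs = np.fft.fft(R0_first) * dt

# Right-hand side: the energy spectral density |F(w)|^2 on the same frequency grid
F = np.fft.fft(f, n=M) * dt
rhs = np.abs(F)**2

print(np.allclose(lhs.real, rhs), np.max(np.abs(lhs.imag)))   # True, and a negligible imaginary part
```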
5.10 Corollary of Wiener-Khinchin

This corollary just confirms a result obtained earlier. We have just shown that $R_{ff}(\tau) \rightleftharpoons E_{ff}(\omega)$. That is,

$$R_{ff}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty} E_{ff}(\omega)\,e^{i\omega\tau}\,d\omega ,$$

where $\tau$ is used by convention. Now set $\tau = 0$: the auto-correlation at $\tau = 0$ is

$$R_{ff}(0) = \frac{1}{2\pi}\int_{-\infty}^{\infty} E_{ff}(\omega)\,d\omega = E_{\text{Tot}} .$$

But this is exactly as expected! Earlier we defined the energy spectral density through

$$E_{\text{Tot}} = \frac{1}{2\pi}\int_{-\infty}^{\infty} E_{ff}(\omega)\,d\omega ,$$

and we know that for a finite energy signal

$$R_{ff}(0) = \int_{-\infty}^{\infty} |f(t)|^2\,dt = E_{\text{Tot}} .$$

5.11 How is the ESD affected by passing through a system?

If $f(t)$ and $g(t)$ are the input and output of a system with transfer function $H(\omega)$, then

$$G(\omega) = H(\omega)\,F(\omega) .$$

But $E_{ff}(\omega) = |F(\omega)|^2$, and so

$$E_{gg}(\omega) = |G(\omega)|^2 = |H(\omega)|^2\,E_{ff}(\omega) .$$

5.12 Cross-correlation

The cross-correlation describes the dependence between two different signals:

$$R_{fg}(\tau) = \int_{-\infty}^{\infty} f^*(t)\,g(t+\tau)\,dt .$$

5.12.1 Basic properties

1. Symmetries. The cross-correlation does not in general have a definite reflection symmetry; however, $R_{fg}(\tau) = R_{gf}(-\tau)$.

2. Independent signals. The auto-correlation of even white noise has a non-zero value at $\tau = 0$. This is not the case for the cross-correlation: if $R_{fg}(\tau) = 0$ for all $\tau$, the signals $f(t)$ and $g(t)$ have no dependence on one another.
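The reflection property in item 1 is easy to confirm numerically. A minimal sketch, with two arbitrary finite-energy test pulses: compute $R_{fg}$ and $R_{gf}$ on the same grid of lags and check that one is the mirror image of the other.

```python
import numpy as np

dt = 0.01
t = np.arange(0.0, 5.0, dt)
f = np.where((t > 1.0) & (t < 2.0), 1.0, 0.0)     # a rectangular pulse
g = np.exp(-t) * np.sin(2*np.pi*t)                # a decaying oscillation

# R_fg(tau) ~ sum_t f(t) g(t+tau) dt  and  R_gf(tau) ~ sum_t g(t) f(t+tau) dt
R_fg = np.correlate(g, f, mode='full') * dt
R_gf = np.correlate(f, g, mode='full') * dt

print(np.allclose(R_fg, R_gf[::-1]))              # True: R_fg(tau) = R_gf(-tau)
```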
5.13 Example and Application

[Q] Determine the cross-correlation of the signals $f(t)$ and $g(t)$ shown.

Figure: the signals $f(t)$ and $g(t)$, piecewise-defined pulses with breakpoints at multiples of $a$.

[A] Start by sketching $g(t+\tau)$ as a function of $t$ for a sequence of shifts $\tau$.

Figure: $g(t+\tau)$ sketched against $t$ for several values of $\tau$, from the left-most overlapping position to the right-most.

$g(t+\tau)$ consists of a section where $g = 0$, a non-zero section, and a final section where $g = 0$; likewise $f(t)$ is zero outside a single non-zero section. The left-most non-zero configuration fixes the first range of $\tau$ for which the product $f(t)g(t+\tau)$ is non-zero, and the calculation then splits into four ranges of $\tau$. In each range

$$R_{fg}(\tau) = \int_{-\infty}^{\infty} f(t)\,g(t+\tau)\,dt ,$$

with the integrand and limits read off from the overlapping sections of the sketch. Working out the integrals for each range and finding the maximum is left as a DIY exercise.
Figure 5.5: a 2D cross-correlation example: the model image (left), the cross-correlation (middle), and a test image from a match (right).

5.13.1 Application

It is obvious enough that cross-correlation is useful for detecting occurrences of a "model" signal $f(t)$ in another signal $g(t)$. Figure 5.5 is a 2D example where the model signal $f(x, y)$ is the back view of a footballer, and the test signals $g(x, y)$ are images from a match. The cross-correlation is shown in the middle.

5.14 Cross-Energy Spectral Density

The Wiener-Khinchin Theorem was actually derived for the cross-correlation. It showed that, for finite energy signals, the FT of the cross-correlation is the Cross-Energy Spectral Density:

$$\mathrm{FT}[R_{fg}(\tau)] = F^*(\omega)\,G(\omega) = E_{fg}(\omega) .$$

5.15 Finite Power Signals

Let us use $f(t) = A\sin(\omega_0 t)$ to motivate the discussion of finite power signals. All periodic signals are finite power, infinite energy, signals. One cannot evaluate the total energy

$$\int_{-\infty}^{\infty} A^2\sin^2(\omega_0 t)\,dt ,$$

which is infinite. However, by sketching the curve and using the notion of self-similarity, one would wish that the auto-correlation is positive, but decreasing, for small but increasing $\tau$; then negative as the curves move into anti-phase and become dissimilar in an "organized" way; then it should return to being similar. The autocorrelation should have the same period as its parent function, and be large when $\tau = 0$, so an $R_{ff}$ proportional to $\cos(\omega_0\tau)$ would seem right.

We define the autocorrelation of a finite power signal as an average power.
Note that for a periodic function the limit over all time is the same as the average over a single period:

$$R_{ff}(\tau) = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} A^2\sin(\omega_0 t)\,\sin(\omega_0(t+\tau))\,dt
= \frac{A^2}{2(2\pi/\omega_0)}\int_{-\pi/\omega_0}^{\pi/\omega_0} \sin(\omega_0 t)\,\sin(\omega_0(t+\tau))\,dt .$$

Expanding $\sin(\omega_0(t+\tau)) = \sin(\omega_0 t)\cos(\omega_0\tau) + \cos(\omega_0 t)\sin(\omega_0\tau)$, and noting that the $\sin(\omega_0 t)\cos(\omega_0 t)$ cross-term integrates to zero over a period,

$$R_{ff}(\tau) = \frac{A^2\omega_0}{4\pi}\cos(\omega_0\tau)\int_{-\pi/\omega_0}^{\pi/\omega_0}\sin^2(\omega_0 t)\,dt
= \frac{A^2}{2}\cos(\omega_0\tau) .$$

For a finite energy signal, the Fourier transform of the autocorrelation was the energy spectral density. What is the analogous result now? In this example,

$$\mathrm{FT}[R_{ff}(\tau)] = \mathrm{FT}\!\left[\frac{A^2}{2}\cos(\omega_0\tau)\right]
= \frac{A^2\pi}{2}\left[\delta(\omega-\omega_0) + \delta(\omega+\omega_0)\right] .$$

This is actually the power spectral density of $A\sin(\omega_0 t)$, denoted $S_{ff}(\omega)$. The $\delta$-functions are obvious enough, but to check the coefficient let us integrate over all frequency:

$$\frac{1}{2\pi}\int_{-\infty}^{\infty} S_{ff}(\omega)\,d\omega
= \frac{1}{2\pi}\int_{-\infty}^{\infty}\frac{A^2\pi}{2}\left[\delta(\omega-\omega_0) + \delta(\omega+\omega_0)\right]d\omega
= \frac{A^2}{4}\,[1+1] = \frac{A^2}{2} .$$
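A quick numerical check of this autocorrelation, with arbitrary choices of amplitude, frequency and averaging window: estimate $R_{ff}(\tau)$ for a sampled sine over a long window and compare it with $\frac{A^2}{2}\cos(\omega_0\tau)$; in particular the value at $\tau = 0$ is the average power $A^2/2$.

```python
import numpy as np

A, w0, dt = 1.5, 2*np.pi*3.0, 1e-4
t = np.arange(0.0, 200.0, dt)              # a long (but finite) averaging window
f = A*np.sin(w0*t)

taus = np.arange(0.0, 0.5, 0.01)
R_theory = (A**2/2)*np.cos(w0*taus)
R_est = []
for tau in taus:
    k = int(round(tau/dt))
    # time-average of f(t) f(t+tau) over the overlapping part of the window
    R_est.append(np.mean(f[:len(f)-k] * f[k:]))

print(np.max(np.abs(np.array(R_est) - R_theory)))   # small, and shrinking as the window grows
```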
Integrating the power spectral density over all frequency does indeed return the average power in a sine wave. We can use Fourier series to conclude that this result must also hold for any periodic function, and it is also applicable to any infinite energy, "non square-integrable", function. We will justify this a little more in Lecture 6.

To finish off, we need only state the analogues of the finite energy formulae, replacing Energy Spectral Density with Power Spectral Density, and replacing Total Energy with Average Power.

The auto-correlation of a finite power signal is defined as

$$R_{ff}(\tau) = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} f^*(t)\,f(t+\tau)\,dt .$$

The auto-correlation function and the Power Spectral Density are a Fourier transform pair:

$$R_{ff}(\tau) \rightleftharpoons S_{ff}(\omega) .$$

The average power is

$$P_{\text{Ave}} = R_{ff}(0) .$$

The power spectrum transfers across a system as

$$S_{gg}(\omega) = |H(\omega)|^2\,S_{ff}(\omega) .$$

This last result is proved in the next lecture.

5.16 Cross-correlation and power signals

Two power signals can be cross-correlated using a similar definition:

$$R_{fg}(\tau) = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} f^*(t)\,g(t+\tau)\,dt ,
\qquad R_{fg}(\tau) \rightleftharpoons S_{fg}(\omega) .$$

5.17 Input and output from a system

One very last thought. If one applies a finite power signal to a system, it cannot be converted into a finite energy signal, and vice versa. (To really nail this down would require us to understand Wiener-Khinchin in too much depth.)
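As a final illustration of the transfer rule, a sinusoid at frequency $\omega_0$ passed through a filter should emerge with its average power scaled by $|H(\omega_0)|^2$. The sketch below assumes a discrete first-order low-pass, implemented as the recursion $y_k = a\,y_{k-1} + (1-a)\,x_k$, whose frequency response is known in closed form, and compares the measured power ratio with $|H(\omega_0)|^2$.

```python
import numpy as np

dt = 1e-3
n = np.arange(200_000)
w0 = 2*np.pi*7.0                        # input sinusoid frequency (arbitrary)
x = np.sin(w0*n*dt)

# An assumed discrete first-order low-pass: y[k] = a*y[k-1] + (1-a)*x[k]
a = 0.95
y = np.empty_like(x)
y[0] = (1 - a)*x[0]
for k in range(1, len(x)):
    y[k] = a*y[k-1] + (1 - a)*x[k]

# Theoretical |H|^2 at the input frequency for this recursion, with Omega in radians per sample
Omega = w0*dt
H2 = (1 - a)**2 / (1 - 2*a*np.cos(Omega) + a**2)

# Average powers, skipping the start-up transient of the filter
Px = np.mean(x[10_000:]**2)
Py = np.mean(y[10_000:]**2)

print(Py / Px, H2)                      # the ratio of average powers ~ |H(w0)|^2
```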