DSP-CIS Part-III : Optimal & Adaptive - PowerPoint Presentation





Presentation Transcript

1. DSP-CIS Part-III : Optimal & Adaptive Filters
Chapter-7 : Wiener Filters and the LMS Algorithm
Marc Moonen, Dept. E.E./ESAT-STADIUS, KU Leuven
marc.moonen@esat.kuleuven.be
www.esat.kuleuven.be/stadius/

2. Part-III : Optimal & Adaptive Filters
Chapter-7 : Wiener Filters & the LMS Algorithm
Introduction / General Set-Up
Applications
Optimal Filtering: Wiener Filters
Adaptive Filtering: LMS Algorithm
Chapter-8 : Recursive Least Squares Algorithms
Least Squares Estimation
Recursive Least Squares (RLS)
Square Root Algorithms
Fast RLS Algorithms
Chapter-9 : Kalman Filters
Introduction – Least Squares Parameter Estimation
Standard Kalman Filter
Square-Root Kalman Filter

3. Introduction / General Set-Up
Norbert Wiener (1894-1964)
See Part-II: realizations of
1. ‘Classical’ Filter Design
2. ‘Optimal’ Filter Design

4. Introduction / General Set-Up

5. Introduction / General Set-Up
3. ‘Adaptive’ Filters

6. Introduction / General Set-Up

7. Applications
Optimal/adaptive filter provides a mathematical model for the input/output behavior of the plant (‘plant’ can be any system)

8. Applications
[Figure: echo cancellation set-up, with labels ‘echo path’, ‘near-end signal + echo’, ‘near-end signal’]

9. Applications
[Figure: hybrid connection, to/from network]
‘Hybrid’ is never ideally matched to line impedance, hence generates echo of transmitted signal into received signal

10. Applications
[Figure: noise cancellation set-up, with labels ‘signal + noise’ and ‘noise’]

11. Applications
[Figure: noise cancellation set-up, with labels ‘signal + noise’ and ‘noise’]

12. Applications

13. Applications

14. Applications

15. Applications

16. Optimal Filtering : Wiener Filters
Have to decide on 2 things…

17. Optimal Filtering : Wiener Filters
[Figure: filter with input u[k], output y[k], desired signal d[k], error e[k]]
PS: Shorthand notation u_k = u[k], y_k = y[k], d_k = d[k], e_k = e[k]. Filter coefficients (‘weights’) are w_l (replacing the b_l of previous chapters). For adaptive filters the w_l also have a time index: w_l[k].

18. Optimal Filtering : Wiener Filters

19. Optimal Filtering : Wiener Filters
PS: Can generalize the FIR filter to a ‘multi-channel FIR filter’ (example: see page 11)

20. Optimal Filtering : Wiener Filters
PS: A special case of the ‘multi-channel FIR filter’ is the ‘linear combiner’. The FIR filter may then also be viewed as a special case of the ‘linear combiner’ where the input signals are delayed versions of each other.

21. Optimal Filtering : Wiener Filters

22. Optimal Filtering : Wiener Filters
MMSE cost function can be expanded as…

23. Optimal Filtering : Wiener Filters
Correlation matrix has a special structure…

24. Optimal Filtering : Wiener Filters
MMSE cost function can be expanded as… (continued)
This is the ‘Wiener Filter’ solution
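As an illustration (not from the slides), the Wiener filter can be computed numerically by forming sample estimates of the correlation quantities and solving the Wiener-Hopf equations R w = p; the function below is a hypothetical NumPy sketch with variable names of my own choosing.

```python
import numpy as np

def wiener_filter(u, d, L):
    """Illustrative sketch: estimate the length-(L+1) Wiener filter.

    Forms sample estimates of the input autocorrelation matrix R and the
    cross-correlation vector p between input and desired signal, then
    solves the Wiener-Hopf equations R w = p.
    """
    N = len(u)
    # Rows of U are the input vectors [u[k], u[k-1], ..., u[k-L]]
    U = np.array([u[k - L:k + 1][::-1] for k in range(L, N)])
    dvec = np.asarray(d[L:N])
    R = U.T @ U / len(U)          # sample autocorrelation matrix
    p = U.T @ dvec / len(U)       # sample cross-correlation vector
    return np.linalg.solve(R, p)  # Wiener-Hopf: R w = p
```

In a system-identification setting, where d is an FIR-filtered version of u, the solution w recovers the unknown filter coefficients.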

25. Optimal Filtering : Wiener Filters
How do we solve the Wiener–Hopf equations? (L+1 linear equations in L+1 unknowns)
A general solver requires O(L³) operations; exploiting the structure of the equations reduces this to O(L²) (details omitted, see Appendix)
= used intensively in applications, e.g. in speech codecs, etc.

26. Adaptive Filtering: LMS Algorithm
How do we solve the Wiener–Hopf equations? Alternatively, an iterative steepest descent algorithm can be used. This will be the basis for the derivation of the Least Mean Squares (LMS) adaptive filtering algorithm…
Bernard Widrow, 1965 (https://www.youtube.com/watch?v=hc2Zj55j1zU)

27. Adaptive Filtering: LMS Algorithm
How do we compute the Wiener filter?
1) Cfr. supra: by solving the Wiener–Hopf equations (L+1 equations in L+1 unknowns)
2) Can also apply an iterative procedure to minimize the MMSE criterion, e.g. steepest descent; here n is the iteration index and μ is the ‘stepsize’ (to be tuned…)
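The iterative procedure on this slide can be sketched as follows (an illustrative implementation, with the factor 2 of the MMSE gradient absorbed into μ):

```python
import numpy as np

def steepest_descent(R, p, mu, n_iter=1000):
    """Minimize the MMSE cost by steepest descent:
        w[n+1] = w[n] + mu * (p - R @ w[n])
    i.e. a step along the negative gradient direction.
    Converges to the Wiener solution R^{-1} p for a suitably small mu
    (in particular 0 < mu < 2 / lambda_max, see the next slides).
    """
    w = np.zeros_like(p)
    for _ in range(n_iter):
        w = w + mu * (p - R @ w)  # negative-gradient step
    return w
```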

28. Adaptive Filtering: LMS Algorithm
Bound on the stepsize?

29. Adaptive Filtering: LMS Algorithm
Convergence speed? Small λ_i implies slow convergence; λ_min << λ_max (hence small μ) implies *very* slow convergence.

30. Adaptive Filtering: LMS Algorithm
The steepest descent iteration is turned into the LMS algorithm as follows:
Replace iteration index n by time index k (i.e. perform 1 iteration per sampling interval)
Replace n+1 by n for convenience…
Leave out the expectation operators (i.e. replace expected values by instantaneous estimates)

31. Adaptive Filtering: LMS Algorithm
Simple algorithm, can even draw the signal flow graph (= realization)…
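The simple algorithm on this slide can be written out as a short loop; the following is an illustrative NumPy sketch (variable names are my own).

```python
import numpy as np

def lms(u, d, L, mu):
    """Illustrative LMS adaptive filter.

    Per time step k:
        y[k]  = w^T u_k               (filter output)
        e[k]  = d[k] - y[k]           (a priori error)
        w    <- w + mu * u_k * e[k]   (instantaneous-gradient update)
    """
    w = np.zeros(L + 1)
    e = np.zeros(len(u))
    for k in range(L, len(u)):
        u_k = u[k - L:k + 1][::-1]   # input vector [u[k], ..., u[k-L]]
        e[k] = d[k] - w @ u_k        # a priori error
        w = w + mu * u_k * e[k]      # LMS weight update
    return w, e
```

Run on a system-identification task, w converges towards the Wiener solution while e decays.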

32. Adaptive Filtering: LMS Algorithm
Whenever LMS has reached the WF solution, the expected value of the update term (= the estimated gradient in the update formula) is zero, but the instantaneous value is generally non-zero (= noisy), and hence LMS will again move away from the WF solution!

33. Adaptive Filtering: LMS Algorithm
…means the step size has to be much smaller…!

34. Adaptive Filtering: LMS Algorithm
LMS is an extremely popular algorithm; many LMS variants have been developed (cheaper/faster/…)…
K is the block index, L_B is the block size (see p.35)
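One such variant is the block LMS hinted at here, where the gradient estimate is averaged over a block of L_B samples and the weights are updated once per block (index K); a hypothetical sketch:

```python
import numpy as np

def block_lms(u, d, L, mu, LB):
    """Illustrative block LMS: one weight update per block of LB samples,
    using the gradient estimate averaged over the block."""
    w = np.zeros(L + 1)
    for start in range(L, len(u) - LB + 1, LB):  # start of block K
        grad = np.zeros(L + 1)
        for k in range(start, start + LB):
            u_k = u[k - L:k + 1][::-1]
            grad += u_k * (d[k] - w @ u_k)   # accumulate instantaneous gradients
        w = w + (mu / LB) * grad             # one update per block
    return w
```

The per-sample cost drops because the update (and, in fast implementations, the filtering) is done blockwise, e.g. with FFTs.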

35. Adaptive Filtering: LMS Algorithm
= LMS with normalized step size (mostly used in practice)
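The normalized LMS (NLMS) of this slide divides the step size by the instantaneous input energy, which makes convergence largely insensitive to the input signal scale; an illustrative sketch (the regularization constant eps is my own addition):

```python
import numpy as np

def nlms(u, d, L, mu_bar, eps=1e-8):
    """Illustrative NLMS: LMS with the step size normalized by the input
    vector energy ||u_k||^2 (eps avoids division by zero).
    Stable in the mean square for 0 < mu_bar < 2."""
    w = np.zeros(L + 1)
    for k in range(L, len(u)):
        u_k = u[k - L:k + 1][::-1]
        e = d[k] - w @ u_k
        w = w + (mu_bar / (eps + u_k @ u_k)) * u_k * e  # normalized update
    return w
```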

36. Appendix (skip this slide)

37. Appendix (skip this slide)

38. Appendix (skip this slide)

39. Appendix (skip this slide)

40. Appendix (skip this slide)