Wireless and Mobile Systems


Presentation Transcript

Slide1

Wireless and Mobile Systems for the IoT

Nirupam Roy

M-W 2:00-3:15pm

CHM 1224

CMSC 715 : Fall 2021

Lecture 3.1: Machine Learning for IoT

Slide2

Happy or sad?

Slide3

Happy or sad?

Slide4

Happy or sad?

Slide5

Happy or sad?

Past experience

P (The dolphin is happy | Experience)

Slide6

Inference from sensor data

Tracking arm motion

Slide7

Inference from sensor data

Partial information from sensors

Tracking arm motion

Slide8

Inference from sensor data

Partial information from sensors

+

Tracking arm motion

Knowledge about the arm’s motion

(probabilistic models)

Slide9

Inference from sensor data

Partial information from sensors

+

Arm’s gesture

and movement

tracking

Tracking arm motion

Knowledge about the arm’s motion

(probabilistic models)

Slide10

Slide11

Sensors are not perfect

Slide12

The physical world

Statistical models, learning, perception

Meaningful information

Slide13

The physical world

Statistical models, learning, perception

Meaningful information

Neither perfect nor adequate

Inherently probabilistic in nature

Slide14

Probability refresher

Slide15

A few basic probability rules

Likelihood of multiple events occurring simultaneously:
Joint probability: P(A, B) = P(B, A)

Probability of an event, given another event has occurred:
Conditional probability: P(A | B) = P(A, B) / P(B)

Marginal probability: P(B) = Σ_i P(A_i, B)

Slide16

A few basic probability rules

Likelihood of multiple events occurring simultaneously:
Joint probability: P(A, B) = P(B, A)

Probability of an event, given another event has occurred:
Conditional probability: P(A | B) = P(A, B) / P(B)

Marginal probability: P(B) = Σ_i P(A_i, B)

[Figure: the sample space partitioned into A1, A2, A3, with event B overlapping each part]

Slide17

A few basic probability rules

Likelihood of multiple events occurring simultaneously:
Joint probability: P(A, B) = P(B, A)

Probability of an event, given another event has occurred:
Conditional probability: P(A | B) = P(A, B) / P(B)

Marginal probability: P(B) = Σ_i P(A_i, B)

Chain rule: P(A, B, C) = P(A | B, C) P(B, C) = P(A | B, C) P(B | C) P(C)
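To make these rules concrete, here is a minimal Python sketch. It is not from the lecture; the joint-table values are invented for illustration. It checks the conditional, marginal, and chain rules on a tiny discrete distribution.

# Tiny discrete example of the rules above (illustrative values only).
# Joint distribution P(A, B) over A in {a0, a1} and B in {b0, b1}.
P = {
    ("a0", "b0"): 0.30, ("a0", "b1"): 0.10,
    ("a1", "b0"): 0.20, ("a1", "b1"): 0.40,
}

def marginal_B(b):
    # Marginal probability: P(B) = sum_i P(A_i, B)
    return sum(p for (a, bb), p in P.items() if bb == b)

def conditional_A_given_B(a, b):
    # Conditional probability: P(A | B) = P(A, B) / P(B)
    return P[(a, b)] / marginal_B(b)

print("P(B=b0) =", marginal_B("b0"))                          # 0.5
print("P(A=a0 | B=b0) =", conditional_A_given_B("a0", "b0"))  # 0.6
# Chain rule check: P(A, B) = P(A | B) P(B)
assert abs(P[("a0", "b0")] -
           conditional_A_given_B("a0", "b0") * marginal_B("b0")) < 1e-12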

Slide18

Bayes rule

Bayes rule: P(A_i | B) = P(B | A_i) P(A_i) / P(B)

Slide19

Bayes rule

Bayes rule: P(A_i | B) = P(B | A_i) P(A_i) / P(B)
Posterior: P(A_i | B). Likelihood: P(B | A_i). Prior: P(A_i).

Slide20

Bayes rule

Bayes rule: P(A_i | B) = P(B | A_i) P(A_i) / P(B) = P(B | A_i) P(A_i) / Σ_j P(B | A_j) P(A_j)
Posterior: P(A_i | B). Likelihood: P(B | A_i). Prior: P(A_i).

Slide21

Bayes rule

Bayes rule: P(A_i | B) = P(B | A_i) P(A_i) / P(B) = P(B | A_i) P(A_i) / Σ_j P(B | A_j) P(A_j)
Posterior: P(A_i | B). Likelihood: P(B | A_i). Prior: P(A_i).

Slide22

Bayes rule

Bayes rule: P(A_i | B) = P(B | A_i) P(A_i) / P(B) = P(B | A_i) P(A_i) / Σ_j P(B | A_j) P(A_j)
Posterior: P(A_i | B). Likelihood: P(B | A_i). Prior: P(A_i).
Relates inverse representations of the probabilities concerning two events.

Slide23

Bayes rule: View 1

Healthy: A1

Diabetes: A2

Cancer: A3

Smoker: B

Slide24

Bayes rule: View 1

Healthy: A1

Diabetes: A2

Cancer: A3

Smoker: B

Bayes rule: P(A_i | B) = P(B | A_i) P(A_i) / P(B)
Posterior: P(A_i | B). Likelihood: P(B | A_i). Prior: P(A_i).

Slide25

Bayes rule: View 1

Healthy: A1, Diabetes: A2, Cancer: A3, Smoker: B
Bayes rule: P(A_i | B) = P(B | A_i) P(A_i) / P(B) = P(B | A_i) P(A_i) / Σ_j P(B | A_j) P(A_j)
Posterior: P(A_i | B). Likelihood: P(B | A_i). Prior: P(A_i).
Relates inverse representations of the probabilities concerning two events.
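As a sanity check on View 1, here is a short Python sketch that computes the posterior over the three-way partition above. The priors and likelihoods are invented for illustration, not values from the lecture.

# Bayes rule, View 1: posterior over the partition {Healthy, Diabetes, Cancer}
# given the evidence "Smoker". All numbers are illustrative assumptions.
priors = {"Healthy": 0.90, "Diabetes": 0.08, "Cancer": 0.02}      # P(A_i)
likelihood = {"Healthy": 0.15, "Diabetes": 0.25, "Cancer": 0.60}  # P(Smoker | A_i)

# Marginal: P(B) = sum_j P(B | A_j) P(A_j)
p_smoker = sum(likelihood[a] * priors[a] for a in priors)

# Posterior: P(A_i | B) = P(B | A_i) P(A_i) / P(B)
posterior = {a: likelihood[a] * priors[a] / p_smoker for a in priors}
for a, p in posterior.items():
    print(f"P({a} | Smoker) = {p:.3f}")   # the three posteriors sum to 1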

Slide26

Bayes rule: View 2

Not cancer: ~A

Cancer: A

Smoker: B

Slide27

Bayes rule: View 2

Not cancer: ~A

Cancer: A

Smoker: B

Bayes rule: P(A | B) = P(B | A) P(A) / P(B)
Posterior: P(A | B). Likelihood: P(B | A). Prior: P(A).

Slide28

Bayes rule: View 2

Not cancer: ~A, Cancer: A, Smoker: B
Bayes rule: P(A | B) = P(B | A) P(A) / P(B) = P(B | A) P(A) / [P(B | A) P(A) + P(B | ~A) P(~A)]
Posterior: P(A | B). Likelihood: P(B | A). Prior: P(A).
Updates beliefs based on evidence: the prior P(A) becomes the posterior P(A | B) after observing B.
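View 2 as code: a minimal Python sketch of a single belief update. The prior and likelihoods are invented for illustration.

# Bayes rule, View 2: update the belief P(Cancer) after observing "Smoker".
# All numbers are illustrative assumptions.
p_A = 0.02             # prior P(A), A = Cancer
p_B_given_A = 0.60     # likelihood P(Smoker | Cancer)
p_B_given_notA = 0.20  # P(Smoker | ~Cancer)

# P(B) = P(B | A) P(A) + P(B | ~A) P(~A)
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

# Posterior: P(A | B) = P(B | A) P(A) / P(B)
posterior = p_B_given_A * p_A / p_B
print(f"prior P(A) = {p_A}, posterior P(A | B) = {posterior:.4f}")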

Slide29

Happy or sad?

Slide30

Happy or sad?

Given the “sequence” of tricky questions asked today, what should be the answer?

Slide31

Markov Model

Slide32

Markov Model

Observed weather sequence: Sunny (t1), Sunny (t2), Rainy (t3), Sunny (t4), Rainy (t5), Rainy (t6)

Slide33

Markov Model

[Figure: two states, Sunny day and Rainy day, with transitions between them]
Observed weather sequence: Sunny (t1), Sunny (t2), Rainy (t3), Sunny (t4), Rainy (t5), Rainy (t6)

Slide34

Markov Model

[Figure: the Sunny day / Rainy day state diagram with transition probabilities P(sunny after sunny), P(rainy after sunny), P(sunny after rainy), P(rainy after rainy)]
Observed weather sequence: Sunny (t1), Sunny (t2), Rainy (t3), Sunny (t4), Rainy (t5), Rainy (t6)
M-th order Markov assumption: the current state depends on the past M states.
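Here is a minimal Python sketch of this two-state, first-order Markov chain. The transition probabilities and starting distribution are invented for illustration, not the lecture's numbers.

import random

# First-order Markov chain for the Sunny/Rainy example (illustrative numbers).
states = ["Sunny", "Rainy"]
T = {  # T[current][next] = P(next | current)
    "Sunny": {"Sunny": 0.8, "Rainy": 0.2},
    "Rainy": {"Sunny": 0.4, "Rainy": 0.6},
}
p_start = {"Sunny": 0.5, "Rainy": 0.5}

def sequence_probability(seq):
    # P(s1, ..., sn) = P(s1) * prod_t P(s_t | s_{t-1})
    p = p_start[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        p *= T[prev][cur]
    return p

obs = ["Sunny", "Sunny", "Rainy", "Sunny", "Rainy", "Rainy"]  # t1..t6
print(sequence_probability(obs))

# Sampling a trajectory from the chain:
s = "Sunny"
for _ in range(5):
    s = random.choices(states, weights=[T[s][n] for n in states])[0]
    print(s, end=" ")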

Slide35

Markov Model

[Figure: the Sunny day / Rainy day state diagram with transition probabilities]
First-order Markov assumption: the future depends on the present only, not on the past.

Slide36

Markov Model

[Figure: the same Sunny day / Rainy day state diagram with transition probabilities]

Slide37

Markov Model

[Figure: the Sunny day / Rainy day state diagram; at each time step we observe temperature, humidity, and wind rather than the Sunny/Rainy state itself]

Slide38

Hidden Markov Model

Slide39

Hidden Markov Model: Toy robot localization example

Find the location of the robot (hidden information)
States: S1, S2, S3
Observations = sensor measurements

Slide40

Hidden Markov Model: Toy robot localization example
[Figure: probability distribution over grid locations at t=0; y-axis Prob from 0 to 1]

Slide41

Hidden Markov Model: Toy robot localization example
[Figure: probability distribution over grid locations at t=1; y-axis Prob from 0 to 1]

Slide42

Hidden Markov Model: Toy robot localization example
[Figure: probability distribution over grid locations at t=1, next build step]

Slide43

Hidden Markov Model: Toy robot localization example
[Figure: probability distribution over grid locations at t=2; y-axis Prob from 0 to 1]

Slide44

Hidden Markov Model: Toy robot localization example
[Figure: probability distribution over grid locations at t=3; y-axis Prob from 0 to 1]

Slide45

Hidden Markov Model: Toy robot localization example
[Figure: probability distribution over grid locations at t=4; y-axis Prob from 0 to 1]

Slide46

Hidden Markov Model: Toy robot localization example
[Figure: probability distribution over grid locations at t=5; y-axis Prob from 0 to 1]

Slide47

Hidden Markov Model: Toy robot localization example

State = location on the grid = S_i
States: S1, S2, S3

Slide48

Hidden Markov Model: Toy robot localization example

State = location on the grid = S_i (S1, S2, S3)
Observations = sensor measurement = M_i

Slide49

Hidden Markov Model: Toy robot localization example

State = location on the grid = S_i (S1, S2, S3)
Observations = sensor measurement = M_i
Emission: the observation M_i depends on the current state S_i only.

Slide50

Hidden Markov Model: Toy robot localization example

State = location on the grid = S_i (S1, S2, S3)
Observations = sensor measurement = M_i
Emission: the observation M_i depends on the current state S_i only.
Transition: states change over time, S_i → S_(i+1) → S_(i+2).
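The toy localization slides above are a recursive Bayes update: a transition (motion) step followed by an emission (measurement) step. Below is a minimal Python sketch of this histogram filter; the grid size, motion model, and sensor model are all assumptions for illustration, not the lecture's values.

# Toy robot localization on a 1-D grid: a histogram filter, i.e. the
# HMM forward update applied step by step. All model numbers are assumptions.
GRID = 5  # locations S1..S5

def predict(belief):
    # Transition: robot moves one cell right with prob 0.8, stays with 0.2.
    new = [0.0] * GRID
    for i, p in enumerate(belief):
        new[(i + 1) % GRID] += 0.8 * p  # move (wraps around for simplicity)
        new[i] += 0.2 * p               # stay
    return new

world = ["green", "red", "green", "green", "red"]  # true cell colors

def update(belief, measurement):
    # Emission: sensor reads the cell color correctly with prob 0.9.
    new = [p * (0.9 if world[i] == measurement else 0.1)
           for i, p in enumerate(belief)]
    total = sum(new)
    return [p / total for p in new]  # normalize (Bayes rule)

belief = [1.0 / GRID] * GRID  # uniform prior at t=0
for z in ["red", "green", "green"]:  # sensor measurements M1, M2, M3
    belief = update(predict(belief), z)
print([round(p, 3) for p in belief])  # belief concentrates on likely cells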

Slide51

Hidden Markov Model: Definition

[Figure: HMM trellis with hidden states S1 → S2 → S3 → S4, each emitting an observation M1, M2, M3, M4]

Slide52

Hidden Markov Model: Definition
[Figure: the same trellis; the chain S1 → S2 → S3 → S4 is labeled as the hidden states]

Slide53

Hidden Markov Model: Definition
[Figure: the same trellis; S1 → S2 → S3 → S4 are the hidden states and M1, M2, M3, M4 are the observations]

Slide54

Hidden Markov Model: Definition
Emission: S1 → M1 (a state produces an observation)
Transition: S1 → S2 (a state evolves into the next state)

Slide55

Hidden Markov Model: Definition
1st order Markov assumption: the transition probability depends on the current state only.
Output independence assumption: the output/emission probability depends on the current state only.
Emission: S1 → M1. Transition: S1 → S2.
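In symbols, these two assumptions read as follows. This is a reconstruction of the standard statements, since the slide's equations did not survive extraction:

% 1st-order Markov assumption (transition):
P(S_t \mid S_1, \dots, S_{t-1}) = P(S_t \mid S_{t-1})
% Output independence assumption (emission):
P(M_t \mid S_1, \dots, S_t, M_1, \dots, M_{t-1}) = P(M_t \mid S_t)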

Slide56

Probability of a sequence of hidden states, given a sequence of observations

Hidden Markov Model: Definition

Slide57

Probability of a sequence of hidden states, given a sequence of observations

Hidden Markov Model: Definition

Slide58

Probability of a sequence of hidden states, given a sequence of observations

Hidden Markov Model: Definition

Slide59

Chain rule

Hidden Markov Model: Definition

Slide60

Observation depends on the current state only

Hidden Markov Model: Definition

Slide61

Observation depends on the current state only

Future state depends on the current state only

Hidden Markov Model: Definition
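Putting the last three steps together: the equations on these slides were images and did not survive extraction, but the derivation the captions describe is the standard HMM factorization, reconstructed here in LaTeX:

P(S_{1:T} \mid M_{1:T}) \propto P(M_{1:T} \mid S_{1:T}) \, P(S_{1:T})
  % chain rule, then output independence and the 1st-order Markov assumption:
  = \left[ \prod_{t=1}^{T} P(M_t \mid S_t) \right] P(S_1) \prod_{t=2}^{T} P(S_t \mid S_{t-1})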

Slide62

Hidden Markov Model: Definition

For N hidden states and a sequence of T observations, there are N^T different state-sequence combinations, so enumerating them directly is intractable.

Slide63

Problems solved by HMM

1) Likelihood: determine the likelihood of an observation sequence. [Forward algorithm]

2) Decoding: given an observation sequence, determine the best sequence of hidden states. [Viterbi algorithm]

3) Learning: given an observation sequence and a sequence of states, learn the HMM parameters: i) transition probabilities, ii) emission probabilities. [Baum-Welch algorithm, also known as forward-backward]

A sketch of the decoding step follows the list.
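For the decoding problem, here is a compact Viterbi sketch in Python. The two-state weather model and all its numbers are assumptions carried over from the earlier example, not the lecture's parameters.

# Viterbi decoding: most likely hidden state sequence given observations.
# Model numbers are illustrative assumptions.
states = ["Sunny", "Rainy"]
start = {"Sunny": 0.5, "Rainy": 0.5}
trans = {"Sunny": {"Sunny": 0.8, "Rainy": 0.2},
         "Rainy": {"Sunny": 0.4, "Rainy": 0.6}}
emit = {"Sunny": {"hot": 0.7, "cold": 0.3},   # P(observation | state)
        "Rainy": {"hot": 0.2, "cold": 0.8}}

def viterbi(observations):
    # V[t][s] = max probability of any state path ending in s at time t
    V = [{s: start[s] * emit[s][observations[0]] for s in states}]
    back = []  # backpointers for path recovery
    for obs in observations[1:]:
        col, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda r: V[-1][r] * trans[r][s])
            ptr[s] = prev
            col[s] = V[-1][prev] * trans[prev][s] * emit[s][obs]
        V.append(col)
        back.append(ptr)
    # Backtrack from the best final state.
    best = max(states, key=lambda s: V[-1][s])
    path = [best]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path)), V[-1][best]

print(viterbi(["hot", "hot", "cold", "cold"]))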