Playing with features for learning and prediction - PowerPoint Presentation

Uploaded by natalia-silvester on 2016-04-13


Presentation Transcript

Slide1

Playing with features for learning and prediction

Jongmin Kim

Seoul National University

Slide2

Problem statement

Predicting outcome of surgery

Slide3

Predicting outcome of surgery

Ideal approach

[Diagram: training data → ? → predicted outcome of surgery]

Slide4

Predicting outcome of surgery

Initial approach

Predicting partial features

Predict which features?

Slide5

Predicting outcome of surgery

4 surgeries: DHL+RFT+TAL+FDO

- flexion of the knee (min / max)
- dorsiflexion of the ankle (min)
- rotation of the foot (min / max)

Slide6

Predicting outcome of surgery

Are these good features?

Number of training data:

DHL+RFT+TAL: 35 samples

FDO+DHL+TAL+RFT: 33 samples

Slide7

Machine learning and feature

[Diagram: Data → Feature representation → Learning algorithm]

Slide8

Features in motion data:

- Joint position / angle
- Velocity / acceleration
- Distance between body parts
- Contact status

Slide9

Features in computer vision

- SIFT
- Spin image
- HoG
- RIFT
- Textons
- GLOH

Slide10

Machine learning and feature

Slide11

Outline

Feature selection

- Feature ranking

- Subset selection: wrapper, filter, embedded

- Recursive Feature Elimination

- Combination of weak priors (boosting)

- AdaBoost (classification) / joint boosting (classification) / gradient boosting (regression)

Prediction results with feature selection

Feature learning?

Slide12

Feature selection

Alleviates the curse of dimensionality

Improves prediction performance

Makes learning faster and more cost-effective

Provides a better understanding of the data

Slide13

Subset selection

Wrapper

Filter

Embedded

Slide14
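As a concrete sketch of subset selection, the Recursive Feature Elimination mentioned in the outline can be run in a few lines, assuming scikit-learn is available (the toy dataset and the linear model are illustrative stand-ins, not the presenter's actual surgery data or model):

```python
# Wrapper-style subset selection via Recursive Feature Elimination:
# repeatedly fit the model, drop the weakest feature, and refit.
from sklearn.datasets import make_regression
from sklearn.feature_selection import RFE
from sklearn.linear_model import LinearRegression

# Toy data: 10 features, only 3 of which are informative.
X, y = make_regression(n_samples=100, n_features=10,
                       n_informative=3, random_state=0)

# Keep eliminating features until only 3 remain.
selector = RFE(LinearRegression(), n_features_to_select=3)
selector.fit(X, y)

print(selector.support_)   # boolean mask of the kept features
print(selector.ranking_)   # rank 1 = selected; higher = dropped earlier
```

The same `selector` can then transform new data so the downstream predictor only ever sees the selected subset.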

Feature learning?

Can we automatically learn a good feature representation?

Known as: unsupervised feature learning, feature learning, deep learning, representation learning, etc.

Hand-designed features (by humans):

1. require expert knowledge

2. require time-consuming hand-tuning

When it is unclear how to hand-design features: automatically learned features (by machine)

Slide15

Learning Feature Representations

Key idea: learn the statistical structure or correlations of the data from unlabeled data

- The learned representations can then be used as features in supervised and semi-supervised settings

Slide16

Learning Feature Representations

[Diagram: Input (image / features) → Encoder (feed-forward / bottom-up path) → Output features → Decoder (feed-back / generative / top-down path) → reconstructed input]

Slide17

Learning Feature Representations

e.g. Predictive Sparse Decomposition [Kavukcuoglu et al., '09]

Encoder: z = σ(Wx) maps an input patch x to sparse features z (encoder filters W, sigmoid function σ(·))

Decoder: Dz reconstructs the input from the features (decoder filters D), with an L1 sparsity penalty on z

Slide18
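The encoder/decoder pair on this slide can be sketched as a single forward pass in plain NumPy. This is a minimal sketch only: W, D, and the patch x are random stand-ins for learned filters and real data, and no training step is shown.

```python
# Forward pass of a PSD-style encoder/decoder with an L1 sparsity
# term. All parameters are random placeholders, not learned filters.
import numpy as np

rng = np.random.default_rng(0)
n_input, n_features = 16, 32                 # patch size, code size

W = rng.normal(size=(n_features, n_input))   # encoder filters W
D = rng.normal(size=(n_input, n_features))   # decoder filters D
x = rng.normal(size=n_input)                 # input patch x

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

z = sigmoid(W @ x)    # encoder: fast feed-forward code z = sigma(W x)
x_hat = D @ z         # decoder: linear reconstruction D z

lam = 0.1             # weight of the L1 sparsity penalty
loss = np.sum((x - x_hat) ** 2) + lam * np.sum(np.abs(z))
print(loss)
```

Training would alternate between minimizing this loss over the sparse code and over the filters W and D; only the loss itself is shown here.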

Stacked Auto-Encoders

[Diagram: Input image → Encoder → Features → Encoder → Features → Encoder → Class label, with a Decoder paired with each Encoder for reconstruction]

[Hinton & Salakhutdinov, Science '06]

Slide19

At Test Time

[Diagram: Input image → Encoder → Features → Encoder → Features → Encoder → Class label]

[Hinton & Salakhutdinov, Science '06]

Remove the decoders and use only the feed-forward path. This gives a standard (convolutional) neural network, which can be fine-tuned with backprop.

Slide20
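The test-time network described above, with the decoders removed, is just stacked encoders applied feed-forward. A minimal sketch, with random weights standing in for pretrained encoder filters and made-up layer sizes:

```python
# Test-time stacked auto-encoder: only the feed-forward (encoder)
# path remains. Weights are random stand-ins for pretrained filters.
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Three encoder layers: 64 -> 32 -> 16 -> 10 (e.g. class scores).
weights = [rng.normal(size=(32, 64)),
           rng.normal(size=(16, 32)),
           rng.normal(size=(10, 16))]

def feed_forward(x, weights):
    # Each stacked encoder becomes one ordinary network layer.
    for W in weights:
        x = sigmoid(W @ x)
    return x

x = rng.normal(size=64)             # flattened input image
scores = feed_forward(x, weights)   # per-class outputs
print(scores.shape)
```

This is exactly a standard fully connected network, which is why backprop can be used directly for fine-tuning.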

Status & plan

Status: understanding the data / surveying learning techniques…

Plan: finish experiments by November

Write the paper in December

Submit to SIGGRAPH in January

Present in the US in August

But before all of that….

Slide21

Deep neural net vs. boosting

Deep nets:

- a single highly non-linear system
- a "deep" stack of simpler modules
- all parameters are subject to learning

Boosting & forests:

- a sequence of "weak" (simple) classifiers that are linearly combined to produce a powerful classifier
- subsequent classifiers do not exploit the representations of earlier classifiers; it is a "shallow" linear mixture
- typically, features are not learned

Slide22
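The "shallow linear mixture" view of boosting can be made concrete with AdaBoost, whose default weak learner in scikit-learn is a depth-1 decision stump; the ensemble is just a weighted vote over those stumps. A sketch on toy data, assuming scikit-learn is available:

```python
# AdaBoost as a linear combination of weak (stump) classifiers.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

# Toy binary classification problem.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# Each round fits one weak learner on reweighted data; the final
# classifier is their weighted (linear) combination.
clf = AdaBoostClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)

print(clf.score(X, y))        # training accuracy
print(len(clf.estimators_))   # number of weak learners actually fit
```

Note that no stump ever sees the outputs of the earlier stumps as input features, which is exactly the contrast with a deep net drawn on this slide.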

Deep neural net vs. boosting

Slide23

Feature learning for motion data

Learning representations of temporal data

- Model complex, nonlinear dynamics such as style

Restricted Boltzmann machine

- haven't fully understood the concept yet

- the results are not impressive

Slide24

Restricted Boltzmann machine

Model complex, nonlinear dynamics

Easily and exactly infer the latent binary state given the observations
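The "easy, exact inference" property can be shown directly: in an RBM, the hidden units are conditionally independent given the visible units, so the posterior over each latent binary state is a single sigmoid, with no iterative inference. A minimal sketch with random stand-in parameters (W, b, and v are not trained values):

```python
# Exact inference of RBM hidden units given the observations:
# P(h_j = 1 | v) = sigmoid(W v + b)_j, one sigmoid per unit.
import numpy as np

rng = np.random.default_rng(2)
n_visible, n_hidden = 6, 4

W = rng.normal(size=(n_hidden, n_visible))  # weights (random stand-in)
b = rng.normal(size=n_hidden)               # hidden biases
v = rng.integers(0, 2, size=n_visible)      # observed binary visibles

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Exact posterior over each latent binary state, in closed form.
p_hidden = sigmoid(W @ v + b)
h_sample = (rng.random(n_hidden) < p_hidden).astype(int)
print(p_hidden)
```

Learning the parameters themselves (e.g. via contrastive divergence) is the hard part; the inference step shown here is the easy one the slide refers to.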