Diagnosing Error in Object Detectors

Presentation Transcript

Slide1

Diagnosing Error in Object Detectors

Department of Computer Science, University of Illinois at Urbana-Champaign (UIUC)

Derek Hoiem, Yodsawalai Chodpathumwan, Qieyun Dai

Work supported in part by NSF awards IIS-1053768 and IIS-0904209, ONR MURI Grant N000141010934, and a research award from Google

Slide2

Object detection is a collection of problems

Distance

Shape

Occlusion

Viewpoint

Intra-class Variation for "Airplane"

Slide3

Object detection is a collection of problems

Confusing Distractors for "Airplane": Localization Error, Background, Dissimilar Categories, Similar Categories

Slide4

How to evaluate object detectors?

Average Precision (AP)

Good summary statistic for quick comparison

Not a good driver of research

We propose tools to evaluate where detectors fail and the potential impact of particular improvements

Typical evaluation through comparison of AP numbers
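As context for the AP numbers quoted throughout, here is a minimal sketch of VOC-style average precision, assuming detections have already been matched to ground truth and sorted by confidence. Function and variable names are illustrative, not from the authors' released code:

```python
# Minimal all-points average precision (AP), assuming `is_tp` flags are
# ordered by descending detection confidence. Illustrative only.

def average_precision(is_tp, n_pos):
    """is_tp: true/false-positive flags, confidence-sorted.
    n_pos: number of ground-truth positives for the class."""
    tp = fp = 0
    recalls, precisions = [], []
    for hit in is_tp:
        tp += int(hit)
        fp += int(not hit)
        recalls.append(tp / n_pos)
        precisions.append(tp / (tp + fp))
    # Interpolate: precision at recall r becomes the max precision
    # achieved at any recall >= r, which removes the sawtooth dips.
    for i in range(len(precisions) - 2, -1, -1):
        precisions[i] = max(precisions[i], precisions[i + 1])
    # Sum rectangle areas under the interpolated precision-recall curve.
    ap, prev_r = 0.0, 0.0
    for r, p in zip(recalls, precisions):
        ap += (r - prev_r) * p
        prev_r = r
    return ap
```

With a perfect ranking (all true positives first) this returns 1.0; errors anywhere in the ranking pull the score down, which is exactly why a single AP number says little about *which* errors occurred.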

figs. from Felzenszwalb et al. 2010

Slide5

Detectors Analyzed as Examples on VOC 2007

Deformable Parts Model (DPM)

Felzenszwalb et al. 2010 (v4)

Sliding window

Mixture of HOG templates with latent HOG parts

Multiple Kernel Learning (MKL)

Vedaldi et al. 2009

Jumping window

Various spatial pyramid bag of words features combined with MKL

Slide6

Top false positives: Airplane (DPM)

[Figure: top-ranked false positives, labeled by rank (1-37)]

Localization: 29%

Similar Objects: 33% (Bird, Boat, Car)

Background: 27%

Other Objects: 11%

Impact of Removing/Fixing FPs

AP = 0.36

Slide7
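The false-positive taxonomy above (localization / similar objects / other objects / background) can be sketched as overlap rules. The 0.5 and 0.1 IoU thresholds and the similar-category sets below are assumptions for illustration, not taken verbatim from the authors' code:

```python
# Hedged sketch of the false-positive taxonomy. Thresholds and the
# SIMILAR sets are illustrative assumptions.

SIMILAR = {"dog": {"cat", "horse", "person", "cow", "sheep"}}  # example only

def iou(a, b):
    """Intersection-over-union of two boxes (x1, y1, x2, y2)."""
    ix = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def classify_fp(det_box, det_class, gt_boxes):
    """gt_boxes: list of (box, class) ground-truth annotations."""
    best = max((iou(det_box, b) for b, c in gt_boxes if c == det_class),
               default=0.0)
    if 0.1 <= best < 0.5:
        return "localization"          # right class, poorly localized
    other = [(iou(det_box, b), c) for b, c in gt_boxes if c != det_class]
    ov, cls = max(other, default=(0.0, None))
    if ov >= 0.1:
        if cls in SIMILAR.get(det_class, set()):
            return "similar objects"   # confused with a related category
        return "other objects"         # confused with an unrelated category
    return "background"                # fired on background clutter
```

Tallying these labels over each detector's highest-scoring false positives yields the percentage breakdowns shown on these slides.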

Top false positives: Dog (DPM)

[Figure: top-ranked false positives, labeled by rank (1-22)]

Localization: 17%

Similar Objects: 50% (Person, Cat, Horse)

Background: 23%

Other Objects: 10%

Impact of Removing/Fixing FPs

AP = 0.03

Slide8

Top false positives: Dog (MKL)

Localization: 17%

Similar Objects: 74% (Cow, Person, Sheep, Horse)

Background: 4%

Other Objects: 5%

Impact of Removing/Fixing FPs (Top 5 FP)

AP = 0.17

Slide9
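The "Impact of Removing/Fixing FPs" bars can be approximated by deleting one category of false positives from the confidence-ranked detection list and re-scoring AP. A self-contained sketch under that reading (the small AP helper and the category bookkeeping are ours, not the authors' code):

```python
# Re-score AP after removing all false positives of one category from
# the confidence-ranked detections. Illustrative sketch only.

def ap(is_tp, n_pos):
    """All-points AP from confidence-sorted true/false-positive flags."""
    tp = fp = 0
    curve = []
    for hit in is_tp:
        tp += int(hit)
        fp += int(not hit)
        curve.append((tp / n_pos, tp / (tp + fp)))
    if not curve:
        return 0.0
    area, prev_r, best_p = 0.0, None, 0.0
    for r, p in reversed(curve):      # sweep right-to-left, interpolating
        if prev_r is not None:
            area += (prev_r - r) * best_p
        best_p = max(best_p, p)
        prev_r = r
    return area + prev_r * best_p     # final segment down to recall 0

def ap_without(dets, n_pos, category):
    """dets: (is_tp, fp_category) pairs; drop FPs of the given category."""
    kept = [is_tp for is_tp, cat in dets if is_tp or cat != category]
    return ap(kept, n_pos)
```

Running `ap_without` once per error category reproduces the shape of the impact bars: the gap between the original AP and the re-scored AP is that category's potential gain.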

Summary of False Positive Analysis

DPM v4

(FGMR 2010)

MKL

(Vedaldi et al. 2009)

Slide10

Analysis of object characteristics

Additional annotations for seven categories: occlusion level, parts visible, sides visible

Occlusion Level

Slide11

Normalized Average Precision

Average precision is sensitive to the number of positive examples

Normalized average precision: replace the variable N_j with a fixed N

P_N(c) = R(c) · N / (R(c) · N + F(c))

where R(c) is recall and F(c) is the number of false positives at confidence c, and N_j is the number of object examples in subset j

Slide12
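A sketch of this substitution in code: compute precision along the ranked detections, but rescale the true-positive mass by a fixed N instead of the category's own N_j. The names and the recall-based form of the formula are our reading of the slide, not the authors' implementation:

```python
# Normalized precision: replace the per-category positive count N_j with
# a fixed N so scores are comparable across categories and subsets.
# Illustrative reading of the slide's definition.

def normalized_precisions(is_tp, n_pos_j, n_fixed):
    """is_tp: confidence-sorted true/false-positive flags.
    n_pos_j: actual positives in subset j; n_fixed: the constant N."""
    tp = fp = 0
    out = []
    for hit in is_tp:
        tp += int(hit)
        fp += int(not hit)
        recall = tp / n_pos_j            # recall still uses the true N_j
        # Standard precision would be tp / (tp + fp); substituting N for
        # N_j rescales the true-positive mass: R * N / (R * N + F).
        out.append(recall * n_fixed / (recall * n_fixed + fp)
                   if recall > 0 else 0.0)
    return out
```

Integrating these normalized precisions over recall, as in ordinary AP, gives AP_N values that can be compared across subsets with very different numbers of examples.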

Object characteristics: Aeroplane

Slide13

Object characteristics: Aeroplane

Occlusion: poor robustness to occlusion, but little impact on overall performance

Easier (None)

Harder (Heavy)

Slide14

Object characteristics: Aeroplane

Size: strong preference for average to above-average sized airplanes

[Figure: performance by size (X-Small, Small, Medium, Large, X-Large); easier vs. harder examples]

Slide15

Object characteristics: Aeroplane

Aspect Ratio: 2-3x better at detecting wide (side) views than tall views

[Figure: performance by aspect ratio (X-Tall, Tall, Medium, Wide, X-Wide); easier (wide) vs. harder (tall) examples]

Slide16

Object characteristics: Aeroplane

Sides/Parts: best performance = direct side view with all parts visible

[Figure: easier (side) vs. harder (non-side) examples]

Slide17

Summarizing Detector Performance

Avg. Performance of Best Case

Avg. Performance of Worst Case

Avg. Overall Performance

DPM (v4): Sensitivity and Impact

Slide18
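Under our reading of these summary plots, sensitivity is the spread between best-case and worst-case normalized AP over a characteristic's subsets, and impact is the headroom of the best case over the overall score. A sketch with made-up numbers:

```python
# Summarize one object characteristic (e.g. occlusion) from normalized
# AP per subset. "Sensitivity" and "impact" follow the slides'
# terminology; the formulas and example numbers are our illustrative
# reading, not the authors' code.

def summarize(ap_by_subset, ap_overall):
    best = max(ap_by_subset.values())
    worst = min(ap_by_subset.values())
    return {
        "sensitivity": best - worst,   # how strongly the trait affects AP_N
        "impact": best - ap_overall,   # gain if every case were the best case
    }

# Made-up occlusion-level scores for illustration:
occlusion = {"none": 0.40, "low": 0.22, "medium": 0.10, "heavy": 0.03}
print(summarize(occlusion, ap_overall=0.36))
```

High sensitivity with low impact means the detector is much worse on the hard subset, but that subset is too rare for fixing it to move overall AP much.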

Summarizing Detector Performance: Best, Average, Worst Case

DPM (FGMR 2010) vs. MKL (Vedaldi et al. 2009)

[Figure: sensitivity and impact per characteristic: occlusion, trunc, size, view, part_vis, aspect]

Slide19

Summarizing Detector Performance: Best, Average, Worst Case

Occlusion: high sensitivity, low potential impact

Slide20

Summarizing Detector Performance: Best, Average, Worst Case

MKL more sensitive to size

Slide21

Summarizing Detector Performance: Best, Average, Worst Case

DPM more sensitive to aspect

Slide22

Conclusions

Most errors that detectors make are reasonable: localization error and confusion with similar objects; misdetection of occluded or small objects

Large improvements in specific areas (e.g., removing all background FPs, or robustness to occlusion) have small impact on overall AP

More specific analysis should be standard

Our code and annotations are available online

Automatic generation of analysis summary from standard annotations:

www.cs.illinois.edu/homes/dhoiem/publications/detectionAnalysis_eccv12.tar.gz

Slide23

Thank you!


Top Dog False Positives

www.cs.illinois.edu/homes/dhoiem/publications/detectionAnalysis_eccv12.tar.gz

Slide24