By Roger Ballard and Tanqiuhao Chen - PowerPoint Presentation

Uploaded on 2018-10-13




Presentation Transcript

Slide 1

By Roger Ballard and Tanqiuhao Chen

Support Vector Machines

Slide 2

The Basic Method

Support vector machines are a type of supervised binary linear classifier.

The idea behind support vector machines is to draw a hyperplane between two linearly separable groups of vectors. The hyperplane is drawn to maximize the distance from the hyperplane to the nearest vectors. These vectors are called the support vectors, giving the method its name.
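The decision rule above can be sketched in plain NumPy. The weights here are hand-picked for illustration, not produced by an actual SVM solver:

```python
import numpy as np

# Hand-picked separating hyperplane w.x + b = 0 (illustrative values,
# not the result of SVM training).
w = np.array([1.0, 1.0])
b = -3.0

points = np.array([[1.0, 1.0],   # one point on each side of the line
                   [3.0, 3.0]])

# A linear classifier labels each point by the sign of w.x + b
labels = np.sign(points @ w + b)

# Distance from each point to the hyperplane; the SVM hyperplane is the
# one that maximizes the smallest such distance (the margin)
margins = np.abs(points @ w + b) / np.linalg.norm(w)
print(labels)   # [-1.  1.]
```

The support vectors are exactly the points whose margin equals the minimum; moving any other point slightly leaves the learned hyperplane unchanged.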

Image credit: https://commons.wikimedia.org/wiki/File:Svm_max_sep_hyperplane_with_margin.png

Slide 3

Limitations of the Basic Method

Does not work if the data is not linearly separable

Can only be used to classify between two classes

Can only perform linear classification

Slide 4

Improvement: Working with Non-Linearly Separable Classes

Soft margin SVM

Hinge loss function

Penalize points that go over the line, proportional to the distance over

Add a tuning parameter

Weights how important being on the correct side is compared to creating a large margin
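The trade-off described above can be sketched as subgradient descent on the hinge loss. The synthetic data, learning rate, and the conventional name C for the tuning parameter are illustrative assumptions, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two slightly overlapping clusters (synthetic data for illustration)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)

C = 1.0           # tuning parameter: penalty weight vs. margin width
lr = 0.01
w, b = np.zeros(2), 0.0

# Minimize 0.5*||w||^2 + C * sum(max(0, 1 - y*(w.x + b)))
for _ in range(200):
    m = y * (X @ w + b)
    viol = m < 1                     # points inside or beyond the margin
    grad_w = w - C * (y[viol][:, None] * X[viol]).sum(axis=0)
    grad_b = -C * y[viol].sum()
    w -= lr * grad_w
    b -= lr * grad_b

acc = np.mean(np.sign(X @ w + b) == y)
```

Larger C punishes margin violations more heavily; smaller C accepts some misclassified points in exchange for a wider margin.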

Image

credit:

http://efavdb.com/svm-classification/

Slide 5

Improvement: Classification with More Than Two Classes

Create multiple binary SVMs and have a vote

Method 1: one vs all

N classifiers, each deciding whether a point belongs to its class or not

The most confident classifier wins

Method 2: one vs one

N(N − 1)/2 classifiers: one for each pair of classes

The class that is voted for by the greatest number of classifiers wins
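The one-vs-one vote can be sketched as follows. To keep the example short, each pairwise "classifier" is just a midpoint hyperplane between class centroids, standing in for a trained binary SVM; the data and names are illustrative:

```python
import numpy as np
from itertools import combinations

# Stand-in binary classifier: hyperplane halfway between the two class
# centroids (a real implementation would train a binary SVM per pair).
def train_pair(Xa, Xb):
    ca, cb = Xa.mean(axis=0), Xb.mean(axis=0)
    w = cb - ca
    bias = -w @ (ca + cb) / 2
    return w, bias

rng = np.random.default_rng(1)
classes = [0, 1, 2]
centers = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])
data = {c: centers[c] + rng.normal(0, 0.5, (15, 2)) for c in classes}

# One vs one: N(N-1)/2 classifiers, one per pair of classes
pair_models = {(a, c): train_pair(data[a], data[c])
               for a, c in combinations(classes, 2)}

def predict(x):
    votes = np.zeros(len(classes))
    for (a, c), (w, bias) in pair_models.items():
        winner = c if x @ w + bias > 0 else a
        votes[winner] += 1            # each pairwise classifier casts a vote
    return int(np.argmax(votes))

print(predict(np.array([5.0, 0.0])))  # near class 1's center
```

With 3 classes this builds 3 classifiers; the predicted class is the one with the most pairwise wins.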

Image credit:

http://courses.media.mit.edu/2006fall/mas622j/Projects/aisen-project/

Slide 6

Improvement: Performing Non-Linear Classification

The kernel trick

Map your data into a higher-dimensional space using some kernel

In this example, the radial basis kernel is used

The z value is Gaussian(radius from origin)

Perform linear classification in the higher-dimensional space
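The slide's mapping can be made concrete with an explicit lift: add a z coordinate equal to a Gaussian of each point's radius, after which a flat plane separates the classes. (A real kernel SVM never builds the lifted space explicitly; the kernel only supplies inner products in it. The circle-in-ring data here is an illustrative assumption.)

```python
import numpy as np

rng = np.random.default_rng(2)

# Inner disc (class +1) surrounded by an outer ring (class -1):
# not linearly separable in the original 2D space
theta = rng.uniform(0, 2 * np.pi, 40)
r = np.concatenate([rng.uniform(0, 1, 20), rng.uniform(2, 3, 20)])
X = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
y = np.array([1] * 20 + [-1] * 20)

# Explicit version of the slide's mapping: z = Gaussian(radius from origin)
z = np.exp(-np.sum(X**2, axis=1))

# In the lifted space, a horizontal plane z = t separates the classes
t = 0.5 * (z[:20].min() + z[20:].max())
pred = np.where(z > t, 1, -1)
acc = np.mean(pred == y)   # 1.0: linearly separable after the mapping
```

Inner points have large z (radius near 0), outer points have z near 0, so a single linear threshold on z classifies perfectly.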

Image credit: http://www.bindichen.co.uk/post/AI/Nonlinear-Support-Vector-Machines.html