Classification: Probabilistic Generative Model

Uploaded by callie, 2023-10-04


Presentation Transcript

1. Classification: Probabilistic Generative Model

2. Classification

Credit Scoring
- Input: income, savings, profession, age, past financial history, ...
- Output: accept or refuse

Medical Diagnosis
- Input: current symptoms, age, gender, past medical history, ...
- Output: which kind of disease

Handwritten character recognition
- Input: an image of a character (e.g. 金); output: the character

Face recognition
- Input: an image of a face; output: the person

In each case, classification is a function whose input is x and whose output is a class n.

3. Example Application   

4. Example Application

A Pokémon is described by its stats (example values: 35, 55, 40, 50, 50, 90):
- HP: hit points, or health; how much damage a Pokémon can withstand before fainting
- Attack: the base modifier for normal attacks (e.g. Scratch, Punch)
- Defense: the base damage resistance against normal attacks
- SP Atk: special attack, the base modifier for special attacks (e.g. Fire Blast, Bubble Beam)
- SP Def: the base damage resistance against special attacks
- Speed: determines which Pokémon attacks first each round

Can we predict the "type" of a Pokémon based on this information? (This refers to the Pokémon games, NOT Pokémon cards or Pokémon Go.)

5. Example Application

6. Ideal Alternatives

- Function (Model): if g(x) > 0, output = class 1; otherwise, output = class 2
- Loss function: L(f) = the number of times f gets incorrect results on the training data
- Find the best function: e.g. Perceptron, SVM (not today)

7. How to do Classification

Training data for classification: pairs (x, ŷ), where ŷ is the class label.

Classification as regression? Take binary classification as an example:
- Training: class 1 means the target is +1; class 2 means the target is -1
- Testing: if the output is closer to +1, predict class 1; if closer to -1, predict class 2

8. Multiple classes: class 1 means the target is 1; class 2 means the target is 2; class 3 means the target is 3, and so on. This is problematic, because it imposes an ordering and spacing on the classes that does not exist.

Even in the binary case with y = b + w1 x1 + w2 x2 and decision boundary b + w1 x1 + w2 x2 = 0, regression penalizes examples that are "too correct" (outputs far beyond +1 or -1). To decrease the squared error, the boundary shifts toward those points even though they are already classified correctly. (Bishop, p. 186)
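The "too correct" effect can be reproduced with a small least-squares sketch. All numbers here are made-up 1-D data, not anything from the slides; adding far-away points that are already on the correct side still drags the learned boundary.

```python
import numpy as np

# Least-squares "classification": fit y = b + w*x with targets +1 / -1,
# then classify by the sign of the output.

def fit_ls(x, t):
    X = np.stack([np.ones_like(x), x], axis=1)   # design matrix [1, x]
    return np.linalg.lstsq(X, t, rcond=None)[0]  # [b, w] minimising squared error

x1 = np.array([0.0, 1.0, 2.0])   # class 1, target +1
x2 = np.array([5.0, 6.0, 7.0])   # class 2, target -1
w = fit_ls(np.concatenate([x1, x2]),
           np.array([1., 1., 1., -1., -1., -1.]))
boundary = -w[0] / w[1]          # where b + w*x = 0 (here: 3.5, in the middle)

# Add hypothetical "too correct" class-2 points far to the right:
x2_far = np.concatenate([x2, [30.0, 35.0]])
w2 = fit_ls(np.concatenate([x1, x2_far]),
            np.array([1., 1., 1., -1., -1., -1., -1., -1.]))
boundary_far = -w2[0] / w2[1]    # boundary shifts past some class-2 points

print(boundary, boundary_far)
```

With the extra points the boundary moves from 3.5 to roughly 5.1, so the class-2 point at x = 5 is now on the wrong side, even though the added points were never misclassified.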

9. Two Boxes

- Box 1: P(B1) = 2/3, P(Blue|B1) = 4/5, P(Green|B1) = 1/5
- Box 2: P(B2) = 1/3, P(Blue|B2) = 2/5, P(Green|B2) = 3/5

A blue ball is drawn from one of the boxes. Where does it come from? Bayes' rule gives

P(B1|Blue) = P(Blue|B1) P(B1) / ( P(Blue|B1) P(B1) + P(Blue|B2) P(B2) )
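The box posterior can be checked with exact arithmetic; this short sketch just applies Bayes' rule to the numbers on the slide:

```python
from fractions import Fraction as F

# Two-box example: priors and per-box probabilities of drawing Blue.
p_b1, p_b2 = F(2, 3), F(1, 3)
p_blue_b1, p_blue_b2 = F(4, 5), F(2, 5)

p_blue = p_blue_b1 * p_b1 + p_blue_b2 * p_b2   # total probability of Blue
p_b1_blue = p_blue_b1 * p_b1 / p_blue          # posterior P(B1 | Blue)
print(p_b1_blue)  # 4/5
```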

10. Two Classes

Given an x, which class does it belong to? By Bayes' rule,

P(C1|x) = P(x|C1) P(C1) / ( P(x|C1) P(C1) + P(x|C2) P(C2) )

This is a generative model: estimate the priors P(C1), P(C2) and the class-conditional probabilities P(x|C1), P(x|C2) from the training data of class 1 and class 2. Together they also give P(x), so the model could generate x itself.

11. Prior

Water and Normal type Pokémon with ID < 400 are used for training, the rest for testing.

Training set: 79 Water (class 1), 61 Normal (class 2)
- P(C1) = 79 / (79 + 61) = 0.56
- P(C2) = 61 / (79 + 61) = 0.44

12. Probability from Class

There are 79 Water-type Pokémon in the training set. For a new Pokémon x that is not among them, what is P(x|Water) = P(x|C1)?

Each Pokémon is represented as a vector of its attributes: its feature vector.

13. Probability from Class - Feature

Considering only Defense and SP Defense, each Water-type Pokémon x^1, x^2, ..., x^79 is a point in 2-D space. Is P(x|Water) = 0 for a new point that is not in the training set? No: assume the points are sampled from a Gaussian distribution.

14. Gaussian Distribution

f_{μ,Σ}(x) = 1 / ( (2π)^(D/2) |Σ|^(1/2) ) · exp( -(1/2) (x - μ)^T Σ^(-1) (x - μ) )

Input: vector x; output: the probability density of sampling x. The shape of the function is determined by the mean μ and the covariance matrix Σ. (Figure source: https://blog.slinuxer.com/tag/pca)

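A minimal NumPy implementation of the Gaussian density formula above; the 2-D mean and covariance are made-up stand-ins for Defense / SP Defense statistics:

```python
import numpy as np

# Multivariate Gaussian density:
# f(x) = 1 / ((2*pi)^(D/2) |Sigma|^(1/2)) * exp(-0.5 (x-mu)^T Sigma^-1 (x-mu))

def gaussian_pdf(x, mu, sigma):
    d = len(mu)
    diff = x - mu
    norm = 1.0 / ((2 * np.pi) ** (d / 2) * np.sqrt(np.linalg.det(sigma)))
    return norm * np.exp(-0.5 * diff @ np.linalg.inv(sigma) @ diff)

# Hypothetical 2-D parameters (Defense, SP Defense):
mu = np.array([75.0, 70.0])
sigma = np.array([[600.0, 150.0],
                  [150.0, 400.0]])
print(gaussian_pdf(mu, mu, sigma))  # the density is highest at the mean
```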

16. Probability from Class

Assume the points x^1, ..., x^79 are sampled from a Gaussian distribution. Find the Gaussian (μ*, Σ*) behind them; then the probability for a new point x is P(x|Water) = f_{μ*,Σ*}(x). How do we find μ* and Σ*?

17. Maximum Likelihood

A Gaussian with any mean μ and any covariance matrix Σ could have generated these points, but with different likelihood. The likelihood of a Gaussian (μ, Σ) is the probability that it samples x^1, ..., x^79:

L(μ, Σ) = f_{μ,Σ}(x^1) f_{μ,Σ}(x^2) ⋯ f_{μ,Σ}(x^79)

18. Maximum Likelihood

We have the 79 "Water" type Pokémon x^1, ..., x^79. We assume they are generated from the Gaussian (μ*, Σ*) with the maximum likelihood:

μ*, Σ* = arg max_{μ,Σ} L(μ, Σ)

The solution has a closed form:

μ* = (1/79) Σ_{n=1}^{79} x^n   (the average of the points)
Σ* = (1/79) Σ_{n=1}^{79} (x^n - μ*)(x^n - μ*)^T
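The closed-form maximum-likelihood solution can be sketched directly. The data below are synthetic stand-ins, not the actual Pokémon statistics:

```python
import numpy as np

# Maximum-likelihood Gaussian fit (closed form): mu* is the average of the
# points; Sigma* is the average outer product of the deviations from mu*.

def fit_gaussian(points):
    mu = points.mean(axis=0)
    diff = points - mu
    sigma = diff.T @ diff / len(points)   # divide by N (MLE), not N-1
    return mu, sigma

# Hypothetical (Defense, SP Defense) values standing in for the Water class:
rng = np.random.default_rng(0)
x = rng.normal([75.0, 70.0], [20.0, 15.0], size=(79, 2))
mu, sigma = fit_gaussian(x)
print(mu, sigma)
```

Note that the MLE divides by N; NumPy's `np.cov` divides by N-1 unless `bias=True` is passed.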

19. Maximum Likelihood

Applying these formulas separately to the training data of each class gives (μ^1, Σ^1) for class 1 (Water) and (μ^2, Σ^2) for class 2 (Normal).

20. Now we can do classification

P(C1|x) = P(x|C1) P(C1) / ( P(x|C1) P(C1) + P(x|C2) P(C2) )

with P(C1) = 79 / (79 + 61) = 0.56, P(C2) = 61 / (79 + 61) = 0.44, P(x|C1) = f_{μ^1,Σ^1}(x), and P(x|C2) = f_{μ^2,Σ^2}(x). If P(C1|x) > 0.5, x belongs to class 1 (Water).
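Putting the pieces together, a minimal end-to-end sketch of the classifier (priors, per-class Gaussians, posterior), again on synthetic stand-in data rather than the real Kaggle set:

```python
import numpy as np

# Generative classifier: estimate priors and per-class Gaussians, then
# compute the posterior P(C1|x) by Bayes' rule.

def gaussian_pdf(x, mu, sigma):
    d = len(mu)
    diff = x - mu
    norm = 1.0 / ((2 * np.pi) ** (d / 2) * np.sqrt(np.linalg.det(sigma)))
    return norm * np.exp(-0.5 * diff @ np.linalg.inv(sigma) @ diff)

def fit_gaussian(points):
    mu = points.mean(axis=0)
    diff = points - mu
    return mu, diff.T @ diff / len(points)

rng = np.random.default_rng(1)
water = rng.normal([75.0, 70.0], 15.0, size=(79, 2))   # class 1 stand-in
normal = rng.normal([55.0, 50.0], 15.0, size=(61, 2))  # class 2 stand-in

p_c1 = len(water) / (len(water) + len(normal))   # 79/140 = 0.56
p_c2 = 1 - p_c1                                  # 61/140 = 0.44
mu1, sig1 = fit_gaussian(water)
mu2, sig2 = fit_gaussian(normal)

def posterior_c1(x):
    a = gaussian_pdf(x, mu1, sig1) * p_c1
    b = gaussian_pdf(x, mu2, sig2) * p_c2
    return a / (a + b)

x = np.array([80.0, 75.0])   # a point near the "Water" cluster
print(posterior_c1(x))       # should exceed 0.5, i.e. classify as Water
```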

21. How's the result?

Testing data, 2 features (Defense, SP Defense): 47% accuracy. Blue points: C1 (Water); red points: C2 (Normal).

Using all 6 features (HP, Att, SP Att, Def, SP Def, Speed), x, μ^1, μ^2 become 6-dim vectors and Σ^1, Σ^2 become 6 x 6 matrices: 64% accuracy...

22. Modifying Model

Give class 1 (Water) and class 2 (Normal) the same covariance matrix Σ, keeping separate means μ^1 and μ^2. Sharing Σ means fewer parameters to estimate.

23. Modifying Model

Maximum likelihood with a shared covariance: the "Water" type Pokémon x^1, ..., x^79 and the "Normal" type Pokémon x^80, ..., x^140 give the likelihood

L(μ^1, μ^2, Σ) = f_{μ^1,Σ}(x^1) ⋯ f_{μ^1,Σ}(x^79) · f_{μ^2,Σ}(x^80) ⋯ f_{μ^2,Σ}(x^140)

Find μ^1, μ^2, Σ maximizing the likelihood: μ^1 and μ^2 are the same as before (the class averages), and

Σ = (79/140) Σ^1 + (61/140) Σ^2

(Ref: Bishop, chapter 4.2.2)
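The shared covariance is just the size-weighted average of the per-class maximum-likelihood covariances; a sketch under the same synthetic-data assumption:

```python
import numpy as np

# Shared covariance matrix, weighted by class sizes:
# Sigma = (N1/(N1+N2)) * Sigma1 + (N2/(N1+N2)) * Sigma2

def fit_shared(points1, points2):
    mu1, mu2 = points1.mean(axis=0), points2.mean(axis=0)
    n1, n2 = len(points1), len(points2)
    d1, d2 = points1 - mu1, points2 - mu2
    sigma1 = d1.T @ d1 / n1
    sigma2 = d2.T @ d2 / n2
    sigma = (n1 * sigma1 + n2 * sigma2) / (n1 + n2)
    return mu1, mu2, sigma

rng = np.random.default_rng(2)
water = rng.normal([75.0, 70.0], 15.0, size=(79, 2))   # stand-in data
normal = rng.normal([55.0, 50.0], 15.0, size=(61, 2))
mu1, mu2, sigma = fit_shared(water, normal)
print(sigma)   # one covariance matrix shared by both classes
```

With the shared Σ, the quadratic terms of the two classes cancel in the log-odds, which is why the decision boundary on the next slide is linear.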

24. Modifying Model

With the same covariance matrix for both classes, the decision boundary becomes linear. With the two features (Defense, SP Defense): 54% accuracy. With all six features (HP, Att, SP Att, Def, SP Def, Speed): 73% accuracy.

25. Three Steps

- Function set (model): if P(C1|x) > 0.5, output class 1; otherwise, output class 2
- Goodness of a function: the mean μ and covariance Σ that maximize the likelihood (the probability of generating the data)
- Find the best function: easy (closed-form solution)

26. Probability Distribution

You can always use the distribution you like. If you assume all the dimensions are independent,

P(x|C1) = P(x_1|C1) P(x_2|C1) ⋯ P(x_K|C1)

with each factor a 1-D Gaussian, then you are using a Naive Bayes classifier. For binary features, you may assume they come from Bernoulli distributions.
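A minimal sketch of the Naive Bayes variant, modeling each feature with an independent 1-D Gaussian; the class data are illustrative stand-ins:

```python
import numpy as np

# Naive Bayes: treat each dimension as an independent 1-D Gaussian,
# so P(x|C) is a product of per-feature densities.

def gauss1d(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

class GaussianNB:
    def fit(self, points1, points2):
        self.mu1, self.var1 = points1.mean(0), points1.var(0)   # per-feature
        self.mu2, self.var2 = points2.mean(0), points2.var(0)
        n1, n2 = len(points1), len(points2)
        self.p1, self.p2 = n1 / (n1 + n2), n2 / (n1 + n2)       # priors
        return self

    def posterior_c1(self, x):
        a = np.prod(gauss1d(x, self.mu1, self.var1)) * self.p1
        b = np.prod(gauss1d(x, self.mu2, self.var2)) * self.p2
        return a / (a + b)

rng = np.random.default_rng(3)
water = rng.normal([75.0, 70.0], 15.0, size=(79, 2))
normal = rng.normal([55.0, 50.0], 15.0, size=(61, 2))
nb = GaussianNB().fit(water, normal)
print(nb.posterior_c1(np.array([80.0, 75.0])))   # should be well above 0.5
```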

27. Posterior Probability

P(C1|x) = P(x|C1) P(C1) / ( P(x|C1) P(C1) + P(x|C2) P(C2) )
        = 1 / ( 1 + P(x|C2) P(C2) / ( P(x|C1) P(C1) ) )
        = 1 / ( 1 + exp(-z) ) = σ(z)

where z = ln [ P(x|C1) P(C1) / ( P(x|C2) P(C2) ) ] and σ is the sigmoid function.

28. Warning of Math

29. Posterior Probability

P(C1|x) = σ(z), with

z = ln [ P(x|C1) P(C1) / ( P(x|C2) P(C2) ) ] = ln [ P(x|C1) / P(x|C2) ] + ln [ P(C1) / P(C2) ]

where P(C1)/P(C2) = ( N1/(N1+N2) ) / ( N2/(N1+N2) ) = N1/N2.

30. Expanding the first term with the Gaussian class-conditionals:

ln [ P(x|C1) / P(x|C2) ] = ln ( |Σ^2|^(1/2) / |Σ^1|^(1/2) ) - (1/2) (x - μ^1)^T (Σ^1)^(-1) (x - μ^1) + (1/2) (x - μ^2)^T (Σ^2)^(-1) (x - μ^2)

31. With a shared covariance matrix Σ^1 = Σ^2 = Σ, the determinant term vanishes and the quadratic terms x^T Σ^(-1) x cancel, leaving

z = (μ^1 - μ^2)^T Σ^(-1) x - (1/2) (μ^1)^T Σ^(-1) μ^1 + (1/2) (μ^2)^T Σ^(-1) μ^2 + ln(N1/N2)

which is linear in x: z = w · x + b.

32. End of Warning

33. So with a shared covariance, P(C1|x) = σ(w · x + b).

In the generative model, we estimate N1, N2, μ^1, μ^2, Σ; then we have w and b. How about directly finding w and b?
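The algebra above can be checked numerically: with a shared Σ, the generative posterior and σ(w · x + b) coincide. All parameter values below are made up for the check:

```python
import numpy as np

# Verify that P(C1|x) from Bayes' rule equals sigmoid(w.x + b) with
#   w   = Sigma^-1 (mu1 - mu2)
#   b   = -0.5 mu1^T Sigma^-1 mu1 + 0.5 mu2^T Sigma^-1 mu2 + ln(N1/N2)

def gaussian_pdf(x, mu, sigma):
    d = len(mu)
    diff = x - mu
    norm = 1.0 / ((2 * np.pi) ** (d / 2) * np.sqrt(np.linalg.det(sigma)))
    return norm * np.exp(-0.5 * diff @ np.linalg.inv(sigma) @ diff)

n1, n2 = 79, 61
mu1 = np.array([75.0, 70.0])
mu2 = np.array([55.0, 50.0])
sigma = np.array([[400.0, 100.0],
                  [100.0, 300.0]])
inv = np.linalg.inv(sigma)

w = inv @ (mu1 - mu2)
b = -0.5 * mu1 @ inv @ mu1 + 0.5 * mu2 @ inv @ mu2 + np.log(n1 / n2)

x = np.array([68.0, 62.0])
post = gaussian_pdf(x, mu1, sigma) * n1 / (gaussian_pdf(x, mu1, sigma) * n1
                                           + gaussian_pdf(x, mu2, sigma) * n2)
sig = 1 / (1 + np.exp(-(w @ x + b)))
print(post, sig)   # the two numbers agree
```

Finding w and b directly, without estimating the Gaussian parameters first, is exactly logistic regression.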

34. Reference

- Bishop: Chapter 4.1 - 4.2
- Data: https://www.kaggle.com/abcsds/pokemon
- Useful posts:
  - https://www.kaggle.com/nishantbhadauria/d/abcsds/pokemon/pokemon-speed-attack-hp-defense-analysis-by-type
  - https://www.kaggle.com/nikos90/d/abcsds/pokemon/mastering-pokebars/discussion
  - https://www.kaggle.com/ndrewgele/d/abcsds/pokemon/visualizing-pok-mon-stats-with-seaborn/discussion

35. Acknowledgment

- Thanks to 江貫榮 for spotting a date error on the course webpage
- Thanks to 范廷瀚 for providing Pokémon domain knowledge
- Thanks to Victor Chen for finding typos on the slides