Logistic Regression Classification



Presentation Transcript

1. Logistic Regression: Classification (Machine Learning)

2. Classification
Email: Spam / Not Spam?
Online Transactions: Fraudulent (Yes / No)?
Tumor: Malignant / Benign?
0: "Negative Class" (e.g., benign tumor)
1: "Positive Class" (e.g., malignant tumor)

3. Threshold classifier: threshold the output h_θ(x) at 0.5:
If h_θ(x) ≥ 0.5, predict "y = 1"
If h_θ(x) < 0.5, predict "y = 0"
[Plot: Malignant? (1 = Yes, 0 = No) vs. Tumor Size]

4. Classification: y = 0 or 1.
A linear regression hypothesis h_θ(x) can output values > 1 or < 0.
Logistic Regression: 0 ≤ h_θ(x) ≤ 1.

5. Logistic Regression: Hypothesis Representation (Machine Learning)

6. Logistic Regression Model
Want 0 ≤ h_θ(x) ≤ 1.
h_θ(x) = g(θᵀx), where g(z) = 1 / (1 + e^(−z)).
g is called the sigmoid function (or logistic function).
[Plot: sigmoid curve rising from 0, through 0.5 at z = 0, toward 1]
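
As a concrete illustration of the model on this slide, here is a minimal Octave sketch of the sigmoid and the hypothesis; the names g and h are ours, not from the slides.

g = @(z) 1 ./ (1 + exp(-z));     % sigmoid / logistic function g(z) = 1 / (1 + e^(-z))
h = @(theta, x) g(theta' * x);   % hypothesis h_theta(x) for a single example x (column vector)

g(0)      % ans = 0.5
g(10)     % close to 1
g(-10)    % close to 0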

7. Interpretation of Hypothesis Output
h_θ(x) = estimated probability that y = 1 on input x.
Example: if h_θ(x) = 0.7 for a patient's tumor features, tell the patient there is a 70% chance of the tumor being malignant.
Formally, h_θ(x) = P(y = 1 | x; θ): the "probability that y = 1, given x, parameterized by θ".
Since y is either 0 or 1, P(y = 0 | x; θ) = 1 − P(y = 1 | x; θ).

8. Logistic Regression: Decision Boundary (Machine Learning)

9. Logistic regression: h_θ(x) = g(θᵀx), with g the sigmoid.
Suppose we predict "y = 1" if h_θ(x) ≥ 0.5; since g(z) ≥ 0.5 exactly when z ≥ 0, this means predicting "y = 1" whenever θᵀx ≥ 0.
Likewise, predict "y = 0" if h_θ(x) < 0.5, i.e. whenever θᵀx < 0.

10. Decision Boundary
Example: h_θ(x) = g(θ₀ + θ₁x₁ + θ₂x₂) with θ = [−3; 1; 1].
Predict "y = 1" if −3 + x₁ + x₂ ≥ 0, i.e. if x₁ + x₂ ≥ 3.
The line x₁ + x₂ = 3 is the decision boundary.
[Plot: x₁ vs. x₂ with the line x₁ + x₂ = 3 separating the two classes]

11. Non-linear decision boundaries
Adding polynomial features, e.g. h_θ(x) = g(θ₀ + θ₁x₁ + θ₂x₂ + θ₃x₁² + θ₄x₂²) with θ = [−1; 0; 0; 1; 1]:
Predict "y = 1" if −1 + x₁² + x₂² ≥ 0, i.e. if x₁² + x₂² ≥ 1.
The decision boundary is the unit circle x₁² + x₂² = 1.
[Plot: x₁ vs. x₂ with a circular decision boundary through ±1]
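
The two decision rules above can be checked numerically; the following Octave sketch uses the example parameter values from slides 10 and 11 together with made-up feature values, purely for illustration.

g = @(z) 1 ./ (1 + exp(-z));

% Linear boundary: theta = [-3; 1; 1], predict y = 1 when x1 + x2 >= 3
theta_lin = [-3; 1; 1];
x = [1; 2; 2];                                  % features [x0; x1; x2] with x0 = 1
predict_lin = (g(theta_lin' * x) >= 0.5)        % same test as theta' * x >= 0; here 1

% Non-linear boundary: theta = [-1; 0; 0; 1; 1] over [1; x1; x2; x1^2; x2^2]
theta_circ = [-1; 0; 0; 1; 1];
x1 = 0.5; x2 = 0.5;
feats = [1; x1; x2; x1^2; x2^2];
predict_circ = (g(theta_circ' * feats) >= 0.5)  % inside the unit circle, so 0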

12. Logistic Regression: Cost Function (Machine Learning)

13. Training set: {(x⁽¹⁾, y⁽¹⁾), (x⁽²⁾, y⁽²⁾), …, (x⁽ᵐ⁾, y⁽ᵐ⁾)} of m examples, with y ∈ {0, 1} and hypothesis h_θ(x) = 1 / (1 + e^(−θᵀx)).
How do we choose the parameters θ?

14. Cost function
Linear regression uses the squared-error cost J(θ) = (1/m) Σᵢ ½ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾)².
Plugging the sigmoid hypothesis into this cost makes J(θ) "non-convex" (many local optima), so we want a different, "convex" cost for logistic regression.

15. Logistic regression cost function
If y = 1: Cost(h_θ(x), y) = −log(h_θ(x)).
The cost is 0 when h_θ(x) = 1 and grows to ∞ as h_θ(x) → 0, so a confident wrong prediction is penalized heavily.
[Plot: −log(h_θ(x)) for h_θ(x) ∈ (0, 1]]

16. Logistic regression cost function
If y = 0: Cost(h_θ(x), y) = −log(1 − h_θ(x)).
The cost is 0 when h_θ(x) = 0 and grows to ∞ as h_θ(x) → 1.
[Plot: −log(1 − h_θ(x)) for h_θ(x) ∈ [0, 1)]

17. Logistic Regression: Simplified Cost Function and Gradient Descent (Machine Learning)

18. Logistic regression cost function
The two cases combine into a single expression: Cost(h_θ(x), y) = −y log(h_θ(x)) − (1 − y) log(1 − h_θ(x)), so that
J(θ) = (1/m) Σᵢ Cost(h_θ(x⁽ⁱ⁾), y⁽ⁱ⁾).

19. Logistic regression cost function
J(θ) = −(1/m) Σᵢ [ y⁽ⁱ⁾ log h_θ(x⁽ⁱ⁾) + (1 − y⁽ⁱ⁾) log(1 − h_θ(x⁽ⁱ⁾)) ]
To fit parameters θ: minimize J(θ) over θ.
To make a prediction given a new x: output h_θ(x) = 1 / (1 + e^(−θᵀx)), the estimated probability that y = 1; see the sketch below.
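
A minimal Octave sketch of evaluating this cost and making a prediction, assuming a design matrix X (m × (n+1), first column all ones), a label vector y of 0s and 1s, and a parameter vector theta; these variable names are ours, not from the slides.

sigmoid = @(z) 1 ./ (1 + exp(-z));
m = length(y);

h = sigmoid(X * theta);                                  % h_theta(x) for every training example
J = -(1 / m) * (y' * log(h) + (1 - y)' * log(1 - h));    % J(theta) as on this slide

x_new  = X(1, :)';                                       % score one example (here just the first row)
p_new  = sigmoid(theta' * x_new);                        % estimated P(y = 1 | x_new; theta)
y_pred = (p_new >= 0.5);                                 % threshold at 0.5, as earlier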

20. Gradient Descent
Want min over θ of J(θ). Repeat {
  θⱼ := θⱼ − α ∂J(θ)/∂θⱼ
} (simultaneously update all θⱼ).

21. Gradient Descent
Plugging in the derivative of J(θ), the update becomes: Repeat {
  θⱼ := θⱼ − α (1/m) Σᵢ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾) xⱼ⁽ⁱ⁾
} (simultaneously update all θⱼ).
The algorithm looks identical to linear regression; only the definition of h_θ(x) has changed.
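
A minimal Octave sketch of this update rule, under the same assumed X and y as in the earlier sketch, with an illustrative learning rate and iteration count.

alpha     = 0.1;                  % learning rate (illustrative value)
num_iters = 400;                  % number of iterations (illustrative value)
m         = length(y);
theta     = zeros(size(X, 2), 1); % initialize all parameters to zero

for iter = 1:num_iters
  h     = 1 ./ (1 + exp(-X * theta));      % h_theta(x) for every example
  grad  = (1 / m) * X' * (h - y);          % (1/m) * sum_i (h - y) * x_j, for all j at once
  theta = theta - alpha * grad;            % simultaneous update of all theta_j
end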

22. Logistic Regression: Advanced Optimization (Machine Learning)

23. Optimization algorithm
Cost function J(θ); want min over θ of J(θ).
Given θ, we have code that can compute J(θ) and ∂J(θ)/∂θⱼ (for j = 0, 1, …, n).
Gradient descent: Repeat { θⱼ := θⱼ − α ∂J(θ)/∂θⱼ }.

24. Optimization algorithm
Given θ, we have code that can compute J(θ) and ∂J(θ)/∂θⱼ (for j = 0, 1, …, n).
Optimization algorithms: gradient descent, conjugate gradient, BFGS, L-BFGS.
Advantages of the latter three: no need to manually pick the learning rate α; often faster than gradient descent.
Disadvantage: more complex.

25. Example: minimize J(θ) = (θ₁ − 5)² + (θ₂ − 5)² with fminunc.

function [jVal, gradient] = costFunction(theta)
  % Cost and gradient for J(theta) = (theta(1) - 5)^2 + (theta(2) - 5)^2
  jVal = (theta(1) - 5)^2 + (theta(2) - 5)^2;
  gradient = zeros(2, 1);
  gradient(1) = 2 * (theta(1) - 5);
  gradient(2) = 2 * (theta(2) - 5);
end

options = optimset('GradObj', 'on', 'MaxIter', 100);
initialTheta = zeros(2, 1);
[optTheta, functionVal, exitFlag] = fminunc(@costFunction, initialTheta, options);
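
Here optimset('GradObj', 'on') tells fminunc that costFunction returns the gradient as its second output, so the solver uses it instead of estimating derivatives numerically; for this example the minimum is at θ = [5; 5], which is where optTheta should converge.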

26. In general, with theta = [θ₀; θ₁; …; θₙ] stored as an Octave vector:

function [jVal, gradient] = costFunction(theta)
  jVal = [code to compute J(θ)];
  gradient(1) = [code to compute ∂J(θ)/∂θ₀];
  gradient(2) = [code to compute ∂J(θ)/∂θ₁];
  ...
  gradient(n+1) = [code to compute ∂J(θ)/∂θₙ];
end
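
For logistic regression specifically, the placeholders above could be filled in as follows (a vectorized sketch, not from the slides; the training data X and y are assumed to be passed in, e.g. through an anonymous function as in the usage comment).

function [jVal, gradient] = costFunction(theta, X, y)
  % Logistic regression cost J(theta) and its gradient, vectorized.
  m        = length(y);
  h        = 1 ./ (1 + exp(-X * theta));                        % h_theta for all examples
  jVal     = -(1 / m) * (y' * log(h) + (1 - y)' * log(1 - h));  % J(theta)
  gradient = (1 / m) * X' * (h - y);                            % all partial derivatives at once
end

% Usage with fminunc, passing X and y through an anonymous function:
% options = optimset('GradObj', 'on', 'MaxIter', 400);
% initialTheta = zeros(size(X, 2), 1);
% [optTheta, jVal] = fminunc(@(t) costFunction(t, X, y), initialTheta, options);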

27. Logistic Regression: Multi-class Classification, One-vs-all (Machine Learning)

28. Multiclass classification
Email foldering/tagging: Work, Friends, Family, Hobby
Medical diagnosis: Not ill, Cold, Flu
Weather: Sunny, Cloudy, Rain, Snow

29. Binary classification vs. multi-class classification
[Plots: x₁ vs. x₂ with two classes (binary) and with three classes (multi-class)]

30. One-vs-all (one-vs-rest)
Split the multi-class problem into one binary problem per class: Class 1 vs. the rest, Class 2 vs. the rest, Class 3 vs. the rest, and train a separate logistic regression classifier h_θ⁽ⁱ⁾(x) for each.
[Plots: x₁ vs. x₂, one panel per class, with that class as the positive examples and all other classes as negative]

31. One-vs-all
Train a logistic regression classifier h_θ⁽ⁱ⁾(x) for each class i to predict the probability that y = i.
On a new input x, to make a prediction, pick the class i that maximizes h_θ⁽ⁱ⁾(x); a sketch follows.
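
A compact Octave sketch of one-vs-all, assuming labels y take values 1, …, K, the design matrix X from the earlier sketches, and the logistic-regression costFunction sketched under slide 26; all names here are illustrative.

K = 3;                                        % number of classes (illustrative)
options = optimset('GradObj', 'on', 'MaxIter', 400);
all_theta = zeros(size(X, 2), K);

% Train one binary classifier per class: class k vs. the rest
for k = 1:K
  y_k = double(y == k);                       % relabel: 1 for class k, 0 for all other classes
  all_theta(:, k) = fminunc(@(t) costFunction(t, X, y_k), zeros(size(X, 2), 1), options);
end

% Predict: pick the class whose classifier reports the highest probability
x_new = X(1, :)';                             % score one example (here just the first row)
probs = 1 ./ (1 + exp(-all_theta' * x_new));  % h_theta^(k)(x_new) for k = 1..K
[~, predicted_class] = max(probs);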