Third Generation Machine Intelligence
Christopher M. Bishop, Microsoft Research, Cambridge
Microsoft Research Summer School 2009

Presentation Transcript

1. Third Generation Machine Intelligence
Christopher M. Bishop
Microsoft Research, Cambridge
Microsoft Research Summer School 2009

2. First Generation: “Artificial Intelligence” (GOFAI)
“Within a generation ... the problem of creating ‘artificial intelligence’ will largely be solved” (Marvin Minsky, 1967)
Expert systems: rules devised by humans
Combinatorial explosion
General theme: hand-crafted rules

3. Second Generation
Neural networks, support vector machines
Difficult to incorporate complex domain knowledge
General theme: black-box statistical models

4. Third Generation
General theme: deep integration of domain knowledge and statistical learning
Probabilistic graphical models
Bayesian framework
Fast inference using local message-passing
Origins: Bayesian networks, decision theory, HMMs, Kalman filters, MRFs, mean field theory, ...

5. Bayesian Learning
Consistent use of probability to quantify uncertainty
Predictions involve marginalisation over the unknown parameters
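The equation on this slide did not survive the transcript. The standard Bayesian predictive marginalisation, which is presumably what the slide showed, has the form:

```latex
p(t \mid x, \mathcal{D}) \;=\; \int p(t \mid x, \mathbf{w})\, p(\mathbf{w} \mid \mathcal{D})\, \mathrm{d}\mathbf{w}
```

That is, the prediction for a new input x averages over the posterior uncertainty in the parameters w given the data D, rather than committing to a single point estimate.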

6. Why is prior knowledge important?
[Figure: data y plotted against x, with a queried point marked “?”]

7. Probabilistic Graphical Models
New insights into existing models
Framework for designing new models
Graph-based algorithms for calculation and computation (cf. Feynman diagrams in physics)
Efficient software implementation
Directed graphs to specify the model
Factor graphs for inference and learning
Probability theory + graphs

8. Directed Graphs
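To make the “directed graphs specify the model” idea concrete, here is a minimal sketch of how a three-node directed graph encodes a factorized joint distribution. The graph, the variable names, and all probability values are hypothetical illustrations, not content from the deck:

```python
# A directed graph encodes a factorization of the joint distribution.
# Here a three-node graph  a -> b, a -> c  gives
#     p(a, b, c) = p(a) * p(b | a) * p(c | a).
# All numbers are hypothetical; variables are binary.
from itertools import product

p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}  # p_b_given_a[a][b]
p_c_given_a = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}  # p_c_given_a[a][c]

def joint(a, b, c):
    """Joint probability read straight off the graph structure."""
    return p_a[a] * p_b_given_a[a][b] * p_c_given_a[a][c]

# The joint sums to one, and marginals fall out by summation.
total = sum(joint(a, b, c) for a, b, c in product([0, 1], repeat=3))
p_b1 = sum(joint(a, 1, c) for a, c in product([0, 1], repeat=2))
```

The graph structure tells you which conditional tables you need; inference then reduces to sums of products of those local factors.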

9. Example: Time Series Modelling

10. [figure slide]

11. Manchester Asthma and Allergies Study
Chris Bishop, Iain Buchan, Markus Svensén, Vincent Tan, John Winn

12. [figure slide]

13. Factor Graphs

14. From Directed Graph to Factor Graph

15. Local Message-Passing
Efficient inference by exploiting the factorization of the joint distribution

16. Factor Trees: Separation
[Factor tree: variables v, w, x, y, z with factors f1(v,w), f2(w,x), f3(x,y), f4(x,z)]

17. Messages: From Factors to Variables
[Diagram: variables w, x, y, z with factors f2(w,x), f3(x,y), f4(x,z)]

18. Messages: From Variables to Factors
[Diagram: variable x with factors f2(w,x), f3(x,y), f4(x,z)]
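The message-passing slides above use a factor tree over v, w, x, y, z with factors f1(v,w), f2(w,x), f3(x,y), f4(x,z). A minimal sum-product sketch on that same tree, with hypothetical factor tables and binary variables, shows how the marginal of x comes out of local messages:

```python
# Sum-product on the factor tree from the slides:
#     f1(v,w) - w - f2(w,x) - x - { f3(x,y), f4(x,z) }
# Messages flow from the leaves toward x; the marginal of x is the
# product of all incoming factor-to-variable messages at x.
# Factor tables are hypothetical; all variables are binary.
import numpy as np

f1 = np.array([[1.0, 2.0], [3.0, 1.0]])   # f1[v, w]
f2 = np.array([[2.0, 1.0], [1.0, 4.0]])   # f2[w, x]
f3 = np.array([[1.0, 3.0], [2.0, 1.0]])   # f3[x, y]
f4 = np.array([[2.0, 2.0], [1.0, 5.0]])   # f4[x, z]

# Factor -> variable messages: each sums out the other argument.
m_f1_to_w = f1.sum(axis=0)                           # sum over v
m_f2_to_x = (m_f1_to_w[:, None] * f2).sum(axis=0)    # sum over w
m_f3_to_x = f3.sum(axis=1)                           # sum over y
m_f4_to_x = f4.sum(axis=1)                           # sum over z

marginal_x = m_f2_to_x * m_f3_to_x * m_f4_to_x
marginal_x /= marginal_x.sum()

# Brute-force summation over all assignments agrees with message passing.
brute = np.einsum('vw,wx,xy,xz->x', f1, f2, f3, f4)
brute /= brute.sum()
```

The point of the local scheme is cost: the brute-force sum is exponential in the number of variables, while the messages touch each factor only once.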

19. What if marginalisations are not tractable?
True distribution
Monte Carlo
Variational Bayes
Loopy belief propagation
Expectation propagation
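As a sketch of the first remedy on this slide, Monte Carlo replaces an intractable marginalisation integral with an average over posterior samples. The toy model below is hypothetical, chosen only so the exact answer is known: the posterior over a scalar weight is N(1, 0.5²) and the predictive mean for input x is the posterior average of w·x:

```python
# Monte Carlo marginalisation: approximate
#     E_{w ~ p(w|D)}[ f(w) ]  by  (1/N) * sum_n f(w_n),  w_n ~ p(w|D).
# Hypothetical toy posterior: w ~ N(1, 0.5^2); predictive mean of t at
# input x is then x * E[w] = x exactly, which the estimate should match.
import random

random.seed(0)
x = 2.0
N = 100_000
samples = [random.gauss(1.0, 0.5) for _ in range(N)]

# Average the per-sample predictive means w * x.
pred_mean = sum(w_n * x for w_n in samples) / N
```

The estimate converges at rate O(1/sqrt(N)) regardless of dimension, which is why sampling remains the fallback when the deterministic approximations on this slide do not apply.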

20. Illustration: Bayesian Ranking
Ralf Herbrich, Tom Minka, Thore Graepel

21. Two Player Match Outcome Model
[Factor graph: player skills s1, s2 and match outcome y12]

22. Two Team Match Outcome Model
[Factor graph: player skills s1, s2, s3, s4, team performances t1, t2, and outcome y12]

23. Multiple Team Match Outcome Model
[Factor graph: player skills s1, s2, s3, s4, team performances t1, t2, t3, and outcomes y12, y23]

24. Efficient Approximate Inference
[Factor graph: skills s1, s2, s3, s4, teams t1, t2, t3, outcomes y12, y23]
Gaussian prior factors
Ranking likelihood factors
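A minimal sketch of the kind of update these Gaussian prior and ranking likelihood factors produce: a simplified two-player skill update in the spirit of TrueSkill, where each skill is a Gaussian N(mu, sigma²) and observing a win moment-matches the posterior against a truncated Gaussian. The beta value and priors below are illustrative assumptions; this is not the shipped TrueSkill implementation (which handles teams, draws, and full message schedules):

```python
# Simplified two-player Gaussian skill update (TrueSkill-style, no draws).
# v(t) and w(t) are the standard additive and multiplicative corrections
# for moment matching against a truncated Gaussian.
import math

def phi(t):   # standard normal pdf
    return math.exp(-0.5 * t * t) / math.sqrt(2.0 * math.pi)

def Phi(t):   # standard normal cdf
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def v(t):     # mean correction
    return phi(t) / Phi(t)

def w(t):     # variance correction
    return v(t) * (v(t) + t)

def update_win(mu_w, sig_w, mu_l, sig_l, beta=4.0):
    """Return updated (mu, sigma) for winner and loser after one match."""
    c = math.sqrt(2.0 * beta ** 2 + sig_w ** 2 + sig_l ** 2)
    t = (mu_w - mu_l) / c
    mu_w_new = mu_w + sig_w ** 2 / c * v(t)
    mu_l_new = mu_l - sig_l ** 2 / c * v(t)
    sig_w_new = sig_w * math.sqrt(1.0 - sig_w ** 2 / c ** 2 * w(t))
    sig_l_new = sig_l * math.sqrt(1.0 - sig_l ** 2 / c ** 2 * w(t))
    return mu_w_new, sig_w_new, mu_l_new, sig_l_new

# With equal priors, the winner's mean rises, the loser's falls,
# and both uncertainties shrink.
mu1, s1, mu2, s2 = update_win(25.0, 25 / 3, 25.0, 25 / 3)
```

Because the posterior after a win is non-Gaussian, moment matching (as in expectation propagation) projects it back to a Gaussian so the next match can reuse the same closed-form update.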

25. Convergence
[Plot: Level (0–40) vs. Number of Games (0–400) for players char and SQLWildman, comparing Elo against TrueSkill™]

26. TrueSkill™

27. John Winn, Chris Bishop

28. research.microsoft.com/infernet
Tom Minka, John Winn, John Guiver, Anitha Kannan

29. Summary
New paradigm for machine intelligence built on:
a Bayesian formulation
probabilistic graphical models
fast inference using local message-passing
Deep integration of domain knowledge and statistical learning
Large-scale application: TrueSkill™
Toolkit: Infer.NET

30. http://research.microsoft.com/~cmbishop