Lecture 10: Introduction to Probabilistic Graphical Models
Weike Pan and Congfu Xu (panweike, xucongfu)
Institute of Artificial Intelligence, College of Computer Science, Zhejiang University
October 12, 2006
Lecture slides for the Introduction to Artificial Intelligence course, College of Computer Science, Zhejiang University

References
An Introduction to Probabilistic Graphical Models. Michael I. Jordan.

Outline
- Preparations
- Probabilistic Graphical Models (PGM)
- Directed PGM
- Undirected PGM
- Insights of PGM

Preparations
- PGM "is" a universal model
- Different thoughts of machine learning
- Different training approaches
- Different data types
- Bayesian framework
- Chain rule of probability theory
- Conditional independence

Different thoughts of machine learning
Statistics (modeling uncertainty, detailed information) vs. logics (modeling complexity, high-level information).
Reference: Unifying Logical and Statistical AI. Pedro Domingos, University of Washington. AAAI 2006.
Speech example: statistical information (acoustic model + language model + affect model) combined with high-level information (expert knowledge/logics).

Different training approaches
Maximum likelihood training, including MAP (maximum a posteriori), vs. discriminative training, such as maximum margin (SVM).
Speech example: the classical combination is maximum likelihood plus discriminative training.

Different data types
- Directed acyclic graphs (Bayesian networks, BN) model asymmetric effects and dependencies, i.e. causal/temporal dependence (e.g. speech analysis, DNA sequence analysis).
- Undirected graphs (Markov random fields, MRF) model symmetric effects and dependencies, i.e. spatial dependence (e.g. image analysis).

PGM "is" a universal model
PGM can model both temporal and spatial data by unifying
- thoughts: statistics + logics;
- approaches: maximum likelihood training + discriminative training.
Furthermore, the directed and undirected models together provide modeling power beyond that which could be provided by either alone.

Bayesian framework
What we care about is the conditional probability, and it is a ratio of two marginal probabilities:

  p(class_i | observation) = p(observation | class_i) p(class_i) / p(observation)

that is, a posteriori probability = likelihood x a priori probability / normalization factor. The workflow runs from problem description to observation to conclusion (classification or prediction).
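The ratio above can be made concrete with a tiny numerical sketch. All priors and likelihoods below are made-up illustration values, not numbers from the lecture:

```python
# Hypothetical two-class example: posterior p(class | x) via Bayes' rule.
prior = {"c1": 0.6, "c2": 0.4}       # a priori probabilities p(class_i)
likelihood = {"c1": 0.2, "c2": 0.5}  # p(x | class_i) for one fixed observation x

# Normalization factor p(x) = sum_i p(x | class_i) p(class_i)
evidence = sum(likelihood[c] * prior[c] for c in prior)

# A posteriori probabilities; they sum to 1 by construction.
posterior = {c: likelihood[c] * prior[c] / evidence for c in prior}
print(posterior)  # {'c1': 0.375, 'c2': 0.625}
```

The normalization factor is what makes the two unnormalized products comparable as probabilities; the class decision itself only needs the numerators.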

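The chain rule listed under Preparations can be checked numerically. A minimal sketch, using an arbitrary toy joint distribution over three binary variables (the table values are invented for illustration):

```python
import itertools

# Toy joint distribution p(x, y, z) over binary variables; values sum to 1.
p = {
    (0, 0, 0): 0.10, (0, 0, 1): 0.05, (0, 1, 0): 0.15, (0, 1, 1): 0.10,
    (1, 0, 0): 0.05, (1, 0, 1): 0.20, (1, 1, 0): 0.05, (1, 1, 1): 0.30,
}

def marg(keep):
    """Marginalize p onto the given index set, e.g. keep=(0,) gives p(x)."""
    out = {}
    for xyz, v in p.items():
        k = tuple(xyz[i] for i in keep)
        out[k] = out.get(k, 0.0) + v
    return out

px, pxy = marg((0,)), marg((0, 1))

# Chain rule: p(x, y, z) = p(x) * p(y | x) * p(z | x, y)
for (x, y, z), v in p.items():
    chain = px[(x,)] * (pxy[(x, y)] / px[(x,)]) * (v / pxy[(x, y)])
    assert abs(chain - v) < 1e-12
print("chain rule verified on all 8 assignments")
```

The identity holds for any ordering of the variables; conditional independence assumptions are what let a graphical model drop conditioning variables from the factors.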
From an observation we draw a conclusion (classification or prediction) via Bayes' rule. The three probability-theory tools used throughout are:

  Bayes' rule: p(class_i | x) = p(x | class_i) p(class_i) / p(x)
  Chain rule: p(x_1, ..., x_n) = p(x_1) p(x_2 | x_1) ... p(x_n | x_1, ..., x_{n-1})
  Conditional independence: X and Z are conditionally independent given Y iff p(x, z | y) = p(x | y) p(z | y)

PGM
- Nodes represent random variables/states.
- The missing arcs represent conditional independence assumptions.
- The graph structure implies the factorization of the joint distribution.

Directed PGM (BN)
Aspects: representation, conditional independence, probability distribution, queries, implementation, interpretation.

Probability distribution
Definition of the joint probability distribution: the joint is the product of the local conditionals,

  p(x_1, ..., x_n) = prod_i p(x_i | parents(x_i)).

Check: this product is a properly normalized joint distribution.

Representation
Graphical models represent joint probability distributions more economically, using a set of "local" relationships among variables.

Conditional independence (basic)
Assert the conditional independence of a node from its ancestors, conditional on its parents. Interpret missing edges in terms of conditional independence.

Conditional independence (3 canonical graphs)
- Classical Markov chain (X -> Y -> Z): "past", "present", "future".
- Common cause (X <- Y -> Z): Y "explains" all the dependencies between X and Z.
- Common effect (X -> Y <- Z): marginal independence of X and Z, but multiple, competing explanations once the effect is observed.

Conditional independence (check)
- One incoming arrow and one outgoing arrow
- Two outgoing arrows
- Two incoming arrows
Check through reachability: the Bayes ball algorithm (rules).

Undirected PGM (MRF)
Aspects: representation, conditional independence, probability distribution, queries, implementation, interpretation.

Probability distribution (1)
- Clique: a clique of a graph is a fully connected subset of nodes. Local functions should not be defined on domains of nodes that extend beyond the boundaries of cliques.
- Maximal cliques: the maximal cliques of a graph are the cliques that cannot be extended to include additional nodes without losing the property of being fully connected. We restrict ourselves to maximal cliques without loss of generality, as they capture all possible dependencies.
- Potential function (local parameterization): psi_C(x_C), a potential function on the possible realizations x_C of the maximal clique C.

Probability distribution (2)
Example of the maximal cliques of a graph (figure omitted).

Probability distribution (3)
Joint probability distribution:

  p(x) = (1/Z) prod_C psi_C(x_C),  with normalization factor  Z = sum_x prod_C psi_C(x_C).

Writing psi_C(x_C) = exp(-E_C(x_C)) gives the Boltzmann distribution.

Conditional independence
It is a "reachability" problem in graph theory.

Representation (figure omitted).
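The clique-potential construction can be sketched end to end on a tiny example. The chain graph, the shared potential psi, and all numbers below are assumptions chosen for illustration:

```python
import itertools

# Minimal MRF sketch: a 3-node chain X1 - X2 - X3 over binary variables.
# Maximal cliques: {X1, X2} and {X2, X3}.
def psi(a, b):
    # One shared pairwise potential; favors equal neighbors (arbitrary values).
    return 2.0 if a == b else 1.0

states = [0, 1]

# Normalization factor Z: sum of the potential product over all configurations.
Z = sum(psi(x1, x2) * psi(x2, x3)
        for x1, x2, x3 in itertools.product(states, repeat=3))

def joint(x1, x2, x3):
    """p(x) = (1/Z) prod_C psi_C(x_C); with psi = exp(-E) this is the
    Boltzmann form."""
    return psi(x1, x2) * psi(x2, x3) / Z

total = sum(joint(*x) for x in itertools.product(states, repeat=3))
print(Z, total)  # 18.0 1.0
```

Brute-force summation for Z is exponential in the number of nodes; it is only viable on toy graphs, which is exactly why inference algorithms on the graph structure matter.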

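Returning to the three canonical graphs of the directed case, the "common effect" structure can be verified by brute force. The OR-gate conditional below is a hypothetical choice made only for clarity:

```python
import itertools

# "Common effect" canonical graph X -> Y <- Z, with hypothetical CPTs:
# X and Z are independent fair coins, and Y = X OR Z (deterministic).
def p_joint(x, y, z):
    p_y_given_xz = 1.0 if y == (x | z) else 0.0
    return 0.5 * 0.5 * p_y_given_xz

# Marginal independence: p(x, z) = p(x) p(z) = 0.25 for every pair.
for x, z in itertools.product([0, 1], repeat=2):
    p_xz = sum(p_joint(x, y, z) for y in [0, 1])
    assert abs(p_xz - 0.25) < 1e-12

# Conditioning on the common effect couples X and Z ("explaining away").
p_y1 = sum(p_joint(x, 1, z) for x, z in itertools.product([0, 1], repeat=2))
p_x0_given_y1 = sum(p_joint(0, 1, z) for z in [0, 1]) / p_y1
p_z0_given_y1 = sum(p_joint(x, 1, 0) for x in [0, 1]) / p_y1
p_x0z0_given_y1 = p_joint(0, 1, 0) / p_y1
print(p_x0z0_given_y1, p_x0_given_y1 * p_z0_given_y1)  # 0.0 vs about 0.111
```

This is the one canonical graph where observing the middle node creates dependence rather than removing it, which is why the Bayes ball rules treat two incoming arrows specially.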
Insights of PGM (Michael I. Jordan)
"Probabilistic graphical models are a marriage between probability theory and graph theory. A graphical model can be thought of as a probabilistic database, a machine that can answer 'queries' regarding the values of sets of random variables. We build up the database in pieces, using probability theory to ensure that the pieces have a consistent overall interpretation. Probability theory also justifies the inferential machinery that allows the pieces to be put together 'on the fly' to answer the queries."
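The "probabilistic database" view can be made concrete: a query against a toy Bayesian network, answered by enumeration (summing out the unobserved variable). The three-node chain and its CPTs below are invented for illustration:

```python
import itertools

# Toy chain X1 -> X2 -> X3 with hypothetical CPTs (binary variables).
p_x1 = {0: 0.7, 1: 0.3}
p_x2_given_x1 = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}  # [x1][x2]
p_x3_given_x2 = {0: {0: 0.6, 1: 0.4}, 1: {0: 0.3, 1: 0.7}}  # [x2][x3]

def joint(x1, x2, x3):
    # Directed factorization: product of the local conditionals.
    return p_x1[x1] * p_x2_given_x1[x1][x2] * p_x3_given_x2[x2][x3]

# Query: p(X3 = 1 | X1 = 1), answered by summing out X2.
num = sum(joint(1, x2, 1) for x2 in [0, 1])
den = sum(joint(1, x2, x3) for x2, x3 in itertools.product([0, 1], repeat=2))
print(num / den)  # 0.64
```

Each CPT is one "piece" of the database; enumeration is the naive inferential machinery, and algorithms such as variable elimination exploit the graph structure to do the same sums more cheaply.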
