
Bayesian Networks: An Introduction

by Timo Koski; John M. Noble
  • ISBN13: 9780470743041
  • ISBN10: 0470743042

  • Edition: 1st
  • Format: Hardcover
  • Copyright: 2009-11-02
  • Publisher: Wiley
  • Purchase Benefits
  • Free Shipping On Orders Over $35!
    Your order must be $35 or more to qualify for free economy shipping. Bulk sales, POs, Marketplace items, eBooks and apparel do not qualify for this offer.
  • Get Rewarded for Ordering Your Textbooks! Enroll Now
List Price: $131.14 (save up to $0.66)
  • Buy New: $130.48 (Free Shipping)

Print on demand: 2-4 weeks. This item cannot be cancelled or returned.

Summary

A self-contained introduction to the theory and applications of Bayesian networks. Bayesian networks are a topic of interest and importance for statisticians, computer scientists and those involved in modelling and learning from complex data sets. The material in this introductory guide has been extensively tested in classroom teaching and assumes only a basic knowledge of probability, statistics and mathematics. All notions are explained carefully, with an extensive set of exercises, including computer exercises, throughout the book. A solutions manual is also provided online.

Author Biography

Timo Koski, Institutionen för Matematik, Kungliga Tekniska Högskolan, Stockholm, Sweden.
John M. Noble, Matematiska Institutionen, Linköpings Tekniska Högskola, Linköpings universitet, Linköping, Sweden.

Table of Contents

Preface, p. ix
Graphical models and probabilistic reasoning, p. 1
Introduction, p. 1
Axioms of probability and basic notations, p. 4
The Bayes update of probability, p. 9
Inductive learning, p. 11
Bayes' rule, p. 12
Jeffrey's rule, p. 13
Pearl's method of virtual evidence, p. 13
Interpretations of probability and Bayesian networks, p. 14
Learning as inference about parameters, p. 15
Bayesian statistical inference, p. 17
Tossing a thumb-tack, p. 20
Multinomial sampling and the Dirichlet integral, p. 24
Notes, p. 28
Exercises: Probabilistic theories of causality, Bayes' rule, multinomial sampling and the Dirichlet density, p. 31
Conditional independence, graphs and d-separation, p. 37
Joint probabilities, p. 37
Conditional independence, p. 38
Directed acyclic graphs and d-separation, p. 41
Graphs, p. 41
Directed acyclic graphs and probability distributions, p. 45
The Bayes ball, p. 50
Illustrations, p. 51
Potentials, p. 53
Bayesian networks, p. 58
Object oriented Bayesian networks, p. 63
d-Separation and conditional independence, p. 66
Markov models and Bayesian networks, p. 67
I-maps and Markov equivalence, p. 69
The trek and a distribution without a faithful graph, p. 72
Notes, p. 73
Exercises: Conditional independence and d-separation, p. 75
Evidence, sufficiency and Monte Carlo methods, p. 81
Hard evidence, p. 82
Soft evidence and virtual evidence, p. 85
Jeffrey's rule, p. 86
Pearl's method of virtual evidence, p. 87
Queries in probabilistic inference, p. 88
The chest clinic problem, p. 89
Bucket elimination, p. 89
Bayesian sufficient statistics and prediction sufficiency, p. 92
Bayesian sufficient statistics, p. 92
Prediction sufficiency, p. 92
Prediction sufficiency for a Bayesian network, p. 95
Time variables, p. 98
A brief introduction to Markov chain Monte Carlo methods, p. 100
Simulating a Markov chain, p. 103
Irreducibility, aperiodicity and time reversibility, p. 104
The Metropolis-Hastings algorithm, p. 108
The one-dimensional discrete Metropolis algorithm, p. 111
Notes, p. 112
Exercises: Evidence, sufficiency and Monte Carlo methods, p. 113
Decomposable graphs and chain graphs, p. 123
Definitions and notations, p. 124
Decomposable graphs and triangulation of graphs, p. 127
Junction trees, p. 131
Markov equivalence, p. 133
Markov equivalence, the essential graph and chain graphs, p. 138
Notes, p. 144
Exercises: Decomposable graphs and chain graphs, p. 145
Learning the conditional probability potentials, p. 149
Initial illustration: maximum likelihood estimate for a fork connection, p. 149
The maximum likelihood estimator for multinomial sampling, p. 151
MLE for the parameters in a DAG: the general setting, p. 155
Updating, missing data, fractional updating, p. 160
Notes, p. 161
Exercises: Learning the conditional probability potentials, p. 162
Learning the graph structure, p. 167
Assigning a probability distribution to the graph structure, p. 168
Markov equivalence and consistency, p. 171
Establishing the DAG isomorphic property, p. 173
Reducing the size of the search, p. 176
The Chow-Liu tree, p. 177
The Chow-Liu tree: A predictive approach, p. 179
The K2 structural learning algorithm, p. 183
The MMHC algorithm, p. 184
Monte Carlo methods for locating the graph structure, p. 186
Women in mathematics, p. 189
Notes, p. 191
Exercises: Learning the graph structure, p. 192
Parameters and sensitivity, p. 197
Changing parameters in a network, p. 198
Measures of divergence between probability distributions, p. 201
The Chan-Darwiche distance measure, p. 202
Comparison with the Kullback-Leibler divergence and Euclidean distance, p. 209
Global bounds for queries, p. 210
Applications to updating, p. 212
Parameter changes to satisfy query constraints, p. 216
Binary variables, p. 218
The sensitivity of queries to parameter changes, p. 220
Notes, p. 224
Exercises: Parameters and sensitivity, p. 225
Graphical models and exponential families, p. 229
Introduction to exponential families, p. 229
Standard examples of exponential families, p. 231
Graphical models and exponential families, p. 233
Noisy 'or' as an exponential family, p. 234
Properties of the log partition function, p. 237
Fenchel-Legendre conjugate, p. 239
Kullback-Leibler divergence, p. 241
Mean field theory, p. 243
Conditional Gaussian distributions, p. 246
CG potentials, p. 249
Some results on marginalization, p. 249
CG regression, p. 250
Notes, p. 251
Exercises: Graphical models and exponential families, p. 252
Causality and intervention calculus, p. 255
Introduction, p. 255
Conditioning by observation and by intervention, p. 257
The intervention calculus for a Bayesian network, p. 258
Establishing the model via a controlled experiment, p. 262
Properties of intervention calculus, p. 262
Transformations of probability, p. 265
A note on the order of 'see' and 'do' conditioning, p. 267
The 'Sure Thing' principle, p. 268
Back door criterion, confounding and identifiability, p. 270
Notes, p. 273
Exercises: Causality and intervention calculus, p. 275
The junction tree and probability updating, p. 279
Probability updating using a junction tree, p. 279
Potentials and the distributive law, p. 280
Marginalization and the distributive law, p. 283
Elimination and domain graphs, p. 284
Factorization along an undirected graph, p. 288
Factorizing along a junction tree, p. 290
Flow of messages: initial illustration, p. 292
Local computation on junction trees, p. 294
Schedules, p. 296
Local and global consistency, p. 302
Message passing for conditional Gaussian distributions, p. 305
Using a junction tree with virtual evidence and soft evidence, p. 311
Notes, p. 313
Exercises: The junction tree and probability updating, p. 314
Factor graphs and the sum product algorithm, p. 319
Factorization and local potentials, p. 319
Examples of factor graphs, p. 320
The sum product algorithm, p. 323
Detailed illustration of the algorithm, p. 329
Notes, p. 332
Exercise: Factor graphs and the sum product algorithm, p. 333
References, p. 335
Index, p. 343
Table of Contents provided by Ingram. All Rights Reserved.

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.

Rewards Program