
Learning Probabilistic Graphical Models in R

Machine learning is the study of algorithms that can learn and adapt from data and observations, reason, and perform tasks using learned models. Because the world we live in is inherently uncertain, in the sense that even the simplest observation, such as the color of the sky, is impossible to determine absolutely, we need a theory that can encompass this uncertainty.

Chapter 1: Probabilistic Reasoning
Machine learning
Representing uncertainty with probabilities
Beliefs and uncertainty as probabilities
Conditional probability
Probability calculus and random variables
Sample space, events, and probability
Random variables and probability calculus
Joint probability distributions
Bayes’ rule
Interpreting Bayes’ formula
A first example of Bayes’ rule
A first example of Bayes’ rule in R
Probabilistic graphical models
Probabilistic models
Graphs and conditional independence
Factorizing a distribution
Directed models
Undirected models
Examples and applications
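Chapter 1 closes with Bayes’ rule in R. As a taste of that material, here is a minimal sketch of Bayes’ rule applied to a diagnostic-test scenario; the prior, sensitivity, and false-positive rate below are illustrative numbers, not figures from the book.

```r
# Hypothetical test: P(disease) = 0.01, P(pos | disease) = 0.95,
# P(pos | no disease) = 0.05 (illustrative values).
prior <- 0.01
sens <- 0.95
false_pos <- 0.05

# Bayes' rule: P(disease | pos) = P(pos | disease) P(disease) / P(pos),
# where P(pos) is obtained by summing over both disease states.
evidence <- sens * prior + false_pos * (1 - prior)
posterior <- sens * prior / evidence
posterior  # about 0.161: a positive test still leaves the disease unlikely
```

The small posterior despite a sensitive test is the classic base-rate effect that makes Bayes’ rule worth internalizing before building graphical models.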
Chapter 2: Exact Inference
Building graphical models
Types of random variable
Building graphs
Probabilistic expert system
Basic structures in probabilistic graphical models
Variable elimination
Sum-product and belief updates
The junction tree algorithm
Examples of probabilistic graphical models
The sprinkler example
The medical expert system
Models with more than two layers
Tree structure
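Chapter 2 covers exact inference, illustrated on the sprinkler example. The following is a minimal sketch of exact inference by enumeration on a sprinkler-style network (Rain influences Sprinkler, and both influence GrassWet); the conditional probability values are illustrative, not the ones used in the book.

```r
# Illustrative CPTs: P(R), P(S = TRUE | R), P(W = TRUE | S, R).
p_rain <- 0.2
p_s_given_r <- c(rain = 0.01, no_rain = 0.4)
p_wet <- function(s, r) {
  if (s && r) 0.99 else if (s) 0.9 else if (r) 0.8 else 0.0
}

# Enumerate all joint configurations to compute P(Rain | GrassWet = TRUE).
num <- 0; den <- 0
for (r in c(TRUE, FALSE)) {
  for (s in c(TRUE, FALSE)) {
    ps <- if (r) p_s_given_r[["rain"]] else p_s_given_r[["no_rain"]]
    joint <- (if (r) p_rain else 1 - p_rain) *
             (if (s) ps else 1 - ps) *
             p_wet(s, r)
    den <- den + joint
    if (r) num <- num + joint
  }
}
num / den  # P(Rain | GrassWet), roughly 0.36 with these numbers
```

Enumeration is exponential in the number of variables; variable elimination and the junction tree algorithm, covered in this chapter, exploit the graph structure to avoid that blow-up.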
Chapter 3: Learning Parameters
Learning by inference
Maximum likelihood
How are the empirical and model distributions related?
The ML algorithm and its implementation in R
Learning with hidden variables – the EM algorithm
Latent variables
Principles of the EM algorithm
Derivation of the EM algorithm
Applying EM to graphical models
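Chapter 3 introduces maximum likelihood estimation and its implementation in R. As a minimal sketch (with made-up coin-flip data, not an example from the book), the ML estimate of a Bernoulli parameter is the sample mean, and the same value falls out of maximizing the log-likelihood numerically:

```r
# Illustrative data: 7 successes in 10 Bernoulli trials.
x <- c(1, 0, 1, 1, 0, 1, 1, 1, 0, 1)

# Log-likelihood of parameter p under the Bernoulli model.
loglik <- function(p) sum(x * log(p) + (1 - x) * log(1 - p))

# Closed form: the ML estimate is the sample mean.
p_hat <- mean(x)  # 0.7

# Numerical maximization recovers the same value.
fit <- optimize(loglik, interval = c(0.001, 0.999), maximum = TRUE)
fit$maximum  # close to 0.7
```

When some variables are hidden, this closed form is no longer available, which is what motivates the EM algorithm covered later in the chapter.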
Chapter 4: Bayesian Modeling – Basic Models
The Naive Bayes model
Learning the Naive Bayes model
Bayesian Naive Bayes
The prior distribution
The posterior distribution with the conjugacy property
Which values should we choose for the Beta parameters?
The Gaussian mixture model
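Chapter 4 uses the conjugacy of the Beta prior with the Bernoulli likelihood. A minimal sketch of that property (with illustrative counts, not the book’s numbers): starting from a Beta(a, b) prior and observing k successes in n trials, the posterior is again a Beta distribution, obtained by simply adding the observed counts to the prior pseudo-counts.

```r
# Beta(a, b) prior on a Bernoulli parameter; illustrative values.
a <- 2; b <- 2       # prior pseudo-counts (2 successes, 2 failures)
k <- 7; n <- 10      # observed: 7 successes in 10 trials

# Conjugate update: posterior is Beta(a + k, b + n - k).
post_a <- a + k          # 9
post_b <- b + n - k      # 5
post_mean <- post_a / (post_a + post_b)  # 9/14, about 0.643
```

The choice of a and b, discussed in this chapter, controls how strongly the prior pulls the posterior mean away from the raw frequency 7/10.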
Chapter 5: Approximate Inference
Sampling from a distribution
Basic sampling algorithms
Standard distributions
Rejection sampling
An implementation in R
Importance sampling
An implementation in R
Markov Chain Monte Carlo
General idea of the method
The Metropolis-Hastings algorithm
MCMC for probabilistic graphical models in R
Installing Stan and RStan
A simple example in RStan
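Chapter 5 implements rejection sampling in R. As a minimal sketch of the idea (target and proposal chosen here for illustration, not taken from the book): draw from a Beta(2, 2) target using a Uniform(0, 1) proposal, accepting each draw with probability equal to the density ratio divided by a bound M.

```r
set.seed(42)

# Target density: Beta(2, 2), whose maximum is 1.5 at x = 0.5,
# so M = 1.5 bounds the ratio target(x) / uniform-proposal density.
target <- function(x) dbeta(x, 2, 2)
M <- 1.5

rejection_sample <- function(n) {
  samples <- numeric(0)
  while (length(samples) < n) {
    x <- runif(1)                 # draw from the proposal
    u <- runif(1)                 # uniform acceptance threshold
    if (u < target(x) / M) {      # accept with prob target(x) / (M * 1)
      samples <- c(samples, x)
    }
  }
  samples
}

s <- rejection_sample(5000)
mean(s)  # close to 0.5, the mean of Beta(2, 2)
```

The acceptance rate here is 1/M = 2/3; when M must be large, rejection sampling wastes most draws, which is one motivation for importance sampling and MCMC in this chapter.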
Chapter 6: Bayesian Modeling – Linear Models
Linear regression
Estimating the parameters
Bayesian linear models
Over-fitting a model
Graphical model of a linear model
Posterior distribution
Implementation in R
A stable implementation
More packages in R
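Chapter 6 estimates the parameters of a linear model in R. A minimal sketch on simulated data (not the book’s example): the closed-form least-squares solution (XᵀX)⁻¹Xᵀy agrees with the coefficients that R’s built-in `lm` returns.

```r
set.seed(1)

# Simulated data: y = 2 + 3 x + noise (illustrative true parameters).
n <- 100
x <- runif(n, 0, 10)
y <- 2 + 3 * x + rnorm(n, sd = 0.5)

# Closed-form least squares via the normal equations.
X <- cbind(1, x)                           # design matrix with intercept
beta_hat <- solve(t(X) %*% X, t(X) %*% y)
beta_hat                                   # close to (2, 3)

# R's lm() gives the same estimates.
fit <- lm(y ~ x)
coef(fit)
```

Solving the normal equations directly can be numerically unstable for ill-conditioned designs, which is why the chapter also discusses a stable implementation.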
Chapter 7: Probabilistic Mixture Models
Mixture models
EM for mixture models
Mixture of Bernoulli
Mixture of experts
Latent Dirichlet Allocation
The LDA model
Variational inference
