# A Recipe of Probabilistic Graphical Models and Neural Networks

**Karthik Ragunath A (~karthik_ragunath)**

**Description:**

A Probabilistic Graphical Model (PGM) is a probabilistic model in which a graph expresses the conditional dependence structure between random variables. Combining Probabilistic Graphical Models with Neural Networks opens up a whole new paradigm of abstraction (classification) and generative capabilities in the Deep Learning domain.

State-of-the-art neural networks used for image restoration and contextual scene understanding build on concepts from Probabilistic Graphical Models.
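To make the opening idea concrete, here is a tiny toy example of a directed PGM (a Bayesian Network) over three binary variables; the chain A → B → C and all probability values are illustrative, not material from the workshop itself:

```python
# Toy Bayesian Network A -> B -> C over binary variables.
# The directed graph encodes the factorisation
#   P(A, B, C) = P(A) * P(B | A) * P(C | B)

p_a = {0: 0.4, 1: 0.6}
p_b_given_a = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}
p_c_given_b = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}

def joint(a, b, c):
    """Joint probability read off the graph's factorisation."""
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

# The factors multiply out to a valid joint distribution:
total = sum(joint(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
# total == 1.0
```

The point of the graph is exactly this factorisation: instead of a full 2^3-entry joint table, we only store the small conditional tables the edges imply.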

**What can the audience expect to take away from this workshop?**

At the end of this workshop, the audience will have a good conceptual understanding of Probabilistic Graphical Models and how they can be combined with Neural Networks to train abstraction models (for image classification) and generative models (for denoising images). Further, we will also delve into how the training of RBM-based Neural Networks can be optimized using the k-step Contrastive Divergence (CD-k) technique.

**An outline of what this workshop is about:**

- Representing relationships between random variables as directed graphs (Bayesian Networks).
- Representing relationships between random variables as undirected graphs (Markov Networks).
- Introduction to latent (hidden) variables.
- Introduction to Restricted Boltzmann Machines (RBMs).
- Representing an RBM as a Neural Network, with Maximum Likelihood Estimation (MLE) of the log-likelihood used as the objective for unsupervised learning.
- Introduction to Markov Chains (how a Neural Network's transition matrix can be modelled using Markov Chains).
- Optimizing RBM training with a sampling technique based on a Markov Chain Monte Carlo (MCMC) algorithm (Gibbs Sampling).
- Training RBM Neural Networks with Gibbs Sampling (applying batching techniques) for both an abstraction task (image classification) and a generation task (denoising of images).
- Using the k-step Contrastive Divergence (CD-k) technique to further speed up the training of RBM Neural Networks.
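As a preview of the last few bullets, here is a minimal NumPy sketch of a binary RBM trained with CD-k, where each step alternates block Gibbs sampling between the visible and hidden layers. All names, sizes, and hyperparameters are illustrative choices for this sketch, not the workshop's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Minimal binary-binary Restricted Boltzmann Machine."""

    def __init__(self, n_visible, n_hidden):
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b = np.zeros(n_visible)  # visible biases
        self.c = np.zeros(n_hidden)   # hidden biases

    def sample_h(self, v):
        """P(h=1 | v) and a Bernoulli sample of the hidden units."""
        p = sigmoid(v @ self.W + self.c)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        """P(v=1 | h) and a Bernoulli sample of the visible units."""
        p = sigmoid(h @ self.W.T + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd_k(self, v0, k=1, lr=0.1):
        """One CD-k gradient step on a batch of visible vectors v0."""
        ph0, h = self.sample_h(v0)     # positive phase
        vk = v0
        for _ in range(k):             # k steps of block Gibbs sampling
            _, vk = self.sample_v(h)
            phk, h = self.sample_h(vk)  # negative phase
        n = v0.shape[0]
        # (positive-phase statistics) - (negative-phase statistics)
        self.W += lr * (v0.T @ ph0 - vk.T @ phk) / n
        self.b += lr * (v0 - vk).mean(axis=0)
        self.c += lr * (ph0 - phk).mean(axis=0)
        return np.mean((v0 - vk) ** 2)  # reconstruction error

rbm = RBM(n_visible=6, n_hidden=4)
batch = (rng.random((8, 6)) < 0.5).astype(float)
err = rbm.cd_k(batch, k=2)
```

CD-k quickens training precisely because it truncates the Gibbs chain after k steps instead of running the MCMC sampler to convergence; the workshop covers the PyTorch, batched version of this idea.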

Happy Coding!

**Prerequisites:**

**Mathematical prerequisites:**

- Knowledge of Probability theory.
- Basic knowledge of Random Variables.
- Basic knowledge of differentiation.

**Programming prerequisites:**

- Python

**Recommended prerequisites:**

- Basic knowledge of PyTorch Framework.

**Content URLs:**

A gist of what this workshop is about

Click to learn about the Bayesian Classifier, which is a prerequisite (Jupyter notebook)

Click to learn about Hidden Markov Models and their implementation (Jupyter notebook kernel)

**Speaker Info:**

I am a Machine Learning Engineer at Mad Street Den, building algorithms to solve complex problems.

I have an avid interest in Mathematics and Computer Science. I am also pretty active in competitive programming platforms like CodeChef.

**Speaker Links:**

My LinkedIn Profile - https://www.linkedin.com/in/karthik-ragunath-a-706a37105/

My CodeChef Profile - https://www.codechef.com/users/karthik6995

My Github Profile - https://github.com/Karthik-Ragunath