From scratch to ML - the machine learning library you really understand, and explaining its predictions with LIME.

Mohit Rathore (~markroxor)


146 Votes

Description:

The aim of this workshop is to give hands-on coding experience writing machine learning / deep learning algorithms from scratch, without external frameworks, while visualising the model and explaining its predictions using LIME.

from-scratch-to-ml

The primary goals of this library are:
- To serve as an educational tool for learning deep learning.
- To bridge the gap between the theoretical and coding aspects of machine learning algorithms.
- To write intuitive blogs as Python notebooks, juxtaposing theory and code and explaining the fundamentals of each algorithm from the very basics.
- To minimise external dependencies beyond the fundamental ones like numpy and matplotlib.
- To make sure that the developed algorithms are coherent with existing machine learning frameworks.

The library is still at a nascent stage, but given the high commit frequency it will take shape in a couple of months. The audience is requested to be patient.

LIME (Local Interpretable Model-Agnostic Explanations) -

When you write a machine learning algorithm from scratch, you want to make sure that your results are coherent and that your model is learning the features it is meant to learn. LIME explains why your model behaved the way it did.

KDD promo video
I will quote excerpts from their blog below -

Imagine we want to explain a classifier that predicts how likely it is for the image to contain a tree frog. We take the image on the left and divide it into interpretable components (contiguous superpixels).

As illustrated below, we then generate a data set of perturbed instances by turning some of the interpretable components “off” (in this case, making them gray). For each perturbed instance, we get the probability that a tree frog is in the image according to the model. We then learn a simple (linear) model on this data set, which is locally weighted—that is, we care more about making mistakes in perturbed instances that are more similar to the original image. In the end, we present the superpixels with highest positive weights as an explanation, graying out everything else.
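The steps quoted above (perturb, weight by similarity, fit a locally weighted linear model) can be sketched in a few lines of numpy. This is a hypothetical toy, not the actual LIME implementation: the black-box classifier and the number of superpixels are made-up stand-ins, and the closed-form weighted least squares replaces LIME's regularised regression.

```python
import numpy as np

rng = np.random.default_rng(0)
n_superpixels = 10          # interpretable components of the image

def black_box_prob(mask):
    # Hypothetical classifier: the "tree frog" probability rises
    # when superpixels 2 and 5 are switched on.
    return 1 / (1 + np.exp(-(3 * mask[2] + 3 * mask[5] - 2)))

# 1. Perturb: random on/off masks over the superpixels.
masks = rng.integers(0, 2, size=(200, n_superpixels))
probs = np.array([black_box_prob(m) for m in masks])

# 2. Locally weight: instances closer to the original (all-ones) image count more.
distances = np.sqrt(((masks - 1) ** 2).sum(axis=1))
weights = np.exp(-(distances ** 2) / 4)

# 3. Fit a simple linear model by closed-form weighted least squares.
X = np.hstack([masks, np.ones((len(masks), 1))])   # add an intercept column
W = np.diag(weights)
coef = np.linalg.solve(X.T @ W @ X, X.T @ W @ probs)

# 4. Explanation: the superpixels with the highest positive weights.
top = np.argsort(coef[:-1])[::-1][:2]
print(sorted(top.tolist()))   # → [2, 5]
```

The explanation recovers exactly the two superpixels the toy model actually depends on, which is the sanity check LIME gives you for a model written from scratch.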

Even from a human's perspective, these explanations make sense.

SOURCE

TIME-LINE

Introduction to deep neural networks (DNN), 15 minutes -

  • What is a DNN really? Why is it called "deep"? Is there such a thing as a shallow neural network?
  • DNNs were first introduced as far back as 1959. How and why did they gain momentum in the 2000s?

Neural Networks (NN), 45 minutes -

  • Explain architecture and write code from scratch.
  • Writing the corresponding optimizers (SGD - batch/online) with loss functions (CE/MSE).
  • Brief discussion on which optimizers suit NNs and how to tune the hyperparameters.*
  • Explaining our NN predictions on MNIST dataset.
    Eg.
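As a taste of what this session builds, here is a minimal sketch (not the workshop's actual code; see the fromscratchtoml repository for that) of a two-layer network with sigmoid activations, MSE loss, and full-batch SGD, trained on XOR. The layer sizes and learning rate are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# XOR: the classic problem a single linear layer cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)
lr = 0.5

for epoch in range(10000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: gradient of MSE, chain rule through both layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # SGD update (full batch here; "online" SGD would use one sample at a time)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel()))   # should recover XOR after training
```

Swapping the loss to cross-entropy only changes the `d_out` line, which is one of the comparisons the session walks through.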

Recurrent Neural Networks (RNN), 45 minutes -

  • Explaining why we need RNNs at all. Why not a plain NN?
  • Explain the architecture and write code from scratch; the difference between the RNN and NN architectures.
  • Explaining RNN predictions on the text8 / 20 newsgroups dataset with LIME.
    Eg.
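The "why RNN at all" point can be made concrete with a minimal forward pass: a vanilla RNN cell feeds its hidden state back in at every step, so the prediction depends on the whole history, not just the last input. The vocabulary and hidden sizes below are arbitrary illustrative choices, not the workshop's code.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, hidden = 5, 4

Wxh = rng.normal(0, 0.1, (vocab, hidden))   # input  -> hidden
Whh = rng.normal(0, 0.1, (hidden, hidden))  # hidden -> hidden (the recurrence)
Why = rng.normal(0, 0.1, (hidden, vocab))   # hidden -> output

def rnn_forward(token_ids):
    h = np.zeros(hidden)
    for t in token_ids:
        x = np.eye(vocab)[t]                      # one-hot input
        h = np.tanh(x @ Wxh + h @ Whh)            # state update carries history
    logits = h @ Why
    return np.exp(logits) / np.exp(logits).sum()  # softmax over the next token

# Same final token, different histories -> different predictions,
# which a feed-forward NN seeing only the last token could never produce.
p1 = rnn_forward([0, 1, 2])
p2 = rnn_forward([3, 4, 2])
print(np.allclose(p1, p2))   # False
```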

Convolutional Neural Networks (CNN), 45 minutes -

  • Why do we need them for images? Why not use a plain NN?
  • What convolution is, and certain other mathematical prerequisites.
  • Explain architecture and write code from scratch.
  • Explaining predictions on real world images using LIME.
    Eg.
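The convolution prerequisite can be sketched as follows: slide a kernel over the image and take dot products (strictly, this is cross-correlation, the operation most deep learning frameworks implement under the name "convolution"). A 3x3 vertical-edge kernel on a toy image is the illustration here, not the workshop's code.

```python
import numpy as np

def conv2d(image, kernel):
    # Valid-mode 2-D cross-correlation: no padding, stride 1.
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical edge: zeros on the left, ones on the right.
image = np.zeros((5, 5))
image[:, 3:] = 1.0

# Sobel-like vertical edge detector.
kernel = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]], dtype=float)

response = conv2d(image, kernel)
print(response)   # strongest where the edge sits, zero elsewhere
```

This kind of localised feature map is exactly what a CNN's first layer learns on its own, which is why convolutions suit images where a fully connected NN would waste parameters.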

*Will be covered only if time permits.
Keeping in mind that the workshop will be 3 hours long, I have kept a buffer of half an hour in case things don't go as expected.

Prerequisites:

  • Basic Python libraries like numpy and matplotlib.
  • High-school-level understanding of mathematics and calculus.
  • And, of course, Python.

Content URLs:

The work in progress repository of all the associated code - fromscratchtoml.
The official website of fromscratchtoml.
The work in progress python notebooks.
The author's github profile.
Sample slides will be uploaded here.

Speaker Info:

I graduated from IIT (ISM) Dhanbad in 2017.

Formerly I worked for a London-based startup, ALIS labs. Currently I am a research fellow at CVIT Lab, IIIT Hyderabad, alongside being the author of fromscratchtoml.

I am also a member of RaRe's incubator program, the same organization which looks after the reputed topic modelling library gensim.

I have given prep talks and mentored dev sprints on the same topic at the Hyderabad Python Meetup group twice.

Speaker Links:

The author's open source contributions can be seen on his github profile, where it all started.
The author's current blog, where he discusses a 'bit' about the impact of AI.
The author's old blog archive, where he talked about random developer stuff.
The author's other, delusional repository, which he has trouble explaining to people.
The author sometimes also blogs for RaRe Technologies.
The author is omnipresent on the web under the handle markroxor.

Section: Data science
Type: Workshops
Target Audience: Intermediate
Last Updated:

I would personally love to attend this, and so would others, Machine Learning literally being the ultimate buzzword of 2018.

Vipul Gupta (~vipulgupta2048)

I am definitely going!

Adrish Chatterjee (~adrish)

Interesting.

Anand B Pillai (~pythonhacker)
