Tips, Tricks and Common Pitfalls in training Neural Networks

nickil21


Description:

When you break or misconfigure code you will often get some kind of an exception. You plugged in an integer where something expected a string. The function only expected 3 arguments. This import failed. That key does not exist. The number of elements in the two lists isn’t equal. In addition, it’s often possible to create unit tests for a certain functionality.

This is just a start when it comes to training neural nets. Everything could be correct syntactically, but the whole thing isn’t arranged properly, and it’s really hard to tell. The "possible error surface" is large, logical (as opposed to syntactic), and very tricky to unit test.

For example, perhaps you forgot to flip your labels when you left-right flipped the image during data augmentation. Your net can still (shockingly) work pretty well because your network can internally learn to detect flipped images and then it left-right flips its predictions. Or maybe your autoregressive model accidentally takes the thing it’s trying to predict as an input due to an off-by-one bug. Or you tried to clip your gradients but instead clipped the loss, causing the outlier examples to be ignored during training. Or you initialized your weights from a pretrained checkpoint but didn’t use the original mean. Or you just screwed up the settings for regularization strengths, learning rate, its decay rate, model size, etc.
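The first bug above — flipping images but not their labels — is easy to sketch. Below is a minimal, hypothetical NumPy example for a task with x-coordinate keypoint labels; the function name and signature are illustrative, not from any particular library:

```python
import numpy as np

def flip_augment(image, keypoints_x, width):
    """Left-right flip an image AND its x-coordinate labels.
    Forgetting the second line is exactly the kind of silent bug
    described above: training still 'works', just a bit worse."""
    flipped = image[:, ::-1]             # mirror the columns
    flipped_x = width - 1 - keypoints_x  # mirror the labels too
    return flipped, flipped_x

img = np.arange(4).reshape(1, 4)         # pixels [0, 1, 2, 3]
new_img, new_x = flip_augment(img, np.array([0]), width=4)
```

A keypoint at column 0 lands at column 3 after the flip; the pixel value under it is unchanged, which is a quick sanity check worth writing as a test.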

Therefore, your misconfigured neural net will throw exceptions only if you're lucky; most of the time it will train but silently work a bit worse.

Outline

  • Data augmentation in images and text [4-5 Mins]
  • How to apply different pre-processing techniques [4-5 Mins]
  • How to perform weight initialization [4-5 Mins]
  • Understand how regularization can prevent overfitting [4-5 Mins]
  • How to tune parameters smarter and quicker [4-5 Mins]
  • Q&A [4-5 Mins]
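As a taste of the weight-initialization topic in the outline, here is a minimal NumPy sketch of He initialization (scaling by sqrt(2 / fan_in) to keep activation variance roughly constant through ReLU layers); this is purely illustrative, as real frameworks ship these initializers built in:

```python
import numpy as np

def he_init(fan_in, fan_out, seed=0):
    """He initialization: draw weights from N(0, 2 / fan_in) so that
    the variance of activations is roughly preserved across ReLU layers.
    Illustrative sketch only; the function name is ours, not a library's."""
    rng = np.random.default_rng(seed)
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

W = he_init(512, 256)   # weights for a 512 -> 256 layer
```

With fan_in = 512 the target standard deviation is sqrt(2/512) = 0.0625, and the sample standard deviation of the drawn matrix should sit very close to it.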

Outcome

This talk should equip you with all the major ingredients for successfully training neural networks.

Prerequisites:

  • Basic familiarity with deep learning concepts (nice to have, but not essential).
  • A curious mindset.

Speaker Info:

I have over 3 years of industry experience in Data Science. I currently work as a data scientist (NLP) at niki.ai, where I use Python extensively for my day-to-day ML tasks. I've also participated in numerous data science competitions on Kaggle, AnalyticsVidhya, Topcoder, Crowdanalytix, etc., and finished in the top 10 in at least a dozen of them.

Specialties: data science, machine learning, predictive modelling, natural language processing, deep learning, big data, artificial intelligence.

Speaker Links:

Section: Data Science, Machine Learning and AI
Type: Talks
Target Audience: Intermediate
Last Updated: