Advanced ML: Learn how to Improve Accuracy by optimizing Hyper-Parameters using Hyperopt





Hands-on Experience with Advanced Hyper-parameter Optimization Techniques, Using Hyperopt

We'll go step by step. Starting with what hyper-parameter optimization is, we'll implement a simple exhaustive search from scratch and do some exercises. After that we'll try scikit-learn's Grid Search and compare it with the more effective TPE algorithm for hyper-parameter optimization implemented in Hyperopt. We've included exercises for every part so that you get a good understanding of what you are doing. We'll also go through how to parallelize the evaluations using MongoDB, making the optimization even more efficient.
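The from-scratch exhaustive search above can be sketched with plain Python loops. This is only an illustrative sketch: the objective here is a toy quadratic standing in for a model's validation loss, and the hyper-parameter names are made up for the example.

```python
import itertools

# Toy stand-in for a model's validation loss; in the workshop this would be
# the score of a model trained with the given hyper-parameters.
def validation_loss(learning_rate, n_estimators):
    return (learning_rate - 0.1) ** 2 + (n_estimators - 100) ** 2 / 10000

# Candidate values for each hyper-parameter.
grid = {
    "learning_rate": [0.01, 0.05, 0.1, 0.5],
    "n_estimators": [50, 100, 200],
}

best_loss, best_params = float("inf"), None
# Try every combination exhaustively and keep the best one.
for lr, n in itertools.product(grid["learning_rate"], grid["n_estimators"]):
    loss = validation_loss(lr, n)
    if loss < best_loss:
        best_loss, best_params = loss, {"learning_rate": lr, "n_estimators": n}

print(best_params)  # {'learning_rate': 0.1, 'n_estimators': 100}
```

The downside this makes obvious: the number of evaluations grows multiplicatively with each hyper-parameter added, which is what motivates smarter search algorithms like TPE.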

A Docker image will be provided, so that participants won't have to waste time setting up the environment.

The workflow of the workshop will be:

  • We will start with a slide presentation to give participants some insight into what they are going to do.
  • After that we'll shift to Jupyter Notebooks (pre-installed in the Docker environment, so you can focus on the implementation part). Here participants will implement the code and see the best hyper-parameter optimization algorithms working.
  • After that we'll show a working demo of a problem that we solved using Hyperopt during our summer internship at MateLabs.

After attending this workshop you will be able to apply hyper-parameter optimization using better algorithms, which decide the hyper-parameters based on information from previous evaluations. In short, much more efficient model training.


Prerequisites:

Basic Python coding and a little familiarity with Machine Learning/Data Science.

Content URLs:

Slides can be seen here:

Full content is available here:

You can also have a look at my article:

In the repo:

  • iris.csv is the dataset that we'll work on.
  • The docker folder contains the scripts to set up the environment.
  • In "Hyperparameter_optimization_good_and_bad_hps.ipynb" we'll see the importance of hyper-parameter tuning, with examples and exercises.
  • In "hyperparameter_tuning.ipynb" we'll use Python loops to tune hyper-parameters and do some exercises as well.
  • Then we have another notebook, "using_scikitlearn.ipynb", where we'll use the scikit-learn library for hyper-parameter tuning and make our lives easier.
  • "Introduction to Hyperopt.ipynb" is an IPython notebook that covers the use of advanced hyper-parameter tuning algorithms.
  • "link_to_slides.txt" contains the link to our presentation.

Note: We'll also work on the Boston dataset, but we'll load it from scikit-learn.
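For the scikit-learn part, a minimal sketch of what Grid Search looks like, loading the iris data from scikit-learn here rather than from iris.csv for self-containment (the model and parameter choices are illustrative, not the notebook's exact ones):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Grid of candidate hyper-parameters for an SVM classifier.
param_grid = {"C": [0.1, 1, 10], "gamma": ["scale", 0.01, 0.1]}

# GridSearchCV tries every combination with 5-fold cross-validation
# and keeps the best-scoring one.
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)
print(round(search.best_score_, 3))
```

Compared with hand-written loops, GridSearchCV handles the cross-validation, scoring, and bookkeeping for us, which is exactly the "make our lives easier" step of the workshop.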

Speaker Info:

Tanay Agrawal

Works on Machine Learning/Deep Learning and is an open source enthusiast, currently in the final year of his engineering degree. He is working as a Deep Learning Intern at MateLabs, where he and the team are creating meta-algorithms so that users with minimal or no knowledge of Machine Learning can use them. He is also a contributor to SymPy and has previously worked on state-of-the-art classification and object detection models. He has previously conducted a Python workshop at SFD-SMVDU and regularly conducts the AI Circle sessions at his college.

Anubhav Kesari

Currently in the final year of engineering at IIIT Guwahati. The two of them worked on the same problem and solved it using Hyperopt. Anubhav is a summer intern at MateLabs as well. He worked at Cadence Design Systems in the summer of 2017 as a Software Development Intern. He has also been working on the development of blockchain-based distributed neural networks at MateLabs.

Id: 867
Section: Data science
Type: Workshops
Target Audience: Intermediate
Last Updated: