Advanced ML - On Improving Performance of Machine Learning Models by Optimizing Hyper-parameters
Objective and Summary:
We'll go step by step. Starting with what hyper-parameter optimization is, we'll implement a simple exhaustive search from scratch and do some exercises. After that we'll try Scikit-Learn's Grid Search and Random Search, and compare them with TPE, the more effective hyper-parameter optimization algorithm implemented in the Hyperopt library. We've included exercises for every part so that you get a good understanding of what you are doing. We'll also go through how to parallelize the evaluations using MongoDB, making the optimization even more effective, and discuss solutions to common difficulties faced while making it work.
A Docker image will be provided, so that participants won't have to waste time setting up the environment.
The Workflow and Content:
- We will start with a slide presentation so that participants get some insight into what they are going to do: 15 minutes
- We'll then set up the environment to work further. We know it's an easy task, but sometimes unexpected problems occur and participants struggle a lot later, while the workshop is going at its full pace, so we'll wait until everyone is ready: at least 10 minutes
After that we'll shift to Jupyter Notebooks, where you will implement the code and see the algorithms of hyper-parameter optimization in action. You can check out the notebooks on this GitHub repo.
- First we'll have a look at "Hyperparameter_optimization_good_and_bad_hps.ipynb". Here we'll see what hyper-parameters are and how important it is to choose good ones, all using the Scikit-Learn library. If an attendee is not familiar with Scikit-Learn, they will get the gist of it here. The notebook has blank spaces with comments where participants can try things for themselves: 15 minutes + 5 minutes (doubts)
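To give a flavour of that first notebook, here is an illustrative sketch (not the notebook's exact code): the same SVM classifier trained on the iris dataset with a poorly chosen vs. a reasonable value of the regularization hyper-parameter `C`.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# C controls regularization strength in an SVM: a tiny C under-fits badly.
bad = SVC(C=1e-4).fit(X_train, y_train)
good = SVC(C=1.0).fit(X_train, y_train)

print("bad  C=1e-4:", bad.score(X_test, y_test))
print("good C=1.0 :", good.score(X_test, y_test))
```

Running this shows a large accuracy gap between the two models, even though only a single hyper-parameter changed.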
- Now that we know the importance of hyper-parameter tuning, we'll look at one way of doing it (in "hyperparameter_tuning.ipynb"). It covers an implementation of grid search from scratch, which we'll try out on some datasets and algorithms: 20 minutes + 5 minutes (doubts)
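A minimal from-scratch grid search in the spirit of that notebook might look like this (the grid values and variable names here are illustrative, not the notebook's actual code):

```python
from itertools import product
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

grid = {"C": [0.01, 0.1, 1, 10], "gamma": [0.001, 0.01, 0.1, 1]}

best_score, best_params = -1.0, None
# Exhaustively evaluate every combination with 5-fold cross-validation.
for C, gamma in product(grid["C"], grid["gamma"]):
    score = cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5).mean()
    if score > best_score:
        best_score, best_params = score, {"C": C, "gamma": gamma}

print(best_params, round(best_score, 3))
```

Note how the cost grows multiplicatively with each hyper-parameter added: this grid already requires 16 full cross-validation runs, which motivates the smarter methods later in the workshop.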
- Now moving on to the Scikit-Learn library's tools for hyper-parameter tuning, GridSearchCV and RandomizedSearchCV ("using_scikitlearn.ipynb"). We'll look at examples and do some exercises: 20 minutes + 5 minutes (doubts)
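A sketch of the scikit-learn equivalents of the hand-written loop (parameter values here are illustrative):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
params = {"C": [0.01, 0.1, 1, 10], "gamma": [0.001, 0.01, 0.1, 1]}

# Grid search tries all 16 combinations...
gs = GridSearchCV(SVC(), params, cv=5).fit(X, y)
# ...while random search samples only n_iter of them.
rs = RandomizedSearchCV(SVC(), params, n_iter=8, cv=5,
                        random_state=42).fit(X, y)

print("grid  :", gs.best_params_, round(gs.best_score_, 3))
print("random:", rs.best_params_, round(rs.best_score_, 3))
```

Both estimators expose the winning configuration via `best_params_` and `best_score_`, and the fitted object can be used directly for prediction.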
- "Introduction to Hyperopt.ipynb", the last notebook in the session, covers how to use Hyperopt, and in particular how to build a search space in Hyperopt, which is where people generally struggle. We'll compare it with exhaustive methods like Grid Search: 25 minutes + 5 minutes (doubts)
We'll try to keep the workshop as interactive as possible.
- After that we'll explain some code that uses a Hyperopt wrapper for Keras to optimize hyper-parameters in neural networks: 10 minutes
- Also, we'll show a demo of a problem that we worked on and solved using Hyperopt while at MateLabs: 10 minutes
After attending this workshop you will be able to apply hyper-parameter optimization using better algorithms, ones that choose hyper-parameters based on previous evaluations instead of just brute force. In short: much more efficient model training.
So, better attend it! ;)
Attendees should know the basics of Python; a bit of familiarity with Scikit-Learn would certainly help.
Participants must also know what exactly machine learning is and why we use it. It would be even better if you know the basic kinds of algorithms used (like Linear Regression, Decision Trees, Support Vector Machines) and how we choose between them for different problem statements in machine learning.
Slides can be seen here: https://slides.com/tanayagrawal/efficient-hyperparameter-optimization#/
Full content for workshop is available here: https://github.com/tanayag/tutorial_hyperparameter_optimization
In the repo:
- iris.csv is the dataset that we'll work on.
- The docker folder contains the scripts to set up the environment.
- In "Hyperparameter_optimization_good_and_bad_hps.ipynb" we'll see the importance of hyper-parameter tuning, with examples and exercises.
- In "hyperparameter_tuning.ipynb" we'll use Python loops to tune hyper-parameters, and do some exercises as well.
- Then we have another notebook, "using_scikitlearn.ipynb", where we'll use the scikit-learn library for hyper-parameter tuning and make our life easier.
- "Introduction to Hyperopt.ipynb" is an IPython Notebook covering the use of advanced hyper-parameter tuning algorithms.
- "link_to_slides.txt" contains the link to our presentation.
Note: We'll also work on the Boston dataset, but load it from sklearn.
You can also have a look at my article discussing Hyperopt, which received a good response in the data science community and helped lots of data scientists: https://blog.goodaudience.com/on-using-hyperopt-advanced-machine-learning-a2dde2ccece7
Working on Machine Learning/Deep Learning, and also an open source enthusiast. I am currently working at Curl Analytics as a Deep Learning Researcher on the OCR engine SARA. Last summer I worked as a Deep Learning Intern at MateLabs, where we were creating meta-algorithms so that users with minimal or no knowledge of machine learning would be able to use the platform. I have also been a contributor to SymPy. I have previously conducted a workshop at SFD-SMVDU and regularly ran the AI Circle sessions at my college.
Currently working at Exzeo as a Deep Learning Engineer; graduated this year from IIIT Guwahati. I previously worked at MateLabs as a Machine Learning Intern, where we were building Mateverse, an AutoML platform based on meta-learning intended to help the data science community. There we worked on the problem of hyper-parameter optimization and optimized the existing pipeline. I have been a constant contributor in the open source world. I worked at Cadence Design Systems in the summer of 2017 as a Software Development Intern.
If you want to know more about my work, check my resume here
Links to some of my blogs: