Tanay Agrawal
Tanay Agrawal is a Deep Learning Engineer, currently working at Curl HG. He specializes in Computer Vision and Deep Learning and has worked extensively on Hyperparameter Optimization. He has published a book on the subject, "Hyperparameter Optimization in Machine Learning", with Apress.
Sessions
The tutorial aims to introduce the audience to the power of Hyperparameter Optimization. It will show how, using simple Python libraries, one can make a huge difference in an ML model's behavior.
We start by understanding the importance of hyperparameters and the different distributions they are selected from. We then review some basic methods of optimizing hyperparameters, moving on to distributed methods and then to Bayesian optimization methods. We'll use these algorithms hands-on and play around with search spaces, trying out packages like Hyperopt, Dask, and Optuna to tune hyperparameters.
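As a taste of what the hands-on portion covers, here is a minimal sketch of a Bayesian-style tuning loop using Optuna. The dataset, model, and search-space ranges are illustrative assumptions, not taken from the tutorial material:

```python
import optuna
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score


def objective(trial):
    # Sample hyperparameters from the search space (ranges are illustrative)
    n_estimators = trial.suggest_int("n_estimators", 10, 200)
    max_depth = trial.suggest_int("max_depth", 2, 16)

    clf = RandomForestClassifier(
        n_estimators=n_estimators, max_depth=max_depth, random_state=0
    )
    X, y = load_iris(return_X_y=True)
    # Return the metric Optuna should maximize
    return cross_val_score(clf, X, y, cv=3).mean()


# Run 30 trials of Bayesian-style optimization and report the best setting
study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=30)
print(study.best_params)
```

The same objective-function pattern carries over to Hyperopt (via `fmin` with `tpe.suggest`) and to distributed runs with Dask, which the tutorial explores in more depth.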
This tutorial will help beginner-level ML practitioners and working professionals use these methods in their applied ML tasks. They will be able to enhance model quality and tune hyperparameters more effectively in large-scale experiments.
Prior Knowledge Expected - Basic Python, a very basic understanding of Machine Learning.
Good to have - experience with libraries like scikit-learn (just knowing model.fit() should be enough).