A didactic tool to show how hyperparameter tuning works for an MLP

goelakash/Hyperparameter-Tuning-With-Voila


Experiment with hyperparameter tuning for a neural net

This project demonstrates how tuning different hyperparameters of a neural network changes its accuracy and training time. It is built with the ipywidgets library for Jupyter notebooks, and uses Voila to render the notebook as an interactive web page with the code hidden.

The example network takes the vectorized MNIST dataset as input and passes it through three hidden layers to predict the class of the input image (a digit from 0 to 9), trained with a cross-entropy loss function.
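The README does not reproduce the model code, but the architecture described above can be sketched in plain NumPy. The layer sizes and initialization here are assumptions for illustration, not taken from the notebook:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed layer sizes: 784 inputs (28x28 MNIST images, flattened),
# three hidden layers, and 10 output classes (digits 0-9).
sizes = [784, 128, 64, 32, 10]
weights = [rng.normal(0.0, 0.01, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    """Forward pass: ReLU on the hidden layers, softmax on the output."""
    for W, b in zip(weights[:-1], biases[:-1]):
        x = np.maximum(0.0, x @ W + b)
    logits = x @ weights[-1] + biases[-1]
    # Subtract the row max before exponentiating for numerical stability.
    shifted = np.exp(logits - logits.max(axis=1, keepdims=True))
    return shifted / shifted.sum(axis=1, keepdims=True)

def cross_entropy(probs, labels):
    """Mean negative log-probability assigned to the true classes."""
    return float(-np.log(probs[np.arange(len(labels)), labels]).mean())

# Random stand-ins for a batch of vectorized MNIST images and labels.
batch = rng.random((4, 784))
labels = np.array([3, 1, 4, 1])
probs = forward(batch)
loss = cross_entropy(probs, labels)
```

Each row of `probs` is a distribution over the 10 digit classes; the hyperparameters the dashboard exposes (layer sizes, learning rate, epochs, and so on) all feed into a loop that trains weights like these.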

Using Voila, the notebook is rendered as a web page with live widgets, so you can play around with the hyperparameter inputs and see the resulting output without touching any code.
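The widget wiring is not shown in this README; a minimal sketch of how `ipywidgets.interact` can expose hyperparameters as sliders follows. The `train` stub and the slider ranges are hypothetical, not copied from the notebook:

```python
import ipywidgets as widgets
from ipywidgets import interact

# Hypothetical stand-in for the notebook's training routine: the real
# notebook would train the MLP and report accuracy and time taken.
def train(learning_rate=0.01, hidden_units=128, epochs=5):
    return f"lr={learning_rate}, hidden={hidden_units}, epochs={epochs}"

# interact() builds one widget per keyword argument; in Jupyter (or on the
# Voila-rendered page) moving a slider re-runs train() with the new values.
interact(
    train,
    learning_rate=widgets.FloatLogSlider(value=0.01, base=10, min=-4, max=-1),
    hidden_units=widgets.IntSlider(value=128, min=16, max=512, step=16),
    epochs=widgets.IntSlider(value=5, min=1, max=50),
)
```

Voila serves exactly this notebook output: the widgets stay interactive while the source cells are stripped away.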

Get Started:

To get this project working as intended, install the dependencies (Voila and ipywidgets, plus the libraries the notebook imports), then run the following from the repository root:

```
voila MNIST_widgets.ipynb
```

Demo:

Here's what the dashboard looks like:

[screenshot of the dashboard]
