Hyperparameters

A hyperparameter is a variable that we need to set to a value before we can train a neural network. There are no magic numbers; the right values depend on the architecture, the data, the problem to solve, etc.

Optimizer Hyperparameters

Learning rate

The learning rate is the most important hyperparameter of all. Typical values are: 0.1, 0.01, 0.001, 0.0001, 0.00001 and…
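
As a minimal sketch of where this hyperparameter actually goes (assuming PyTorch, which these notes do not specify), the learning rate is simply a value chosen before training and passed to the optimizer; 0.01 below is just one of the typical candidates listed above, not a universally correct setting.

```python
import torch
import torch.nn as nn

# A small hypothetical model, only for illustration.
model = nn.Linear(10, 1)

# The learning rate is fixed before training starts; trying several of the
# typical values (0.1, 0.01, 0.001, ...) and comparing results is common.
learning_rate = 0.01
optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate)

# One training step: the optimizer scales each parameter update by the
# learning rate, which is why its value matters so much.
x, y = torch.randn(4, 10), torch.randn(4, 1)
loss = nn.functional.mse_loss(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```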