How To Avoid Overfitting [For Beginners] [Deep Learning]


The best way to avoid overfitting is to avoid deep learning altogether and use a random forest instead. But why a random forest? Doesn't a random forest overfit? Of course it does, but if you increase the number of trees, overfitting tends to decrease.
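As a quick illustration (not from the original post, and using scikit-learn rather than Keras), here is a minimal sketch that compares train and test accuracy of a random forest as the number of trees grows. The dataset is synthetic and the layer of detail is only meant to show the tendency, not a benchmark.

# Sketch: more trees usually means lower variance, hence less overfitting.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification data, 20 features (illustrative only).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for n_trees in (10, 100, 500):
    forest = RandomForestClassifier(n_estimators=n_trees, random_state=0)
    forest.fit(X_train, y_train)
    # The gap between train and test accuracy tends to shrink as n_trees grows.
    print(n_trees, forest.score(X_train, y_train), forest.score(X_test, y_test))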

Neural networks tend to overfit more than classical machine learning algorithms. So what are some simple techniques a beginner can use to prevent overfitting when training a neural network?

Fewer layers, fewer neurons => less overfitting

If you simplify your model by decreasing the number of layers and the number of neurons per layer, you reduce its capacity to memorize the training data, so overfitting decreases; the trade-off is that prediction accuracy can also drop.
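A minimal Keras sketch of such a deliberately small network, assuming a 20-feature binary classification task (the layer sizes here are just illustrative choices, not a recommendation):

from tensorflow import keras
from tensorflow.keras import layers

# A small model: two hidden layers with few neurons each.
small_model = keras.Sequential([
    keras.Input(shape=(20,)),
    layers.Dense(16, activation="relu"),
    layers.Dense(8, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
small_model.compile(optimizer="adam",
                    loss="binary_crossentropy",
                    metrics=["accuracy"])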

Add dropout layers to generalize your model

A dropout layer randomly drops a fraction of the units (and their connections) between layers during training. Because the network cannot rely on any particular neuron always being present, it is forced to learn more robust features, which reduces overfitting.
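A minimal Keras sketch of the same idea, again assuming a 20-feature binary classification task: a Dropout layer with a rate of 0.5 is placed after each hidden Dense layer (the rate and layer sizes are illustrative assumptions).

from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(20,)),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.5),   # randomly zero 50% of activations during training
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])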

Stop training earlier than the planned number of epochs

Stop the training process as soon as the validation loss starts to rise, even though the training loss is still falling; that widening gap is the signature of overfitting.
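A minimal sketch using the Keras EarlyStopping callback, assuming the model and the X_train / y_train data from the sketches above: training halts once the validation loss has stopped improving for a few epochs, and the best weights seen so far are restored.

from tensorflow.keras.callbacks import EarlyStopping

early_stop = EarlyStopping(
    monitor="val_loss",         # watch the validation loss
    patience=5,                 # allow 5 epochs without improvement
    restore_best_weights=True,  # roll back to the best epoch
)

model.fit(
    X_train, y_train,
    validation_split=0.2,
    epochs=200,                 # an upper bound; training usually stops earlier
    callbacks=[early_stop],
)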

Conclusion

These are just the basic techniques for preventing your first neural network from overfitting. You can try all of them easily with Keras.