Cornerstones of AI: 5 Deep Learning Papers You Must Read


The Dropout Layer Paper

“Dropout: A Simple Way to Prevent Neural Networks from Overfitting”

The dropout layer is now a standard layer in almost every successful deep convolutional neural network. During training, the dropout layer randomly masks a certain percentage of the neurons in a layer, forcing the model to generalize and represent more information in fewer connections.
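To make the mechanism concrete, here is a minimal NumPy sketch of "inverted" dropout, the variant most frameworks implement today: it rescales the surviving activations at training time, whereas the original paper instead scales the weights at test time (the two are equivalent in expectation). The function name and signature below are illustrative, not from the paper, and `p` here is the drop probability as in most frameworks (the paper uses p for the retention probability).

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p while training.

    Surviving activations are scaled by 1/(1-p) so the expected value of
    each unit matches the test-time (no-dropout) forward pass.
    """
    if not training or p == 0.0:
        return x
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(x.shape) >= p   # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

# Roughly half the activations are zeroed (and the rest doubled) in training;
# at test time the input passes through unchanged.
h = np.ones((2, 8))
print(dropout(h, p=0.5, training=True))
print(dropout(h, p=0.5, training=False))
```

In modern frameworks this is a one-liner, e.g. `torch.nn.Dropout(p=0.5)` in PyTorch.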

This incredibly simple idea has been shown to drastically reduce overfitting and is a core component of successful convolutional neural networks like AlexNet and VGG16.


Written by Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever, and Ruslan Salakhutdinov, and published in the Journal of Machine Learning Research in 2014, the original Dropout paper has been cited over 2,000 times. The 27-page paper (not counting references and acknowledgements) describes several variants of the dropout technique and evaluates their performance on a range of datasets.

The paper can be accessed here.