Source: Deep Learning on Medium

# Different Optimization Methods in Deep Learning

## A technical guide to train neural networks more efficiently

In this article we will go over different optimization techniques that can train neural networks faster and converge more reliably.

They are:

- Batch Gradient Descent
- Mini-Batch Gradient Descent
- Stochastic Gradient Descent
- Gradient Descent with Momentum
- Gradient Descent with RMSProp
- Adam

Note: each section includes a code snippet, and the full, comprehensive code appears at the end of the article.

## Batch Gradient Descent

Say we have “m” examples in our dataset. Batch gradient descent computes the gradient of the cost function over all m examples before making a single parameter update. In common usage it is often simply called gradient descent.
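As a minimal sketch of the idea (assuming a simple linear model with mean-squared-error loss; the function name and hyperparameters below are illustrative, not from the original article), each update uses the gradient averaged over the entire batch of m examples:

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.1, epochs=2000):
    """One parameter update per epoch, computed over all m examples."""
    m, n = X.shape
    w = np.zeros(n)
    b = 0.0
    for _ in range(epochs):
        y_hat = X @ w + b            # predictions on the full batch
        error = y_hat - y
        dw = (X.T @ error) / m       # gradient of MSE loss w.r.t. weights
        db = error.sum() / m         # gradient of MSE loss w.r.t. bias
        w -= lr * dw                 # step opposite the gradient
        b -= lr * db
    return w, b

# Usage: recover y = 2x + 1 from a tiny synthetic dataset
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
w, b = batch_gradient_descent(X, y)
```

Because every update touches the whole dataset, each step is expensive on large datasets but follows a smooth, low-noise descent direction.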