# Breakthrough of The Decade in Evolutionary Algorithms for Deep Neural Networks

Source: Deep Learning on Medium

# What is Gradient Descent?

Gradient descent is a first-order iterative optimization algorithm for finding the minimum of a function. To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the gradient of the function at the current point.

The gradient of a function gives the direction in which the function *increases* most rapidly. Hence the *negative* of the gradient gives the direction in which the function *decreases* most rapidly.
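As a minimal sketch of this idea, the loop below minimizes f(x) = (x − 3)², whose gradient is f′(x) = 2(x − 3); the function, starting point, and learning rate are illustrative choices, not from the article:

```python
# Minimal gradient-descent sketch on f(x) = (x - 3)^2,
# whose minimum is at x = 3. Hyperparameters are illustrative.

def grad(x):
    return 2.0 * (x - 3.0)  # analytic gradient of f

x = 0.0       # starting point
lr = 0.1      # learning rate (step size)
for _ in range(100):
    x -= lr * grad(x)  # step in the direction of the negative gradient

print(round(x, 4))
```

Each iteration moves x against the gradient, so x converges toward the minimum at 3.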

# What is Stochastic Gradient Descent?

Stochastic gradient descent (often abbreviated SGD) is an iterative method for optimizing an objective function with suitable smoothness properties (e.g. differentiable or subdifferentiable). It can be regarded as a stochastic approximation of gradient descent optimization, since it replaces the actual gradient (calculated from the entire data set) by an estimate thereof (calculated from a randomly selected subset of the data).
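A minimal sketch of this subset-based estimate, fitting a line y = 2x + 1 by least squares while computing the gradient from a random minibatch at each step; all names, sizes, and hyperparameters here are illustrative assumptions:

```python
import numpy as np

# Minimal SGD sketch: fit y = 2x + 1, estimating the gradient from a
# random minibatch each step instead of the full dataset.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=200)
y = 2.0 * X + 1.0            # noiseless targets for clarity

w, b = 0.0, 0.0
lr = 0.1
for step in range(1000):
    idx = rng.integers(0, len(X), size=16)   # random minibatch of 16 points
    xb, yb = X[idx], y[idx]
    err = (w * xb + b) - yb                  # prediction error on the batch
    w -= lr * (2.0 * err * xb).mean()        # stochastic gradient w.r.t. w
    b -= lr * (2.0 * err).mean()             # stochastic gradient w.r.t. b

print(round(w, 2), round(b, 2))
```

Each minibatch gradient is a noisy but cheap estimate of the full-dataset gradient, which is what makes SGD practical for the large datasets used to train deep neural networks.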