Source: Deep Learning on Medium
Video : Derivative of the Sigmoid Activation function
The sigmoid activation function is widely used in Deep Learning. It squashes any real-valued input into the range (0, 1) and is defined as σ(x) = 1 / (1 + e^(−x)).
Its derivative has advantageous properties, which is why it is widely used in neural networks.
The derivative of the sigmoid function can be written compactly in terms of the function itself: σ′(x) = σ(x) · (1 − σ(x)).
In this video, I’ll walk through each step of the derivation and discuss why this form of the derivative is generally preferred over other equivalent forms.
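As a quick sketch of why this form is convenient in practice, the snippet below (an illustrative example, not code from the video) computes the derivative directly from the sigmoid's own output, so no extra exponentials are needed during backpropagation:

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    # Uses the identity sigma'(x) = sigma(x) * (1 - sigma(x)):
    # the derivative is obtained from the already-computed forward output.
    s = sigmoid(x)
    return s * (1.0 - s)

print(sigmoid(0.0))             # 0.5
print(sigmoid_derivative(0.0))  # 0.25 (the derivative's maximum)
```

In a neural network, the forward pass already produces σ(x), so reusing it makes the backward pass essentially free for this layer.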
<iframe width="560" height="315" src="//www.youtube.com/embed/G6djH3I0rG0" frameborder="0" allowfullscreen></iframe>
I hope you found this video informative 😊