Understanding Activation Functions for Beginners

Source: Deep Learning on Medium

Below you will see code snippets for multiple activation functions in TensorFlow.

Here I’ve created two variables: W, the weights for two neurons with the values 0.4172 and -1.74, and b, the bias that gets added to the result of the matrix multiplication to produce one output.

Creating a Simple Feed-Forward Network Without an Activation Function

I have created a variable X holding two random values, 0.85 and -0.3074. These values are matrix-multiplied with the weights, 0.4172 and -1.74. The network then returns the output [0.8896], converted into a NumPy array.

The output value is stored in the out variable so that it can be used later.
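The forward pass above can be sketched in plain Python (the article's snippets use TensorFlow, but for two values the matrix multiplication reduces to a dot product; the bias is assumed to be zero here, since the text does not state its value):

```python
# Inputs, weights, and bias from the walkthrough above.
X = [0.85, -0.3074]   # two input values
W = [0.4172, -1.74]   # one weight per input neuron
b = 0.0               # bias (assumed ~0; the text doesn't give its value)

# A 1x2 input times a 2x1 weight vector is a dot product,
# then the bias is added to give a single output.
out = sum(x * w for x, w in zip(X, W)) + b
print(out)  # ≈ 0.8895, matching the [0.8896] reported above (rounding)
```

Without an activation function, this output is just a linear combination of the inputs; the activation functions below transform it further.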

Feed-Forward Network with Activation Functions

Sigmoid Activation Function

Range: (0, 1)

Used in models where the output is binary, for example predicting whether an image shows a cat or a dog.
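A minimal sketch of the sigmoid, 1 / (1 + e^(-x)), in plain Python (the article's TensorFlow snippets are not shown here; the math is identical):

```python
import math

def sigmoid(x):
    # Squashes any real number into the open interval (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0.0))     # 0.5, the midpoint
print(sigmoid(0.8895))  # the feed-forward output above, squashed into (0, 1)
```

Because the output lands between 0 and 1, it can be read as the probability of the positive class (e.g. "cat").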

TanH or Hyperbolic Tangent Function

Range: (-1, 1)

Also used for classification outputs such as cat vs. dog; unlike the sigmoid, its output is centred on zero.
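tanh is available directly in Python's math module; a quick sketch showing the (-1, 1) range:

```python
import math

for x in (-5.0, 0.0, 5.0):
    # Outputs stay strictly inside (-1, 1) and are centred on 0,
    # unlike the sigmoid, which is centred on 0.5.
    print(x, math.tanh(x))
```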

ReLU or Rectified Linear Unit Function

Range: [0, +∞). It is one of the most widely used activation functions; if you are not sure which one to choose, ReLU is a safe default. For positive inputs it does not saturate, meaning its output does not converge toward a fixed ceiling the way sigmoid and tanh do.
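ReLU is simply max(0, x); a plain-Python sketch (TensorFlow exposes the same operation as tf.nn.relu):

```python
def relu(x):
    # Zero for negative inputs, identity for positive ones;
    # no upper bound, so large activations pass through unsquashed.
    return max(0.0, x)

print([relu(v) for v in (-1.74, 0.0, 0.8895)])  # [0.0, 0.0, 0.8895]
```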

Softmax Activation Function

Range: (0, 1)

It returns a vector of probabilities, one entry per class, and the entries of the vector sum to 1. It is used for multi-class classification.
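A sketch of softmax in plain Python, with the class scores (logits) chosen here purely for illustration (TensorFlow provides this as tf.nn.softmax):

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability, exponentiate,
    # then normalise so the outputs sum to 1.
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])  # example scores for three classes
print(probs)       # one probability per class, largest score -> largest probability
print(sum(probs))  # 1.0 (up to floating-point rounding)
```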

Conclusion

As this tutorial is aimed at beginners, I hope it clears up some of the common confusion about choosing an activation function when building a network.

Clap it, share it, follow it, spread knowledge…..