Original article can be found here (source): Deep Learning on Medium
Neuron in a Neural Network
What is a Neuron and How Does It Work?
The neuron is the basic working unit of the brain, a specialized cell designed to transmit information to other nerve cells, muscle, or gland cells.
Within an artificial neural network, a neuron, also known as a perceptron, is a mathematical function that models the behavior of a biological neuron. Typically, a neuron computes a weighted sum of its inputs, and this sum is passed through a nonlinear function, often called an activation function, such as the sigmoid, ReLU, or tanh. The weights and the bias of a perceptron can be learned using the Perceptron Learning Rule (PLR) or the Delta Rule.
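To make the idea concrete, here is a minimal sketch of a perceptron trained with the Perceptron Learning Rule on a toy AND dataset. The learning rate, epoch count, and the AND task itself are illustrative choices, not prescribed by the article:

```python
import numpy as np

def step(z):
    # Step activation: fire (1) if the combined input exceeds 0, else stay off (0)
    return 1 if z > 0 else 0

def train_perceptron(X, y, lr=0.1, epochs=20):
    # Perceptron Learning Rule: for each sample,
    #   w <- w + lr * (target - prediction) * x
    #   b <- b + lr * (target - prediction)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = step(np.dot(w, xi) + b)
            error = target - pred
            w += lr * error * xi
            b += lr * error
    return w, b

# Toy dataset: logical AND (linearly separable, so the PLR converges)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
preds = [step(np.dot(w, xi) + b) for xi in X]
print(preds)  # -> [0, 0, 0, 1]
```

Because AND is linearly separable, the rule converges to weights that classify all four inputs correctly; for non-separable data (such as XOR) a single perceptron cannot succeed.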
In a biological neuron, an electrical signal is emitted as output when the neuron receives a sufficiently strong input. To map that behavior onto the mathematical neuron, we need a function that operates on the sum of the inputs multiplied by their corresponding weights (denoted as f(z) in the following visual) and responds with an appropriate value: if a higher-influence input is received, the output should be higher, and vice versa. This is analogous to an activation signal (i.e., higher influence -> activate, otherwise deactivate). The function that operates on the combined input is called the activation function.
An activation function takes the combined input as shown in the preceding illustration, applies a function to it, and passes the output value onward, thereby mimicking the activate/deactivate behavior. The activation function therefore determines the state of a neuron by transforming its combined input.
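The activation functions named above can be sketched in a few lines; the check at the end illustrates the point that a stronger combined input z yields a stronger activation (the sample values -2 and 2 are arbitrary):

```python
import numpy as np

def sigmoid(z):
    # Squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Passes positive input through unchanged, zeroes out negative input
    return np.maximum(0.0, z)

def tanh(z):
    # Squashes input into the range (-1, 1), centered at zero
    return np.tanh(z)

# A higher-influence combined input produces a higher output
z_weak, z_strong = -2.0, 2.0
for f in (sigmoid, relu, tanh):
    assert f(z_strong) > f(z_weak)
    print(f.__name__, float(f(z_weak)), float(f(z_strong)))
```

All three are monotonically non-decreasing, which is what lets the neuron respond more strongly to more influential inputs; they differ in output range and in how they behave for negative inputs.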