Understanding Wide and Deep Learning based Recommendation System

Recommendation systems are algorithms that suggest relevant items (videos, images, text, products, etc.) to users. They can be a vital competitive edge for applications such as Netflix, Amazon Prime, YouTube, Google Play, and Pinterest: these services have enormous catalogs of items that need to reach the right users, and accurate recommendations directly improve the user experience.

Introduction

The Wide and Deep neural network, initially introduced for app recommendation in Google Play, combines the benefits of memorization and generalization. Generalized linear models with nonlinear feature transformations are widely used for large-scale regression and classification problems with sparse inputs: memorization of feature interactions through a wide set of cross-product feature transformations is effective and interpretable, but extending such models to unseen feature combinations requires substantial feature engineering effort. A deep component addresses this: using an MLP over learned feature representations is straightforward and highly efficient, and it generalizes to unseen feature combinations with far less manual feature engineering.

The wide learning component is a single-layer perceptron, which can also be regarded as a generalized linear model. The deep learning component is a multi-layer perceptron (MLP). Combining these two learning techniques enables the recommender to capture both memorization and generalization.

Single Layer Perceptron

The perceptron consists of 4 parts.

Input values, or one input layer: The input layer of the perceptron is made of artificial input neurons and takes the initial data into the system for further processing.

Weights and Bias:

Weight: It represents the strength of the connection between units. If the weight from node 1 to node 2 is larger, then neuron 1 has a greater influence on neuron 2.

Bias: It plays the same role as the intercept in a linear equation. It is an additional parameter whose task is to shift the output produced from the weighted sum of the inputs to the next neuron.

Net sum: The weighted sum of all the inputs plus the bias.

Activation Function: Whether a neuron is activated or not is determined by the activation function, which is applied to the net sum (the weighted sum plus the bias) to produce the neuron's output.

In the first step, all the inputs x are multiplied by their weights w.

In the second step, all the weighted inputs are added together; this total is called the weighted sum.

In the last step, the weighted sum is passed through an appropriate activation function, as in the sketch below.
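As a concrete illustration, here is a minimal NumPy sketch of these three steps; the input values, weights, bias, and the choice of a step activation are all illustrative assumptions, not part of the Wide and Deep paper:

```python
import numpy as np

def step_activation(z):
    # Fire (1) if the net sum is non-negative, otherwise output 0.
    return 1 if z >= 0 else 0

def perceptron(x, w, b):
    # Steps 1 and 2: multiply inputs by their weights, sum them, and add the bias.
    net_sum = np.dot(w, x) + b
    # Step 3: pass the net sum through the activation function.
    return step_activation(net_sum)

# Toy example with hand-picked weights and bias.
x = np.array([1.0, 0.0])
w = np.array([0.5, 0.5])
b = -0.25
print(perceptron(x, w, b))  # -> 1
```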

Multi-Layer Perceptron

A multilayer perceptron (MLP) is a feed-forward artificial neural network that generates a set of outputs from a set of inputs. An MLP is characterized by several layers of nodes connected as a directed graph between the input and output layers. MLPs use backpropagation to train the network.
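Under the same kind of illustrative assumptions (the layer sizes, ReLU hidden units, and sigmoid output here are arbitrary choices for the sketch), a forward pass through a small MLP looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Randomly initialized weights: input (3) -> hidden (4) -> output (1).
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

def mlp_forward(x):
    h = relu(W1 @ x + b1)        # hidden layer
    return sigmoid(W2 @ h + b2)  # output layer

print(mlp_forward(np.array([0.2, -0.1, 0.5])))
```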

Backpropagation

Back-propagation is the essence of neural net training. It is the method of fine-tuning the weights of a neural net based on the error rate obtained in the previous epoch (i.e., iteration). Proper tuning of the weights allows you to reduce error rates and to make the model reliable by increasing its generalization.

Backpropagation is short for “backward propagation of errors.” It is the standard method of training artificial neural networks: it computes the gradient of the loss function with respect to all the weights in the network.
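In its simplest form (vanilla gradient descent), each weight is then adjusted against its gradient,

w_new = w_old − η · ∂L/∂w,

where η is the learning rate and L is the loss function; the steps below describe where these gradients come from.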

1. Inputs X arrive through the preconnected path.

2. The input is modelled using real weights W; the weights are usually randomly initialized.

3. Calculate the output for every neuron from the input layer, to the hidden layers, to the output layer.

4. Calculate the error in the outputs:

Error = Actual Output − Desired Output

5. Travel back from the output layer to the hidden layers to adjust the weights such that the error is decreased.

Keep repeating the process until the desired output is achieved; a runnable sketch of this training loop follows below.
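Here is a minimal NumPy sketch of the loop, training a tiny network on a toy OR task with a squared-error-style update; the data, layer sizes, learning rate, and epoch count are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: learn the logical OR of two binary inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [1]], dtype=float)

# Step 2: the weights are randomly initialized.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output
lr = 0.5

for epoch in range(2000):
    # Steps 1-3: forward pass, layer by layer.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Step 4: error in the outputs.
    err = out - y

    # Step 5: travel back, adjusting weights so the error decreases.
    d_out = err * out * (1 - out)           # gradient at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)      # gradient at the hidden layer
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))  # converges toward the desired outputs
```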

Wide and Deep Learning Model

The wide and deep learning model is obtained by fusing the two components: the output log odds of the wide linear model and of the deep MLP are combined into a single prediction, which is fed to one common logistic loss function so that both parts are trained jointly rather than as a separate ensemble.
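A minimal Keras sketch of this fusion, assuming illustrative feature dimensions and layer sizes (the actual Google Play model used cross-product transformations on the wide side and learned embeddings on the deep side):

```python
import tensorflow as tf

# Wide input: sparse cross-product feature transformations, represented
# here as an already-encoded vector. Deep input: dense features/embeddings.
wide_in = tf.keras.Input(shape=(100,), name="wide_features")
deep_in = tf.keras.Input(shape=(32,), name="deep_features")

# Deep component: a standard MLP over the dense features.
x = tf.keras.layers.Dense(64, activation="relu")(deep_in)
x = tf.keras.layers.Dense(32, activation="relu")(x)

# Fusion: concatenate both signals into a single logistic output, so the
# final Dense layer learns how to weight the wide and deep contributions
# and both components are trained jointly against one loss.
combined = tf.keras.layers.concatenate([wide_in, x])
out = tf.keras.layers.Dense(1, activation="sigmoid")(combined)

model = tf.keras.Model(inputs=[wide_in, deep_in], outputs=out)
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```

Concatenating into a single logistic output layer lets the last Dense layer learn the relative weighting of the wide and deep signals, which mirrors the joint-training recipe of summing the two components' output log odds before the loss.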