Linear Regression and its Mathematical Implementation

Original article was published on Deep Learning on Medium


Let's continue our learning.

To understand linear regression, first we should know:

What is Regression?

Regression is a method of modelling a target value based on independent predictors. It is mostly used for forecasting and for finding cause-and-effect relationships between variables.

What is LINEAR Regression?

  • Linear Regression is a machine learning algorithm.
  • It is based on supervised learning.
  • Linear regression is a type of regression analysis in which the independent variable(s) (x) and the dependent variable (y) have a linear relationship.

TYPES OF LINEAR REGRESSION

Linear Regression is generally classified into two types:
1- Simple Linear Regression.
2- Multiple Linear Regression.

SIMPLE LINEAR REGRESSION

Simple linear regression is used to estimate the relationship between two quantitative variables.

Mathematical formula for simple linear regression:

yp = c·x + b

where
yr = dependent variable/response vector/output (the true value)
yp = predicted response
x = independent variable/feature vector/input
b = bias/weight/intercept
c = coefficient of x

By finding the best values of b and c, the model obtains the best-fit regression line.

The above figure shows the best-fit line for the “SalaryData” dataset.

In this dataset we have two columns, YearsExperience and Salary, and our linear model predicts the salary as output when given years of experience as input.
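The best-fit line described above can be sketched with the closed-form least-squares solution for c and b. The salary figures below are made-up toy numbers standing in for a “SalaryData”-style dataset, not the article's actual data.

```python
def fit_simple_linear(x, y):
    """Return (c, b) so that y ≈ c * x + b, by ordinary least squares."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    # Slope c = covariance(x, y) / variance(x)
    c = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
        sum((xi - mean_x) ** 2 for xi in x)
    b = mean_y - c * mean_x  # intercept follows from the means
    return c, b

years = [1.0, 2.0, 3.0, 4.0, 5.0]        # YearsExperience (toy values)
salary = [40.0, 50.0, 60.0, 70.0, 80.0]  # Salary in thousands (toy values)

c, b = fit_simple_linear(years, salary)
print(c, b)  # slope and intercept of the best-fit line
```

For this toy data the relationship is exactly salary = 10·years + 30, so the fitted c and b recover those values.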

Multiple Linear Regression

In multiple linear regression we have more than one feature vector, i.e. two or more independent variables.

The multiple regression equation takes the following form:

y = b1x1 + b2x2 + … + bnxn + c.

The goal of regression is to determine the values of the weights b₁, b₂, …, bₙ (and the intercept c) such that the predicted y is as close as possible to the actual response.
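The prediction equation above is just a weighted sum of the inputs plus the intercept. A minimal sketch, with arbitrary illustrative weights and inputs:

```python
def predict(weights, features, intercept):
    """Compute y = b1*x1 + b2*x2 + ... + bn*xn + c."""
    return sum(b * x for b, x in zip(weights, features)) + intercept

weights = [2.0, 3.0]   # b1, b2 (illustrative)
features = [1.0, 4.0]  # x1, x2 (illustrative)
intercept = 5.0        # c

y = predict(weights, features, intercept)
print(y)  # 2*1 + 3*4 + 5 = 19.0
```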

COST FUNCTION:
By achieving the best-fit regression line, the model aims to predict y values such that the difference between the predicted and true values is minimal.
So it is very important to update the b and c values to reach the best values that minimize the error between the predicted and true y values.
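One common concrete choice for such a cost function is the mean squared error (MSE): the average squared difference between predicted and true y values. A minimal sketch with toy numbers:

```python
def mse(y_true, y_pred):
    """Mean squared error between true and predicted responses."""
    n = len(y_true)
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n

y_true = [3.0, 5.0, 7.0]  # toy true values
y_pred = [2.0, 5.0, 9.0]  # toy predictions
cost = mse(y_true, y_pred)
print(cost)  # ((-1)**2 + 0**2 + 2**2) / 3 = 5/3
```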

GRADIENT DESCENT:

The process of optimizing the values of the coefficients by iteratively minimizing the error of the model on your training data is called Gradient Descent. The sum of the squared errors is calculated for each pair of input and output values. A learning rate is used as a scale factor, and the coefficients are updated in the direction that minimizes the error. The process is repeated until a minimum sum of squared errors is achieved or no further improvement is possible.
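The procedure just described can be sketched for simple linear regression y = c·x + b, minimizing the mean squared error. The data, learning rate, and iteration count below are illustrative choices, not values from the article.

```python
def gradient_descent(x, y, lr=0.05, epochs=5000):
    """Fit y ≈ c*x + b by iteratively stepping against the MSE gradient."""
    c, b = 0.0, 0.0  # start from zero coefficients
    n = len(x)
    for _ in range(epochs):
        # Errors of the current predictions against the true values.
        errors = [(c * xi + b) - yi for xi, yi in zip(x, y)]
        # Gradients of the mean squared error w.r.t. c and b.
        grad_c = (2.0 / n) * sum(e * xi for e, xi in zip(errors, x))
        grad_b = (2.0 / n) * sum(errors)
        # Step in the direction that reduces the error, scaled by lr.
        c -= lr * grad_c
        b -= lr * grad_b
    return c, b

x = [1.0, 2.0, 3.0, 4.0]
y = [3.0, 5.0, 7.0, 9.0]  # exactly y = 2x + 1
c, b = gradient_descent(x, y)
print(round(c, 3), round(b, 3))  # converges toward 2 and 1
```

With a learning rate that is too large the updates overshoot and diverge; with one that is too small, convergence is needlessly slow. This trade-off is why the learning rate is described above as a scale factor.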

This brings us to the end of the article.

SUMMARY:

In this article I have tried to explain:

  • What is Regression?
  • What is LINEAR Regression?
  • TYPES OF LINEAR REGRESSION
  • SIMPLE LINEAR REGRESSION
  • MULTIPLE LINEAR REGRESSION
  • Mathematical implementation of Linear Regression.
  • What is Cost Function?
  • What is Gradient Descent?

THANKS FOR YOUR VALUABLE TIME