Original article was published by /u/canarysplit on Deep Learning
I’m following a tutorial that explains Gradient Descent, and I’ve reached the part where the derivative of the sum of the squared residuals is worked out. I’ve plotted the quadratic function that represents the sum of the squared residuals for different intercepts. Source: https://www.youtube.com/watch?v=sDv4f4s2SB8 (9:50)
However, when I try to plot the same line they show, which is a linear function, I can’t reproduce it.
Image link – https://i.stack.imgur.com/FTsro.png
When you look at the derivative in the image, the intercept, which is 0, has already been plugged in. If instead I multiply everything out and sum the terms before substituting, I get this equation: y = 6x − 5.704. Then, when x (the intercept) = 0, y = −5.704, just like in the image. However, when I plot this function I don’t get the same tangent line as in the image.
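For reference, here is how I’m computing it. The three data points and the fixed slope of 0.64 are my reconstruction from the video (they are what produce the equation y = 6x − 5.704), so treat them as assumptions:

```python
import numpy as np

# Assumed data points from the video (weights x, heights y), slope fixed at 0.64
x = np.array([0.5, 2.3, 2.9])
y = np.array([1.4, 1.9, 3.2])
slope = 0.64

def ssr(intercept):
    """Sum of squared residuals for a given intercept."""
    return np.sum((y - (intercept + slope * x)) ** 2)

def ssr_derivative(intercept):
    """d(SSR)/d(intercept) = -2 * sum(y - (intercept + slope*x)),
    which simplifies to 6*intercept - 5.704 for these points."""
    return -2 * np.sum(y - (intercept + slope * x))

print(ssr_derivative(0.0))  # ≈ -5.704, matching the value in the image
```

So the derivative at intercept = 0 comes out to −5.704 as expected, yet plotting y = 6x − 5.704 over the SSR curve does not give the tangent line shown.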
What is the equation of that tangent line, and how do I derive it?