*Though linear regression may seem somewhat dull compared to some of the more modern statistical learning approaches described in later chapters of this book, linear regression is still a useful and widely used statistical learning method. Moreover, it serves as a good jumping-off point for newer approaches: as we will see in later chapters, many fancy statistical learning approaches can be seen as generalizations or extensions of linear regression. Consequently, the importance of having a good understanding of linear regression before studying more complex learning methods cannot be overstated.*

- Gareth et al., *An Introduction to Statistical Learning*

*From an answer to one of my questions on Coursera's forums about the relevance of regression in modern data science.*


# Linear regression explained. With javascript code (Lineareg.js)

*Disclaimer: I don’t advise using JavaScript for data science. I do write/use learning libraries at times just for the fun of it, and of course because of Atwood’s law. This post is about the theoretical background for linear regression, not the JavaScript implementation.*

A while ago I wrote lineareg.js, a JavaScript library that lets you fit a line to a dataset. You can find the source code on GitHub or install it from npm. I realized I never got around to describing it, so here it is:

The crux of the code is in the cost computation.

The hypothesis is our prediction vector. For simple linear regression with an intercept θ₀ and a slope θ₁, the hypothesis for an input x is h(x) = θ₀ + θ₁x; applying it to every training input gives the vector of predictions.
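As a small sketch (the function name `hypothesis` is illustrative here, not necessarily lineareg.js’s actual API), computing the prediction vector looks like this:

```javascript
// Hypothesis for simple linear regression: h(x) = theta0 + theta1 * x.
// Maps an array of inputs to the corresponding prediction vector.
function hypothesis(theta0, theta1, xs) {
  return xs.map(function (x) {
    return theta0 + theta1 * x;
  });
}
```

For example, `hypothesis(1, 2, [0, 1, 2])` evaluates the line y = 1 + 2x at each input, giving `[1, 3, 5]`.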

The difference between the hypothesis and the observed value, h(xᵢ) − yᵢ, is the prediction error for each training example.

The cost function is the mean squared error over the m training examples: J(θ₀, θ₁) = (1/2m) Σᵢ (h(xᵢ) − yᵢ)². Squaring the errors penalizes large deviations, and the 1/2 factor makes the derivative cleaner.
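A minimal sketch of that cost computation (again with illustrative names, not the library’s exact code):

```javascript
// Mean squared error cost for simple linear regression:
// J(theta0, theta1) = (1 / (2m)) * sum_i (h(x_i) - y_i)^2
function cost(theta0, theta1, xs, ys) {
  var m = xs.length;
  var sum = 0;
  for (var i = 0; i < m; i++) {
    var diff = theta0 + theta1 * xs[i] - ys[i]; // prediction error
    sum += diff * diff;
  }
  return sum / (2 * m);
}
```

A line that fits the data perfectly has zero cost: `cost(0, 1, [1, 2, 3], [1, 2, 3])` returns `0`.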

Now we need to minimize this cost function, and for this we use gradient descent. To find a minimum, gradient descent takes a step in the direction of steepest negative gradient on every iteration, updating each parameter as θⱼ := θⱼ − α · ∂J/∂θⱼ, where α is the learning rate. The number of iterations and the learning rate are the knobs that control how long training runs and how large each step is.
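Putting the pieces together, batch gradient descent for the simple linear case can be sketched as follows (an illustrative implementation under the standard update rule, not lineareg.js’s exact code):

```javascript
// Batch gradient descent for simple linear regression.
// alpha is the learning rate; iterations is the number of update steps.
function gradientDescent(xs, ys, alpha, iterations) {
  var m = xs.length;
  var theta0 = 0;
  var theta1 = 0;
  for (var it = 0; it < iterations; it++) {
    var grad0 = 0;
    var grad1 = 0;
    for (var i = 0; i < m; i++) {
      var diff = theta0 + theta1 * xs[i] - ys[i]; // prediction error
      grad0 += diff;         // contributes to dJ/dtheta0
      grad1 += diff * xs[i]; // contributes to dJ/dtheta1
    }
    // Step against the gradient, averaged over the m examples.
    theta0 -= (alpha / m) * grad0;
    theta1 -= (alpha / m) * grad1;
  }
  return { theta0: theta0, theta1: theta1 };
}
```

On data drawn from y = 2x, e.g. `gradientDescent([0, 1, 2, 3], [0, 2, 4, 6], 0.1, 2000)`, the parameters converge toward θ₀ ≈ 0 and θ₁ ≈ 2. If α is too large the steps overshoot and the cost diverges; if it is too small, many more iterations are needed.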

You can find the source code on GitHub, or install it from npm:

```
npm install lineareg
```