12 Nov

Linear regression explained. With JavaScript code (lineareg.js)

Disclaimer: I don’t advise using JavaScript for data science. I do write/use learning libraries at times just for the fun of it, and of course there’s Atwood’s law. This post is about the theoretical background for linear regression, not the JavaScript implementation.

A while ago I wrote lineareg.js, a JavaScript library that lets you fit a line on a dataset. You can find the source code on GitHub or install it from npm. I realized I never got around to describing it, so here it is:

The crux of the code is in the cost computation.
The hypothesis h = \theta \cdot X is our prediction vector.
The difference D = h - y
The cost function J = \frac{1}{2m}\sum_{i=1}^{m}D_i^2
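The three steps above can be sketched in plain JavaScript. This is a minimal illustration, not lineareg.js's actual internals; the function names and the convention that each row of `X` starts with a 1 (the intercept term) are my assumptions.

```javascript
// h = theta . X : one prediction per row of X.
// Each row of X is assumed to include a leading 1 for the intercept.
function hypothesis(theta, X) {
  return X.map(row => row.reduce((sum, x, j) => sum + theta[j] * x, 0));
}

// J = (1 / 2m) * sum over i of D_i^2, where D = h - y
function cost(theta, X, y) {
  const m = y.length;
  const h = hypothesis(theta, X);          // prediction vector
  const D = h.map((hi, i) => hi - y[i]);   // difference h - y
  return D.reduce((sum, d) => sum + d * d, 0) / (2 * m);
}
```

With a perfect fit the differences are all zero, so the cost is zero; any prediction error makes it strictly positive.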

Now we need to minimize this cost function, and for that we use gradient descent. To find a local minimum, gradient descent takes a step in the direction of steepest negative gradient on every iteration. The number of iterations and the step size (the learning rate \alpha) are parameters you choose.
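A minimal sketch of that loop, using the standard batch update \theta_j := \theta_j - \frac{\alpha}{m}\sum_i D_i x_{ij}. Again, this is illustrative rather than lineareg.js's implementation; `alpha` and `iterations` are the parameters mentioned above.

```javascript
// Batch gradient descent for linear regression.
// X: rows of features (leading 1 for the intercept), y: targets.
function gradientDescent(X, y, alpha, iterations) {
  const m = y.length;
  let theta = new Array(X[0].length).fill(0);
  for (let it = 0; it < iterations; it++) {
    // Prediction vector h and difference D = h - y
    const h = X.map(row => row.reduce((s, x, j) => s + theta[j] * x, 0));
    const D = h.map((hi, i) => hi - y[i]);
    // Step against the gradient for every theta_j simultaneously
    theta = theta.map((t, j) =>
      t - (alpha / m) * D.reduce((s, d, i) => s + d * X[i][j], 0)
    );
  }
  return theta;
}
```

On a dataset that actually lies on a line, this converges to the line's intercept and slope given a small enough `alpha` and enough iterations.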

You can find the source code on GitHub.
Or install it from npm: `npm install lineareg`