
Polynomial Regression Application


So far in this regression chapter, we’ve covered linear regression. This assumes the relationship between an independent and dependent variable is a straight line, but what if it’s not? What if it’s a curve? In this lecture I set you the challenge of converting the linear regression application you just created into a polynomial regression application.

The equation for a curve is y = ax² + bx + c; this is a type of mathematical expression commonly called a polynomial.


y = ax + b is also a polynomial, but it’s typically just called a linear function because it doesn’t have any exponents like x², which turn it into a non-linear relationship.
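To make the two equations concrete, here is a small plain-JavaScript sketch that evaluates both forms. It is independent of TensorFlow.js, and the function names are illustrative rather than part of the sample project:

```javascript
// Evaluate y = ax² + bx + c for a single x value.
function polynomial(a, b, c, x) {
  return a * x * x + b * x + c;
}

// Evaluate y = ax + b, the linear special case.
function linear(a, b, x) {
  return a * x + b;
}

// With a = 0 the polynomial collapses to the linear function.
console.log(polynomial(0, 2, 1, 3)); // same as linear(2, 1, 3) → 7
console.log(polynomial(1, 2, 1, 3)); // 9 + 6 + 1 → 16
```

Setting a = 0 shows why the linear function is just a special case of the polynomial: the x² term vanishes.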

The Application

Similar to the linear regression application we built, we will create an app that allows you to create data points and tries to find the best-fit curve, as shown in the figure below.

Figure 1. The completed polynomial regression example application


Open the polynomial-regression folder in the samples project and run the index.html file as we have done in the previous lessons. The code is very similar in structure to the linear regression application except that ui.js has changed to work with curves rather than lines.


Open the start.js file; we are going to be editing this file to complete the application.

The start.js file looks very similar to the linear-regression application we just built. The challenge I set you now is to convert the linear-regression application into a polynomial-regression application all by yourself.


Scattered through the code are //TODO comment blocks; these are strong hints about where you should be looking.

So pause at this point, and please give it a go. Doing it yourself will be very rewarding and give you the energy needed to complete the rest of this course.


The first TODO is at the top of the file. We have tensor variables to hold the coefficients a and b, and we need another to hold c, like so:

const a = tf.variable(tf.scalar(A));
const b = tf.variable(tf.scalar(B));
const c = tf.variable(tf.scalar(C)); (1)
1 Add a tf.variable to hold c.

The rest of the code is almost the same as the linear regression example apart from two other locations; the first is in the loss function.

Previously the predictedYs measured the distance from the best-fit line; we need to change that line so it calculates the ys as if they came from a curve, like so:

const predictedYs = a.mul(actualXs.square())
  .add(b.mul(actualXs))
  .add(c);

This applies the equation of a curve y = ax² + bx + c across all x values in the tensor actualXs.

The predictedYs are then used in the mean square error loss calculation in the same way as they were in the linear-regression application, so there aren’t any code changes there.
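If you want to check the maths outside of TensorFlow.js, the same predict-then-score step can be sketched with plain arrays. This mirrors what the tensor code does, but the helper names here are illustrative only:

```javascript
// Predict ys from xs using y = ax² + bx + c.
function predictYs(a, b, c, xs) {
  return xs.map(x => a * x * x + b * x + c);
}

// Mean squared error between predicted and actual ys.
function meanSquaredError(predicted, actual) {
  const squaredDiffs = predicted.map((p, i) => (p - actual[i]) ** 2);
  return squaredDiffs.reduce((sum, d) => sum + d, 0) / squaredDiffs.length;
}

const xs = [0, 1, 2];
const actualYs = [1, 4, 9];               // generated from y = x² + 2x + 1
const predicted = predictYs(1, 2, 1, xs); // perfect coefficients
console.log(meanSquaredError(predicted, actualYs)); // → 0
```

With perfect coefficients the loss is zero; any other a, b, c produces a positive loss, which is exactly what the optimiser works to reduce.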

Finally, since we have another tensor variable c, we need to map its value to the UI variable C so the curve updates on the screen, like so:

A = a.dataSync()[0];
B = b.dataSync()[0];
C = c.dataSync()[0]; (1)
1 Extract the value from c and store in C.

Now if you run the application, it should draw the best-fit curve instead of a line. Try it out!


With polynomial regression, you can find the non-linear relationship between two variables. The only real difference between the linear regression application and the polynomial regression example is the definition of the loss function. Almost every other part of the application except the UI code is the same.

The loss function is core to machine learning. Picking the proper loss function, and understanding how to define your problem as a loss function is key to building a good machine learning model.
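As an aside, the idea of minimising a loss can be sketched without TensorFlow.js at all. Here is a hand-rolled gradient descent loop for the same curve, using manually derived gradients of the mean squared error; this is illustrative only, since in the sample project TensorFlow.js computes the gradients for you:

```javascript
// Fit y = ax² + bx + c by gradient descent on mean squared error.
function fitPolynomial(xs, ys, learningRate = 0.01, steps = 5000) {
  let a = 0, b = 0, c = 0;
  const n = xs.length;
  for (let step = 0; step < steps; step++) {
    let gradA = 0, gradB = 0, gradC = 0;
    for (let i = 0; i < n; i++) {
      const error = a * xs[i] ** 2 + b * xs[i] + c - ys[i];
      gradA += (2 / n) * error * xs[i] ** 2; // d(MSE)/da
      gradB += (2 / n) * error * xs[i];      // d(MSE)/db
      gradC += (2 / n) * error;              // d(MSE)/dc
    }
    a -= learningRate * gradA;
    b -= learningRate * gradB;
    c -= learningRate * gradC;
  }
  return { a, b, c };
}

// Data generated from y = x² + 2x + 1.
const { a, b, c } = fitPolynomial([-2, -1, 0, 1, 2], [1, 0, 1, 4, 9]);
console.log(a.toFixed(2), b.toFixed(2), c.toFixed(2)); // close to 1, 2, 1
```

Each step nudges the coefficients in the direction that reduces the loss; repeated enough times, they converge on the curve that generated the data. The optimiser in the sample application does the same thing, just with gradients computed automatically.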
