Machine Learning with Scikit-Learn Python | Polynomial Linear Regression


Hello people from the future, welcome to Normalized Nerd. In the last video I showed you how to implement simple linear regression in Python using scikit-learn, and today we are going to learn about polynomial linear regression. First I'm going to explain the difference between polynomial linear regression and plain linear regression, and then how it is implemented in scikit-learn. Okay, so in the case of linear regression we had a curve like this, whose equation was basically y_pred = w1*x1 + w2*x2 + b. You can see the powers of x1 and x2 are always 1; whenever that happens we say it is a linear regression. But suppose we have to fit a curve like this, which is not a straight line. To fit a curve like that to our data we need polynomial features of our original features. What does that mean? It simply means we have to add some polynomial terms of the original features. Suppose x1 is one original feature and x2 is another; to apply polynomial linear regression we add extra features like x1 squared, x2 squared, maybe x1 cubed, and so on. For illustration purposes I have taken just these four terms: x1 to the power 1, x1 to the power 2, x2 to the power 1, and x2 to the power 2, plus the bias term. Okay, so this is called polynomial regression just because it is a polynomial in the x_i. Now please notice that it is not a polynomial in w1, w2, and so on; it is still a simple linear function of the weights w_i, and that is why it is still a linear regression, namely a polynomial linear regression. You cannot say it is a true polynomial regression, because in that case the w_i themselves would appear in polynomial terms. So that was the clarification: in linear regression we had everything to the power 1, but in polynomial linear regression we have polynomial terms of our features and linear terms of our weights.
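(A minimal sketch of the four-term model just described, with the weights and bias written out; the symbols follow the transcript, not an on-screen slide.)

```latex
\hat{y} = w_1 x_1 + w_2 x_1^2 + w_3 x_2 + w_4 x_2^2 + b
```

Every x appears with some power, but each weight w_i appears only to the first power, which is exactly why the model is still linear in the weights.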

So suppose this axis is our y and this axis is our x. If we fit this model to our data set we will get a curve like this; please notice that it is not a straight line, it is something like a cubic curve. Okay, so that was all for the conceptual part, and now I am going to jump into the code section. For the demonstration I am using the same data set as in the past two videos, and the link to the data set will be provided in the description of this video. First of all I need to import the three libraries that we are going to use. Okay, so the libraries are imported. Now I have to import the data set that is stored in the data.csv file, and after executing that you can see we have our data frame. It contains five columns: the first four columns are our features and the last column is the target variable that we are going to predict. For the polynomial linear regression I am using only the first two features, so after executing this line I will get my matrix of features X as the first two columns, and y will be a vector containing all the values of the last column, that is, the output variable we are going to predict. Similarly to what I have done in the past videos, I also have to do standard scaling here, so I am executing the standard scaling step. After doing that our features will be scaled and will come into a small range, in this case roughly -2 to +2; you can see all the values lie between -2 and +2. After that, as usual, I have to split our data set into two portions, a training set and a test set, and here I have used a test size of 0.2, which simply means that 20 percent of the data will belong to the test set. Okay, so let me just execute this line, and you can see that we have our arrays X_train, X_test, y_train, and y_test.
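Here is a minimal sketch of those setup steps, assuming the file is called data.csv, its first two columns are the features used here, and its last column is the target; the column layout and variable names are my assumptions from the walkthrough, not shown explicitly in the video.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

df = pd.read_csv("data.csv")

X = df.iloc[:, :2].values   # first two columns as the matrix of features
y = df.iloc[:, -1].values   # last column as the target we want to predict

# Standard scaling: each feature gets zero mean and unit variance,
# so the values end up in a small range (roughly -2 to +2 here).
scaler = StandardScaler()
X = scaler.fit_transform(X)

# Hold out 20% of the data as the test set.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
```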

Now we have to introduce polynomial features to these, and to do that we are using scikit-learn again: from the preprocessing module I am importing the PolynomialFeatures class. Okay, so by executing this line the class is imported into my program, and after that I am declaring an object of this class, which I am calling poly; this is PolynomialFeatures with the degree set to 3. That simply means it will generate, from our original features, all the terms of degree 3 or lower. What does that mean? Well, let me show you something here. Suppose x1 and x2 are our original features; after a degree-3 polynomial fit we will have features like these. In the first row are the cubic terms, for example x2 to the power 3 and x1 times x2 squared; you can see that in this row the powers always add up to 3. In the next row all the powers add up to 2: x1 squared, x2 squared, and x1 times x2 (1 + 1 = 2). In the next row the powers are 1, so just x1 and x2. And lastly we have x1 to the power 0 times x2 to the power 0, which is just 1. So how many features are we going to have after this fit? We get 4 cubic features, 3 squared features, 2 features to the power 1, and 1 constant feature (because x1 to the 0 times x2 to the 0 is simply 1), so we are going to have 10 features in total. To generate them we just use the fit_transform method, and we store the result in X_poly; X_poly will be our modified set of features. So let me just execute these lines of code, and after that you can see that X_poly is our new matrix containing all the modified features, and indeed we have 10 columns here: this one corresponds to x1 to the 0 times x2 to the 0, this corresponds to x1, this is x2, these are probably x1 squared, x2 squared, and so on and so forth, and at the end come the cubic terms such as x1 to the power 3 and x2 to the power 3. Okay, so we are good with our polynomial features. Now we need to do the regression, and the regression part is exactly the same as plain linear regression, so I am importing the LinearRegression class from the linear_model module.
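As a rough sketch of that step, continuing with the X_train from the earlier snippet (the variable names are my assumptions):

```python
from sklearn.preprocessing import PolynomialFeatures

poly = PolynomialFeatures(degree=3)    # every term of degree 3 or lower
X_poly = poly.fit_transform(X_train)   # 10 columns when there are 2 original features

# With two inputs the generated columns are:
# 1, x1, x2, x1^2, x1*x2, x2^2, x1^3, x1^2*x2, x1*x2^2, x2^3
print(X_poly.shape)
```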

Okay, so let me just run this code. After that I am defining an object of the LinearRegression class; I explained this in the previous video on simple linear regression, so it is the same thing and not a very tough one. Okay, so it is done. And lastly, as we have done before, we have to fit our model to our training data. Please remember that we are going to use X_poly here, because I have stored the modified features in X_poly. In the previous video I showed you that for simple linear regression we were using X_train, but here X_train holds the original features and X_poly holds the modified polynomial features, so we have to pass X_poly here, and obviously y_train, because we are not modifying our output variable, the y variable. After doing that I am executing this line of code, so our model is fit. And lastly we just need to make our predictions, and I am going to store the predicted values in y_pred. Now let me look at y_pred and y_test: you can see the values are pretty close. One thing I should tell you is that this polynomial linear regression may not be the best model for this data; I have just used this data to demonstrate polynomial linear regression, and I'm pretty sure there are other algorithms that will perform better on it. So that was all for this video. I hope you really enjoyed it and learned about polynomial linear regression. Please like this video, share it, share your comments, and don't forget to subscribe to Normalized Nerd. Thanks for watching.
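For completeness, here is a rough sketch of the fit-and-predict step described above, reusing poly, X_poly, X_test, y_train, and y_test from the earlier sketches; transforming the test set with the same poly object is my addition, since the predictions need the same polynomial columns the model was trained on.

```python
from sklearn.linear_model import LinearRegression

regressor = LinearRegression()
regressor.fit(X_poly, y_train)        # fit on the polynomial features, not X_train

# The test set must go through the same polynomial transformation
# so its columns line up with what the model was trained on.
X_test_poly = poly.transform(X_test)
y_pred = regressor.predict(X_test_poly)

print(y_pred[:5])   # predicted values
print(y_test[:5])   # actual values, for a quick side-by-side comparison
```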