Linear Regression

Linear regression is a method for modeling the relationship between one or more independent variables and a dependent variable. It is a staple of statistics and is often considered a good introductory machine learning method. In this tutorial, you will discover the matrix formulation of linear regression and how to solve it using direct and matrix factorization methods. After completing this tutorial, you will know:

  • Linear regression and the matrix reformulation with the normal equations.
  • How to solve linear regression using a QR matrix decomposition.
  • How to solve linear regression using SVD and the pseudoinverse.

1.1 Tutorial Overview

This tutorial is divided into 7 parts; they are:

1. What is Linear Regression
2. Matrix Formulation of Linear Regression
3. Linear Regression Dataset
4. Solve via Inverse
5. Solve via QR Decomposition
6. Solve via SVD and Pseudoinverse
7. Solve via Convenience Function

1.2 What is Linear Regression

Linear regression is a method for modeling the relationship between scalar values: the input variable x and the output variable y. The model assumes that y is a linear function, or a weighted sum, of the input variables.

                                                                y = f(x)

Or, stated with the coefficients:

                                                y = b_{0} + b_{1} \times x_{1}

The model can also be used to model an output variable given multiple input variables, called multivariate linear regression:

                                        y = b_{0} + (b_{1}\times x_{1})+ (b_{2}\times x_{2}) + \cdots

The objective of creating a linear regression model is to find the values of the coefficients (b) that minimize the error in the prediction of the output variable y.
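As a concrete illustration of the weighted sum, the following minimal sketch computes a single prediction from two inputs; all coefficient and input values here are made up for illustration and are not from the tutorial's dataset:

# Sketch: a prediction as a weighted sum of two inputs
# hypothetical coefficients
b0, b1, b2 = 0.1, 0.5, 0.25
# hypothetical input values
x1, x2 = 2.0, 4.0
# weighted sum with a bias term b0
y = b0 + (b1 * x1) + (b2 * x2)
print(y)  # prints 2.1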

1.3 Matrix Formulation of Linear Regression

The linear regression problem can be restated using matrix notation. For a dataset of examples, the inputs can be stacked into a matrix X (one row per example) and the outputs into a vector y, giving the linear equation:

                                                                y = X \cdot b

This can be reformulated as a minimization problem: find the b that minimizes the squared error \| X \cdot b - y \|^{2}. This formulation is called linear least squares, and its solution is given directly by the normal equations:

                                                b = (X^{T} \cdot X)^{-1} \cdot X^{T} \cdot y

The sections that follow solve this equation directly and via matrix factorizations.

1.4 Linear Regression Dataset

# Example of a simple linear regression dataset
# linear regression dataset
from numpy import array
from matplotlib import pyplot
# define dataset
data = array([
    [0.05, 0.12],
    [0.18, 0.22],
    [0.31, 0.35],
    [0.42, 0.38],
    [0.5, 0.49]
])
print(data)
# split into inputs and outputs
X,y = data[:,0],data[:,1]
X = X.reshape(len(X),1)
# scatter plot
pyplot.scatter(X, y)
pyplot.show()

Running the example first prints the defined dataset, then creates a scatter plot showing the increasing linear relationship between the inputs and outputs.
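The printed NumPy array should look like the following:

[[0.05 0.12]
 [0.18 0.22]
 [0.31 0.35]
 [0.42 0.38]
 [0.5  0.49]]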

1.5 Solve via Inverse

The first approach is to attempt to solve the regression problem directly using the matrix inverse. That is, given X, what is the set of coefficients b that, when multiplied by X, will give y. As we saw in a previous section, the normal equations define how to calculate b directly:

                                                b = (X^{T} \cdot X)^{-1} \cdot X^{T} \cdot y

This can be calculated directly in NumPy using the inv() function. The complete example is listed below.

# Example of calculating a linear regression solution directly
# direct solution to linear least squares
from numpy import array
from numpy.linalg import inv
from matplotlib import pyplot
# define dataset
data = array([
    [0.05, 0.12],
    [0.18, 0.22],
    [0.31, 0.35],
    [0.42, 0.38],
    [0.5, 0.49]])

# split into inputs and outputs
X,y = data[:,0],data[:,1]
X = X.reshape((len(X), 1))
# linear least squares via the normal equations
b = inv(X.T.dot(X)).dot(X.T).dot(y)
print(b)

# predict using coefficients
yhat = X.dot(b)
# plot data and predictions
pyplot.scatter(X, y)
pyplot.plot(X, yhat, color='red')
pyplot.show()

Running the example performs the calculation and prints the coefficient vector b.

 [1.00233226]

A scatter plot of the dataset is then created with a line plot for the model, showing a reasonable fit to the data.

A problem with this approach is the matrix inverse, which is both computationally expensive and numerically unstable. An alternative approach is to use a matrix decomposition to avoid this operation. We will look at two examples in the following sections.

1.6 Solve via QR Decomposition

The QR decomposition avoids computing (X^{T} \cdot X)^{-1} directly. First, X is factorized into an orthogonal matrix Q and an upper triangular matrix R:

                                                                X = Q \cdot R

Substituting into the normal equations and simplifying (using Q^{T} \cdot Q = I) gives the coefficients as:

                                                b = R^{-1} \cdot Q^{T} \cdot y

This can be calculated in NumPy using the qr() function, as in the snippet below.

# Core of the QR approach (assumes X, y, and inv are already defined/imported)
# QR decomposition
from numpy.linalg import qr
Q,R = qr(X)
b = inv(R).dot(Q.T).dot(y)

The complete example is listed below.
# Example of calculating a linear regression solution using a QR decomposition
# QR decomposition solution to linear least squares
from numpy import array
from numpy.linalg import inv
from numpy.linalg import qr
from matplotlib import pyplot
# define dataset
data = array([
    [0.05, 0.12],
    [0.18, 0.22],
    [0.31, 0.35],
    [0.42, 0.38],
    [0.5, 0.49]
])

# split into inputs and outputs
X,y = data[:,0],data[:,1]
X = X.reshape((len(X),1))

# factorize
Q,R = qr(X)
b = inv(R).dot(Q.T).dot(y)
print(b)
# predict using coefficients
yhat = X.dot(b)
# plot data and predictions
pyplot.scatter(X,y)
pyplot.plot(X,yhat,color='red')
pyplot.show()

The QR decomposition approach is more computationally efficient and more numerically stable than calculating the normal equation directly, but it does not work for all data matrices (R must be invertible, which requires the columns of X to be linearly independent).

1.7 Solve via SVD and Pseudoinverse

A further approach, which is more numerically stable and works for a wider range of matrices, is to solve via the singular value decomposition (SVD) and the pseudoinverse. The coefficients are found as:

                                                                b = X^{+} \cdot y

Where X^{+} is the pseudoinverse of X and the + is a superscript. Given the SVD X = U \cdot \Sigma \cdot V^{T}, the pseudoinverse is:

                                                X^{+} = V \cdot D^{+} \cdot U^{T}

Where D^{+} is the pseudoinverse of the diagonal matrix \Sigma and V^{T} is the transpose of V. NumPy provides the function pinv() to calculate the pseudoinverse directly. The complete example is listed below.

# Example of calculating a linear regression solution using an SVD
# SVD solution via pseudoinverse to linear least squares
from numpy import array
from numpy.linalg import pinv
from matplotlib import pyplot
# define dataset
data = array([
    [0.05, 0.12],
    [0.18, 0.22],
    [0.31, 0.35],
    [0.42, 0.38],
    [0.5, 0.49]
])

# split into inputs and outputs
X,y = data[:,0],data[:,1]
X = X.reshape((len(X),1))
# calculate coefficients
b = pinv(X).dot(y)
print(b)

# predict using coefficients
yhat = X.dot(b)
# plot data and predictions
pyplot.scatter(X,y)
pyplot.plot(X, yhat,color='red')
pyplot.show()
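To make the formula above concrete, the pseudoinverse can also be assembled manually from the SVD. The following is a minimal sketch of that reconstruction (the intermediate variable names are illustrative); it should produce the same coefficient as pinv():

# Manually build the pseudoinverse from the SVD (sketch)
from numpy import array, diag, zeros
from numpy.linalg import svd
# define dataset
data = array([
    [0.05, 0.12],
    [0.18, 0.22],
    [0.31, 0.35],
    [0.42, 0.38],
    [0.5, 0.49]])
# split into inputs and outputs
X,y = data[:,0],data[:,1]
X = X.reshape((len(X),1))
# factorize: X = U . Sigma . V^T
U, s, VT = svd(X)
# build D^+ by inverting the non-zero singular values and transposing
Dplus = zeros(X.shape).T
Dplus[:len(s), :len(s)] = diag(1.0 / s)
# pseudoinverse: X^+ = V . D^+ . U^T
Xplus = VT.T.dot(Dplus).dot(U.T)
# calculate coefficients
b = Xplus.dot(y)
print(b)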

1.8 Solve via Convenience Function

NumPy provides a convenience function, lstsq(), that solves the linear least squares problem directly. The complete example is listed below.

# Example of calculating a linear regression solution with a convenience function
# least squares via the lstsq() convenience function
from numpy import array
from numpy.linalg import lstsq
from matplotlib import pyplot
# define dataset
data = array([
    [0.05, 0.12],
    [0.18, 0.22],
    [0.31, 0.35],
    [0.42, 0.38],
    [0.5, 0.49]
])

# split into inputs and outputs
X,y = data[:,0],data[:,1]
X = X.reshape((len(X),1))
# calculate coefficients (rcond=None avoids a FutureWarning in newer NumPy)
b,residuals,rank,s = lstsq(X,y,rcond=None)
print(b)
# predict using coefficients
yhat = X.dot(b)
# plot data and predictions
pyplot.scatter(X,y)
pyplot.plot(X,yhat,color='red')
pyplot.show()
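Running the example prints the coefficient, which should match the value found by the previous methods, then plots the data with the fitted line.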
