# Linear regression with multiple variables

## 1. Problem

Suppose you are selling your house and you want to know what a good market price would be. One way to do this is to first collect information on recent houses sold and make a model of housing prices.

The file ex1data2.txt contains a training set of housing prices in Portland, Oregon. The first column is the size of the house (in square feet), the second column is the number of bedrooms, and the third column is the price of the house.

## 2. Solving with Gradient Descent

### Step 1: Feature Normalization

By looking at the values in ex1data2.txt, note that house sizes are about 1000 times the number of bedrooms. When features differ by orders of magnitude, performing feature scaling first can make gradient descent converge much more quickly.

The z-score standardization method is suitable when the maximum and minimum values of a feature are unknown, or when the data contain outliers beyond the expected range.

normalized value = (original value - mean) / standard deviation

The MATLAB code is as follows:

```matlab
function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X
%   FEATURENORMALIZE(X) returns a normalized version of X where
%   the mean value of each feature is 0 and the standard deviation
%   is 1. This is often a good preprocessing step to do when
%   working with learning algorithms.

% You need to set these values correctly
X_norm = X;
mu = zeros(1, size(X, 2));      % mean of each feature; size(X, 2) is the number of columns
sigma = zeros(1, size(X, 2));   % standard deviation of each feature

% ====================== YOUR CODE HERE ======================
% Instructions: First, for each feature dimension, compute the mean
%               of the feature and subtract it from the dataset,
%               storing the mean value in mu. Next, compute the
%               standard deviation of each feature and divide
%               each feature by its standard deviation, storing
%               the standard deviation in sigma.
%
%               Note that X is a matrix where each column is a
%               feature and each row is an example. You need
%               to perform the normalization separately for
%               each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.
%
mu = mean(X);       % mean value
sigma = std(X);     % standard deviation
X_norm = (X - repmat(mu, size(X, 1), 1)) ./ repmat(sigma, size(X, 1), 1);  % normalized = (original - mean) / std

end
```
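As a quick sanity check, normalizing a small matrix should leave each column with mean 0 and standard deviation 1. A minimal sketch, using a hypothetical three-example dataset (not taken from ex1data2.txt):

```matlab
% Hypothetical mini dataset: [size in sq ft, number of bedrooms]
X = [2104 3; 1600 3; 2400 4];

[X_norm, mu, sigma] = featureNormalize(X);

disp(mu);            % per-column means of the raw data
disp(sigma);         % per-column standard deviations of the raw data
disp(mean(X_norm));  % should be approximately [0 0]
disp(std(X_norm));   % should be approximately [1 1]
```

The returned mu and sigma must be saved: any new example you predict on later has to be normalized with the same mean and standard deviation as the training set.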

### Step 2: Gradient Descent

With the features normalized, run gradient descent. The vectorized update for multivariate linear regression is

theta = theta - (alpha / m) * X' * (X * theta - y)

which is implemented inside the loop of gradientDescentMulti:

```matlab
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
%   GRADIENTDESCENTMULTI(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCostMulti) and gradient here.
    %
    theta = theta - alpha / m * X' * (X * theta - y);

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCostMulti(X, y, theta);

end

end
```
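Putting the two steps together, a typical driver script looks like the sketch below (alpha = 0.01 and num_iters = 400 are the exercise defaults; computeCostMulti is assumed to be the cost function provided with the exercise):

```matlab
% Load the training set: [size, bedrooms, price]
data = load('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

[X, mu, sigma] = featureNormalize(X);  % scale features first
X = [ones(m, 1) X];                    % add the intercept column

alpha = 0.01;
num_iters = 400;
theta = zeros(3, 1);
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);

plot(1:num_iters, J_history);          % J should decrease monotonically
```

Plotting J_history is the standard convergence check: if the cost ever increases, the learning rate alpha is too large.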

## 3. Solving with the Normal Equation

### Normal Equations

The closed-form solution to linear regression is:

theta = (X' * X)^(-1) * X' * y

Using this formula does not require any feature scaling, and you will get an exact solution in one calculation: there is no "loop until convergence" as in gradient descent.

The MATLAB code is as follows:

```matlab
function [theta] = normalEqn(X, y)
%NORMALEQN Computes the closed-form solution to linear regression
%   NORMALEQN(X,y) computes the closed-form solution to linear
%   regression using the normal equations.

theta = zeros(size(X, 2), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the code to compute the closed form solution
%               to linear regression and put the result in theta.
%

% ---------------------- Sample Solution ----------------------

theta = pinv( X' * X ) * X' * y;

end
```
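Because the normal equation needs no feature scaling, it can be run directly on the raw data; only the intercept column has to be added. A sketch, using the exercise's standard test case of a 1650 sq-ft, 3-bedroom house:

```matlab
% Solve the same dataset in closed form (no normalization needed)
data = load('ex1data2.txt');
X = [ones(size(data, 1), 1), data(:, 1:2)];  % intercept + raw features
y = data(:, 3);

theta = normalEqn(X, y);

% Predict the price of a 1650 sq-ft, 3-bedroom house
price = [1, 1650, 3] * theta;
```

Note the use of pinv rather than inv in normalEqn: the pseudoinverse still returns a sensible theta even when X' * X is singular (e.g., when features are linearly dependent).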
