【Deep Learning】Univariate Linear Regression Algorithm

I. Problem Description

Analyze a linear regression problem with a single input variable: predicting food-truck profit from city population.

II. Overview

1. Hypothesis function
$$h_{\theta}(x) = \theta_{0} + \theta_{1}x$$

2. Cost function
$$J(\theta_{0},\theta_{1}) = \frac{1}{2m}\sum_{i=1}^{m}\left(h_{\theta}(x^{(i)}) - y^{(i)}\right)^{2}$$
The 1/2 factor is conventional: it cancels the 2 produced by differentiating the square, leaving a clean gradient.
3. Training method: gradient descent
$$\theta_{j} := \theta_{j} - \alpha \frac{\partial}{\partial \theta_{j}} J(\theta_{0},\theta_{1}) = \theta_{j} - \alpha\frac{1}{m}\sum_{i=1}^{m}\left(h_{\theta}(x^{(i)}) - y^{(i)}\right)x_{j}^{(i)}$$
(all $\theta_{j}$ are updated simultaneously; here $x_{0}^{(i)} = 1$)
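In code, this update is a single vectorized line. A minimal MATLAB sketch, assuming x already carries a leading column of ones and theta is a 2x1 vector (the same conventions as the code below):

% One gradient-descent step, vectorized over all m examples
h = x * theta;                               % predictions for every example
theta = theta - alpha / m * (x' * (h - y)); % simultaneous update of theta_0 and theta_1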

III. Code Implementation (.m)

1. Main file
ex1.m

clear; close all; clc
%% ================ Part 1: Graphic Display 1 ================
% Load Data
data = load('files\ex1data1.txt');
X = data(:,1); 
y = data(:,2);
% Plot Training set data
subplot(2,2,1); plot(X,y,'rx','MarkerSize',10);
xlabel('Population of City (10,000s)'); ylabel('Profit of Food Trucks ($10,000s)');
%% ================ Part 2: Gradient Descent ================
% Choose some alpha value
alpha = 0.01;
num_iteration = 1500;
m = length(y);
% Init Theta and Run Gradient Descent 
x = [ones(m,1),X];
theta = zeros(2,1);
[theta_group, J_group] = gradientDescent(x,y,theta,alpha,num_iteration);
% Predictions of the trained hypothesis
h = x*theta_group(:,num_iteration);
hold on;
plot(x(:,2),h,'b','LineWidth',2); 
legend('Training set','Linear Regression','Location','SouthEast');
%% ================ Part 3: Cost Grid Computation ================
% Compute the cost over a grid of (theta_0, theta_1) values
theta0_vals = linspace(-10,10,100); 
theta1_vals = linspace(-1,4,100);
J_vals = zeros(length(theta0_vals),length(theta1_vals));
for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
	  t = [theta0_vals(i);theta1_vals(j)];    
	  J_vals(i,j) = computeCost(x, y, t);
    end
end
% Transpose J_vals so the surf/contour axes match theta_0 (x) and theta_1 (y)
J_vals = J_vals';
%% ================ Part 4: Graphic Display 2 ================
hold off;
subplot(2,2,2); surf(theta0_vals,theta1_vals,J_vals);
xlabel('\theta_0'); ylabel('\theta_1');title('3D surface');
subplot(2,2,3); contour(theta0_vals,theta1_vals,J_vals,logspace(-2, 3, 20));
xlabel('\theta_0'); ylabel('\theta_1');title('Contour map');
hold on;
plot(theta_group(1,num_iteration),theta_group(2,num_iteration),'rx','MarkerSize',10,'LineWidth', 2);
plot(theta_group(1,:),theta_group(2,:),'k.');
subplot(2,2,4);plot((1:num_iteration),J_group,'b','LineWidth',1.5);
xlabel('Training times'); ylabel('Loss');
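Because this problem is tiny, the closed-form normal equation provides a handy cross-check on the gradient-descent result. A minimal sketch, reusing x, y, theta_group, and num_iteration from ex1.m (pinv guards against a singular x'*x):

% Closed-form least-squares solution: theta = (x'*x)^(-1) * x'*y
theta_exact = pinv(x' * x) * (x' * y);
% Compare with the final gradient-descent estimate; the columns should nearly match
disp([theta_exact, theta_group(:,num_iteration)]);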

2. Helper functions
computeCost.m

function J = computeCost(x,y,theta)
% Number of examples
m = length(y);
% Hypothesis
h = x * theta;
% Squared-error cost
J = (h - y)' * (h - y) / (2*m);
end
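A quick sanity check, assuming x and y were built as in ex1.m: the cost at theta = [0; 0] on ex1data1.txt should come out to roughly 32.07.

% Cost at the all-zeros initialization
J0 = computeCost(x, y, zeros(2,1));
fprintf('Cost at theta = [0;0]: %.2f\n', J0);   % expected: about 32.07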

gradientDescent.m

function [theta_group,J_group] = gradientDescent(x,y,theta,alpha,num_iteration)
%Number of Features & Examples
num_feature = size(x,2);
m = length(y);
%Define theta_group & J_group
theta_group = zeros(num_feature,num_iteration);
J_group = zeros(num_iteration,1);
%Init theta_group & J_group
theta_group(:,1) = theta;
J_group(1) = computeCost(x,y,theta);

for i = 2:num_iteration
    % Hypothesis using the previous iteration's parameters
    h = x * theta_group(:,i-1);
    % Gradient-descent update (h is fixed, so all theta_j update simultaneously)
    for j = 1:num_feature
        theta_group(j,i) = theta_group(j,i-1) - alpha * (h - y)' * x(:,j) / m;
    end
    % Record the cost at the new parameters
    J_group(i) = computeCost(x,y,theta_group(:,i));
end
end
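The inner loop over features can be collapsed into one vectorized line; a sketch of the equivalent update, using the same variables as above:

% Vectorized equivalent of the j-loop: update every parameter at once
theta_group(:,i) = theta_group(:,i-1) - alpha / m * (x' * (h - y));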

3. Run results
(Figure: the four subplots produced by ex1.m: training data with the fitted line, the 3-D cost surface, the contour map with the descent trajectory, and the loss curve over iterations.)

IV. Code Download

Link: https://pan.baidu.com/s/1fguoYy2o1j4JXykz55NCgg
Extraction code: 27eb
