Introduction to Linear Regression with Gradient Descent

Linear regression is one of the most popular supervised machine learning algorithms. It predicts values within a continuous range (e.g. sale prices, life expectancy, temperature) instead of classifying them into categories (e.g. car, bus, bike). The main goal of linear regression is to find the best-fitting line that explains the relationship in the data.

Fig. 1: Linear regression simulation.

The best-fitting line (or regression line) is represented by:

y = mx + b

where m is the slope of the line, also known as the angular gradient, and b is the point at which the line crosses the y-axis (the intercept), also known as the linear gradient.

The main challenge in finding the regression line is to determine the values of m and b such that the corresponding line is the best-fitting one, that is, the line that produces the minimum error.
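
To make "minimum error" concrete, here is a minimal Python sketch, using a small made-up dataset, that scores two candidate (m, b) pairs by their mean squared error:

```python
# Minimal sketch with assumed toy data: compare the mean squared error of two
# candidate lines y = m*x + b to see what "minimum error" means in practice.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]          # roughly follows y = 2x

def mse(m, b):
    """Mean squared error of the line y = m*x + b on the toy data."""
    errors = [(yi - (m * xi + b)) ** 2 for xi, yi in zip(x, y)]
    return sum(errors) / len(errors)

print(mse(2.0, 0.0))   # close to the underlying trend -> small error
print(mse(1.0, 1.0))   # poorer fit -> larger error
```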

How to find the best m and b values?

A possible alternative would be to use a method like Ordinary Least Squares (OLS), which is an analytical, non-iterative solution. OLS is a type of linear least squares method for estimating the unknown parameters. The OLS estimator is defined by:

m = Σ(xᵢ − x̄)(yᵢ − ȳ) / Σ(xᵢ − x̄)²
b = ȳ − m·x̄

where x is the independent variable, x̄ is the mean of the independent variable, y is the dependent variable, and ȳ is the mean of the dependent variable.
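
As a small illustration, the closed-form OLS estimates can be computed directly, for example with NumPy; the toy data below is an assumption made up for the example, the same as in the earlier sketch:

```python
import numpy as np

# Toy data (assumed for illustration).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Closed-form OLS estimates for simple linear regression:
# m = sum((x - x̄)(y - ȳ)) / sum((x - x̄)^2),  b = ȳ - m * x̄
x_mean, y_mean = x.mean(), y.mean()
m = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
b = y_mean - m * x_mean

print(f"m = {m:.3f}, b = {b:.3f}")  # best-fitting line in one shot, no iteration
```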

OLS can be a good option for solving the linear regression problem, because the model is linear in its coefficients and the estimates come from solving linear equations. Nevertheless, applying OLS to complex, non-linear machine learning algorithms such as neural networks or support vector machines is not feasible. This is because the OLS solution isn't scalable.

Instead of OLS, we will find a numerical approximation using an iterative method. Gradient descent is one of the best optimisation algorithms: rather than computing an exact analytical solution, it approximates a solution through an iterative procedure that can efficiently explore the parameter space.
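
Before unpacking the concept in the next section, here is a minimal sketch of what that iterative procedure can look like for simple linear regression. The toy data, learning rate, and number of iterations are illustrative assumptions, not values from the article:

```python
import numpy as np

# Toy data (assumed for illustration).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

m, b = 0.0, 0.0          # start from an arbitrary line
learning_rate = 0.01     # step size (illustrative choice)
n = len(x)

for _ in range(2000):
    y_pred = m * x + b
    # Gradients of the mean squared error with respect to m and b.
    dm = (-2.0 / n) * np.sum(x * (y - y_pred))
    db = (-2.0 / n) * np.sum(y - y_pred)
    # Move m and b a small step against the gradient.
    m -= learning_rate * dm
    b -= learning_rate * db

print(f"m ≈ {m:.3f}, b ≈ {b:.3f}")  # should approach the OLS estimates above
```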

Understanding the Gradient Descent Concept
