Kernel Regression from Scratch in Python

Every beginner in Machine Learning starts by studying what regression means and how the linear regression algorithm works. In fact, the ease of understanding, the explainability and the vast number of effective real-world use cases of linear regression are what make the algorithm so popular. However, there are some situations to which linear regression is not suited. In this article, we will see what these situations are, what the kernel regression algorithm is and how it fits into the picture. Finally, we will code the kernel regression algorithm with a Gaussian kernel from scratch. Basic knowledge of Python and numpy is required to follow the article.


Brief Recap on Linear Regression

Given data in the form of N feature vectors x = [x₁, x₂, …, xₙ], each consisting of n features, and the corresponding label vector y, linear regression tries to fit a line that best describes the data. For this, it tries to find the optimal coefficients cᵢ, i ∈ {0, …, n}, of the line equation y = c₀ + c₁x₁ + c₂x₂ + … + cₙxₙ, usually by gradient descent, with the model accuracy measured by the RMSE metric. The equation obtained is then used to predict the target yₜ for a new, unseen input vector xₜ.
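
As a quick illustration of this recap, here is a minimal numpy sketch (my own, not from the original article) that fits the coefficients c₀, …, cₙ by batch gradient descent and measures the fit with RMSE. The learning rate and iteration count are arbitrary choices made for the toy data below.

```python
import numpy as np

def fit_linear_regression(X, y, lr=0.01, n_iters=5000):
    """Fit y ≈ c0 + c1*x1 + ... + cn*xn by batch gradient descent."""
    N, n = X.shape
    Xb = np.hstack([np.ones((N, 1)), X])    # prepend a column of 1s for the intercept c0
    c = np.zeros(n + 1)                     # coefficients [c0, c1, ..., cn]
    for _ in range(n_iters):
        residual = Xb @ c - y
        grad = 2.0 / N * Xb.T @ residual    # gradient of the mean squared error
        c -= lr * grad
    return c

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

# Toy usage on data that really is linear
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 2))
y = 3 + 2 * X[:, 0] - X[:, 1] + rng.normal(0, 0.1, size=100)

c = fit_linear_regression(X, y)
Xb = np.hstack([np.ones((len(X), 1)), X])
print("coefficients:", np.round(c, 2))      # close to the true values [3, 2, -1]
print("RMSE:", rmse(y, Xb @ c))
```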


Linear regression is a simple algorithm that cannot model very complex relationships between the features. Mathematically, this is because it is linear: the degree of the equation is 1, which means that linear regression will always model a straight line. Indeed, this linearity is the weakness of the linear regression algorithm. Why?


Well, let’s consider a situation where our data doesn’t have the form of a straight line: let’s take data generated using the function f(x) = x³. If we use linear regression to fit a model to this data, we will never get anywhere close to the true cubic function because the equation for which we are finding the coefficients does not have a cubic term! So, for any data not generated using a linear function, linear regression is very likely to underfit. So, what do we do?
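
To make the underfitting concrete, here is a small sketch (again my own illustration, not from the original article) that samples data from f(x) = x³ and fits a straight line to it with numpy's polyfit. The residual error stays large because no degree-1 model can follow the cubic curve.

```python
import numpy as np

# Data generated from the cubic function f(x) = x^3 with a little noise
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
y = x ** 3 + rng.normal(0, 1, size=x.shape)

# Fit a straight line (degree-1 polynomial): y ≈ c1*x + c0
c1, c0 = np.polyfit(x, y, deg=1)
y_line = c1 * x + c0

rmse_line = np.sqrt(np.mean((y - y_line) ** 2))
print(f"linear fit: y = {c1:.2f}x + {c0:.2f}, RMSE = {rmse_line:.2f}")
# The RMSE stays large no matter how the line is chosen, because no
# straight line can follow the S-shaped curve of x^3: the model underfits.
```

Plotting y_line against the data would show the straight line slicing through the S-shaped cloud of points, which is the visual signature of underfitting.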


We can use another type of regression called polynomial regression.
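
As a hedged sketch of the idea this sentence introduces (my own continuation, not the author's code): polynomial regression adds higher powers of x as extra terms of the equation, so a degree-3 fit can capture the cubic data from the previous example where the straight line could not.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
y = x ** 3 + rng.normal(0, 1, size=x.shape)

# Polynomial regression: fit y ≈ c3*x^3 + c2*x^2 + c1*x + c0
coeffs = np.polyfit(x, y, deg=3)
y_cubic = np.polyval(coeffs, x)

rmse_cubic = np.sqrt(np.mean((y - y_cubic) ** 2))
print("cubic fit coefficients:", np.round(coeffs, 2))
print(f"RMSE = {rmse_cubic:.2f}")  # close to the noise level, unlike the linear fit
```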
