【Machine Learning 01】A Summary of Linear and Logistic Regression

Based on Andrew Ng's machine learning course and the exp_1 and exp_2 exercises, this post summarizes the basic steps, key algorithms, and Python implementations of linear regression and logistic regression.

1. Loading and Visualizing the Data

Read the data with pd.read_csv(), draw scatter plots with plt.scatter(), and draw lines with plt.plot(), as in the sketch below.
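A minimal sketch of this step, assuming a headerless two-column data file (the file name ex1data1.txt and the column names population and profit are hypothetical placeholders, not fixed by the exercises):

import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical file and column names, chosen only for illustration
data = pd.read_csv('ex1data1.txt', header=None, names=['population', 'profit'])

plt.scatter(data['population'], data['profit'])
plt.xlabel('population')
plt.ylabel('profit')
plt.show()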

2. Splitting the Dataset

Insert a row or column with data.insert(), extract rows/columns with data.iloc[] (indexed with square brackets, not called like a function), convert the data to NumPy form with data.values, change dimensions with .reshape(3, 1), and check dimensions with .shape. A sketch follows this list.
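A minimal sketch of this step, assuming data is the DataFrame loaded in step 1 and that its last column holds the labels:

import numpy as np

# Insert a column of ones so theta[0] acts as the bias (intercept) term
data.insert(0, 'ones', 1)

# Every column except the last is a feature; the last column is the label
X = data.iloc[:, :-1].values                        # features as a NumPy array
y = data.iloc[:, -1].values.reshape(len(data), 1)   # labels as a column vector

theta = np.zeros((X.shape[1], 1))                   # initial parameters
print(X.shape, y.shape, theta.shape)                # verify the dimensions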

3. Defining the Activation and Cost Functions

3.1 sigmoid(): the activation function, which squashes its input into the interval (0, 1):

$\sigma(z) = \dfrac{1}{1 + e^{-z}}$

def sigmoid(z):
    # Logistic function: maps any real z into (0, 1)
    return 1 / (1 + np.exp(-z))

3.2 costFunction(): the cost function, which measures the model's error:

Loss function for linear regression:

$J(\theta) = \dfrac{1}{2m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2, \quad h_\theta(x) = \theta^{T}x$

def costFunction(X, y, theta):
    # Mean squared error over 2m: J = sum((X@theta - y)^2) / (2m)
    inner = np.power(X @ theta - y, 2)
    return np.sum(inner) / (2 * len(X))

Loss function for logistic regression (convex, so gradient descent can reach the global optimum):

$J(\theta) = -\dfrac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log h_\theta(x^{(i)}) + \left(1 - y^{(i)}\right)\log\left(1 - h_\theta(x^{(i)})\right)\right], \quad h_\theta(x) = \sigma(\theta^{T}x)$

def costFunction(X, y, theta):
    y_ = sigmoid(X @ theta)   # @ is matrix multiplication; * below is element-wise
    first = y * np.log(y_)
    second = (1 - y) * np.log(1 - y_)
    return -np.sum(first + second) / len(X)

4. Defining the Gradient Descent Function

gradientDescent(): the gradient descent routine. The negative gradient is the direction in which the function value decreases fastest at a given point, so theta is updated repeatedly along it to search for the optimum:

$\theta := \theta - \dfrac{\alpha}{m}X^{T}\left(h_\theta(X) - y\right)$

def gradientDescent(X, y, theta, alpha, iters):
    m = len(X)
    costs = []

    for i in range(iters):
        # Logistic-regression hypothesis; for linear regression use A = X @ theta
        A = sigmoid(X @ theta)
        theta = theta - (alpha / m) * X.T @ (A - y)
        cost = costFunction(X, y, theta)
        costs.append(cost)
        if i % 1000 == 0:
            print(cost)

    return costs, theta

5. Passing in Initial Values and Observing Gradient Descent Through the Cost Data

costs, theta_final = gradientDescent(X, y, theta, alpha, iters)
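Note that the unpacking order must match the function's return order (costs, theta), and the extra lamda argument belongs to a regularized variant that the function above does not accept, so it is dropped here. A minimal usage sketch, with illustrative hyperparameter values (the alpha and iters values are assumptions, not prescribed by the course), plotting the cost history to confirm convergence:

alpha = 0.01     # learning rate (assumed value, for illustration)
iters = 10000    # number of iterations (assumed value)
costs, theta_final = gradientDescent(X, y, theta, alpha, iters)

# A steadily decreasing curve indicates that gradient descent is converging
plt.plot(range(iters), costs)
plt.xlabel('iterations')
plt.ylabel('cost')
plt.show()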

6. Predicting and Printing the Accuracy

def predict(X, theta):
    # Classify as 1 when the predicted probability exceeds 0.5
    prob = sigmoid(X @ theta)
    return [1 if p > 0.5 else 0 for p in prob]

y_ = np.array(predict(X, theta_final))
y_pre = y_.reshape(len(y_), 1)
acc = np.mean(y_pre == y)   # fraction of predictions that match the labels
print(acc)

7. Plotting the Regression Curve from the Results

Draw the scatter plot with plt.scatter() and the fitted line with plt.plot(), as in the sketch below.
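A minimal sketch for the two-feature logistic-regression case, assuming the layout from step 2 (X[:, 0] is the inserted column of ones; X[:, 1] and X[:, 2] are the features). On the decision boundary $\theta_0 + \theta_1 x_1 + \theta_2 x_2 = 0$, so $x_2 = -(\theta_0 + \theta_1 x_1)/\theta_2$:

# Coefficients of the decision boundary line x2 = coef1 + coef2 * x1
coef1 = -theta_final[0, 0] / theta_final[2, 0]
coef2 = -theta_final[1, 0] / theta_final[2, 0]

x1 = np.linspace(X[:, 1].min(), X[:, 1].max(), 100)
boundary = coef1 + coef2 * x1

mask = (y.flatten() == 1)
plt.scatter(X[~mask, 1], X[~mask, 2], label='class 0')
plt.scatter(X[mask, 1], X[mask, 2], label='class 1')
plt.plot(x1, boundary, 'r')   # the fitted decision boundary
plt.legend()
plt.show()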
