Machine Learning Notes (Andrew Ng): Logistic Regression Assignment

This post works through the logistic regression assignment, covering data visualization, the sigmoid function, and gradient descent. It also explores regularized logistic regression, in particular the MapFeature implementation, and explains the regularized cost function and its gradient descent optimization, including batch gradient descent, the update rules, and how to handle the regularization of θ0.

EX2 Logistic Regression

1.1 Visualizing the Data

import numpy as np
import matplotlib.pyplot as plt
import pandas as pd

# Load the training data: two exam scores plus a 0/1 admission label
filepath = r'F:\jypternotebook\吴恩达机器学习python作业代码\code\ex2-logistic regression\ex2data1.txt'
ex2_data1 = pd.read_csv(filepath, header=None, names=['exam1', 'exam2', 'admitted'])
ex2_data1
        exam1      exam2  admitted
0   34.623660  78.024693         0
1   30.286711  43.894998         0
2   35.847409  72.902198         0
3   60.182599  86.308552         1
4   79.032736  75.344376         1
..        ...        ...       ...
95  83.489163  48.380286         1
96  42.261701  87.103851         1
97  99.315009  68.775409         1
98  55.340018  64.931938         1
99  74.775893  89.529813         1

100 rows × 3 columns

# Split the data by label so each class can be plotted separately
positive_admitted = ex2_data1[ex2_data1['admitted'].isin([1])]
negative_admitted = ex2_data1[ex2_data1['admitted'].isin([0])]
print(positive_admitted)
print(negative_admitted)
        exam1      exam2  admitted
3   60.182599  86.308552         1
4   79.032736  75.344376         1
6   61.106665  96.511426         1
7   75.024746  46.554014         1
8   76.098787  87.420570         1
9   84.432820  43.533393         1
12  82.307053  76.481963         1
13  69.364589  97.718692         1
15  53.971052  89.207350         1
16  69.070144  52.740470         1
18  70.661510  92.927138         1
19  76.978784  47.575964         1
21  89.676776  65.799366         1
24  77.924091  68.972360         1
25  62.271014  69.954458         1
26  80.190181  44.821629         1
30  61.379289  72.807887         1
31  85.404519  57.051984         1
33  52.045405  69.432860         1
37  64.176989  80.908061         1
40  83.902394  56.308046         1
42  94.443368  65.568922         1
46  77.193035  70.458200         1
47  97.771599  86.727822         1
48  62.073064  96.768824         1
49  91.564974  88.696293         1
50  79.944818  74.163119         1
51  99.272527  60.999031         1
52  90.546714  43.390602         1
56  97.645634  68.861573         1
58  74.248691  69.824571         1
59  71.796462  78.453562         1
60  75.395611  85.759937         1
66  40.457551  97.535185         1
68  80.279574  92.116061         1
69  66.746719  60.991394         1
71  64.039320  78.031688         1
72  72.346494  96.227593         1
73  60.457886  73.094998         1
74  58.840956  75.858448         1
75  99.827858  72.369252         1
76  47.264269  88.475865         1
77  50.458160  75.809860         1
80  88.913896  69.803789         1
81  94.834507  45.694307         1
82  67.319257  66.589353         1
83  57.238706  59.514282         1
84  80.366756  90.960148         1
85  68.468522  85.594307         1
87  75.477702  90.424539         1
88  78.635424  96.647427         1
90  94.094331  77.159105         1
91  90.448551  87.508792         1
93  74.492692  84.845137         1
94  89.845807  45.358284         1
95  83.489163  48.380286         1
96  42.261701  87.103851         1
97  99.315009  68.775409         1
98  55.340018  64.931938         1
99  74.775893  89.529813         1
        exam1      exam2  admitted
0   34.623660  78.024693         0
1   30.286711  43.894998         0
2   35.847409  72.902198         0
5   45.083277  56.316372         0
10  95.861555  38.225278         0
11  75.013658  30.603263         0
14  39.538339  76.036811         0
17  67.946855  46.678574         0
20  67.372028  42.838438         0
22  50.534788  48.855812         0
23  34.212061  44.209529         0
27  93.114389  38.800670         0
28  61.830206  50.256108         0
29  38.785804  64.995681         0
32  52.107980  63.127624         0
34  40.236894  71.167748         0
35  54.635106  52.213886         0
36  33.915500  98.869436         0
38  74.789253  41.573415         0
39  34.183640  75.237720         0
41  51.547720  46.856290         0
43  82.368754  40.618255         0
44  51.047752  45.822701         0
45  62.222676  52.060992         0
53  34.524514  60.396342         0
54  50.286496  49.804539         0
55  49.586677  59.808951         0
57  32.577200  95.598548         0
61  35.286113  47.020514         0
62  56.253817  39.261473         0
63  30.058822  49.592974         0
64  44.668262  66.450086         0
65  66.560894  41.092098         0
67  49.072563  51.883212         0
70  32.722833  43.307173         0
78  60.455556  42.508409         0
79  82.226662  42.719879         0
86  42.075455  78.844786         0
89  52.348004  60.769505         0
92  55.482161  35.570703         0



plt.figure(figsize=(16,8))
plt.scatter(positive_admitted['exam1'], positive_admitted['exam2'], c='b', marker='+', label='Admitted')
plt.scatter(negative_admitted['exam1'], negative_admitted['exam2'], c='y', marker='o', label='Not admitted')
plt.xlabel('Exam1_score')
plt.ylabel('Exam2_score')
plt.legend()
plt.show()

[Figure: scatter plot of exam1 vs. exam2 scores; admitted shown as blue '+', not admitted as yellow 'o']

1.2 Sigmoid Function

def sigmoid(z):
    # Logistic (sigmoid) function: maps any real z into the interval (0, 1)
    return 1 / (1 + np.exp(-z))
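
As a quick sanity check (not part of the original assignment), the sigmoid should return 0.5 at z = 0 and saturate toward 0 and 1 for large negative and positive inputs:

print(sigmoid(0))                        # 0.5
print(sigmoid(np.array([-10, 0, 10])))   # [4.54e-05, 0.5, 0.99995]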

Cost function:

$$J\left(\theta\right)=\frac{1}{m}\sum_{i=1}^{m}\left[-y^{(i)}\log\left(h_{\theta}\left(x^{(i)}\right)\right)-\left(1-y^{(i)}\right)\log\left(1-h_{\theta}\left(x^{(i)}\right)\right)\right]$$

where $h_{\theta}(x)=g(\theta^{T}x)$ and $g$ is the sigmoid function defined above.
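
A minimal vectorized sketch of this cost is shown below, assuming `X` is the m×(n+1) design matrix with a leading column of ones, `y` the vector of 0/1 labels, and `theta` the parameter vector; the function name `cost` and the input preparation are illustrative, not from the original post.

def cost(theta, X, y):
    # h = g(X·theta): predicted probability for each of the m examples
    h = sigmoid(X @ theta)
    m = len(y)
    # Average cross-entropy over the training set
    return (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m

# Illustrative setup from the DataFrame loaded above
X = np.c_[np.ones(len(ex2_data1)), ex2_data1[['exam1', 'exam2']].to_numpy()]
y = ex2_data1['admitted'].to_numpy()
theta = np.zeros(X.shape[1])
print(cost(theta, X, y))  # ln(2) ≈ 0.693: with theta = 0, h = 0.5 for every example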
