蔚蓝祥和的天空
Machine Learning -- PCA Dimensionality Reduction 3_Review
SVD decomposition; recovering an image from the top K components of U while retaining 99% of the variance. Truncated code preview: NumPy, Matplotlib, scipy.io plotting helpers. Original · 2020-11-16 14:00:29 · 106 views · 0 comments
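The entry above keeps 99% of the variance before reconstructing. A minimal sketch of choosing K from the singular values so that a given fraction of variance is retained (the function name and toy data are mine, not from the post):

```python
import numpy as np

def choose_k(X, retain=0.99):
    """Smallest K whose leading principal components keep `retain` of the variance."""
    Xc = X - X.mean(axis=0)                      # center the data first
    _, S, _ = np.linalg.svd(Xc, full_matrices=False)
    ratio = np.cumsum(S ** 2) / np.sum(S ** 2)   # cumulative variance fraction
    return int(np.searchsorted(ratio, retain) + 1)

rng = np.random.default_rng(0)
# 200 points that vary almost entirely along the direction (3, 1)
X = rng.normal(size=(200, 1)) @ np.array([[3.0, 1.0]]) + 0.01 * rng.normal(size=(200, 2))
k = choose_k(X)   # nearly all variance lies along one direction
```

Because the squared singular values are proportional to the variance along each principal direction, the first K whose cumulative ratio crosses the threshold is the dimensionality to keep.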
Machine Learning -- PCA Dimensionality Reduction 2_Review
Truncated code preview: Feature_normalization subtracts the per-feature mean ahead of PCA (NumPy, Matplotlib, scipy.io). Original · 2020-11-16 13:20:16 · 108 views · 0 comments
Machine Learning -- KMeans and PCA Dimensionality Reduction_Review
Truncated code preview: computing each sample's distance to every center and assigning it to the nearest one; image handling via OpenCV (cv2). Original · 2020-11-16 13:15:27 · 153 views · 0 comments
Machine Learning -- Neural Network Backpropagation_Review
Truncated code preview: transform_Y maps the label vector into a one-hot matrix (classes × samples) for backpropagation (scipy.io, scipy.optimize). Original · 2020-11-15 23:38:59 · 97 views · 0 comments
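The preview's transform_Y builds a one-hot matrix with one row per class and one column per sample. A hedged reimplementation of that idea (the post's exact layout may differ):

```python
import numpy as np

def transform_Y(y_label):
    """One column per sample, one row per class; 1 marks the sample's class."""
    classes = np.unique(y_label)               # sorted distinct labels
    m = y_label.size
    y_mat = np.zeros((classes.size, m))
    for j in range(m):
        y_mat[classes == y_label[j], j] = 1.0  # set the row matching this sample's class
    return y_mat

y = np.array([1, 2, 3, 2])
Y = transform_Y(y)   # shape (3, 4): 3 classes, 4 samples
```

Each column sums to 1, which is what the backpropagation cost expects when comparing network outputs against labels class by class.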
Machine Learning -- Regularized Linear Regression, Bias and Variance_Review
Truncated code preview: polyFeatures expands the input into polynomial features of degree p (NumPy, Matplotlib, scipy.io, scipy.optimize). Original · 2020-11-10 21:55:50 · 225 views · 0 comments
Machine Learning -- Logistic Regression + Multiclass-Fourth_Chapter_Review
Truncated code preview: plotting helpers and one-vs-all logistic regression on .mat data (scipy.io, scipy.optimize). Original · 2020-11-10 20:59:11 · 130 views · 0 comments
Machine Learning -- Multivariate Logistic Regression-Third_Chapter_Review
Truncated code preview: X_polyFeatures expands the inputs with polynomial terms up to a given degree (pandas, scipy.optimize). Original · 2020-11-10 19:58:27 · 140 views · 0 comments
Machine Learning -- Logistic Regression-Second_Chapter_Review
Truncated code preview: X_plusOne prepends an intercept column; a max-based feature-normalization helper follows. Original · 2020-11-10 19:26:01 · 117 views · 0 comments
Machine Learning -- Logistic Regression-First_Chapter_Review
Truncated code preview: intercept column plus a linear model X·theta feeding the logistic hypothesis. Original · 2020-11-10 18:13:22 · 133 views · 0 comments
Linear Regression and Gradient Descent--Second_Chapter--Review
Truncated code preview: autoNorm standardizes every feature (subtract the mean, divide by the spread) before gradient descent. Original · 2020-11-10 17:06:00 · 75 views · 0 comments
Machine Learning -- Linear Regression and Gradient Descent--First_Chapter_Review
Truncated code preview: linear model X·theta with 3D loss-surface plotting (mpl_toolkits Axes3D). Original · 2020-11-10 16:04:50 · 77 views · 0 comments
Machine Learning -- Linear Regression Exercise--FirstChapter
Linear regression. Truncated code preview: generating inputs with np.arange and labels y = k*x for the fitting exercise. Original · 2020-08-15 15:18:50 · 198 views · 0 comments
Machine Learning -- Gradient Descent, Detailed Notes
The gradient-descent process: all parameters are updated simultaneously at each step. The step shrinks on its own: as the iterations proceed, the slope decreases, so the gradient-descent step gets smaller; there is no need to decay alpha to keep an overly large step from oscillating back and forth. Loss increasing instead of decreasing: if the loss grows during gradient descent, suspect that alpha is too large, so each update overshoots (subtracting too much, then adding too much). Gradient descent versus solving the equations directly: when the dataset is small, consider solving theta = (X^T X)^(-1) X^T Y in closed form; the matrix inversion grows as the cube of n (the number of features)... Original · 2020-08-17 01:43:51 · 131 views · 0 comments
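The trade-off described above -- gradient descent versus the closed-form solution theta = (X^T X)^(-1) X^T Y -- can be checked on a toy problem: with simultaneous updates and a safe alpha, both arrive at the same parameters. All names and constants here are illustrative, not from the post:

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(50), rng.uniform(0, 5, 50)])  # intercept column + 1 feature
y = X @ np.array([2.0, 3.0]) + 0.01 * rng.normal(size=50)

# Closed form: fine while n (features) is small; the inversion is O(n^3)
theta_ne = np.linalg.inv(X.T @ X) @ X.T @ y

# Batch gradient descent with simultaneous update of all components of theta
theta = np.zeros(2)
alpha = 0.05
for _ in range(5000):
    grad = X.T @ (X @ theta - y) / len(y)  # gradient of the mean-squared-error cost
    theta = theta - alpha * grad           # every component updated at once

close = np.allclose(theta, theta_ne, atol=1e-3)
```

With alpha small enough, the loss decreases monotonically and the iterates converge to the same least-squares minimizer the normal equation computes directly.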
Machine Learning -- Linear Regression Exercise--Second_Chapter
Linear regression with an intercept term. Truncated code preview: labels y = k*x + b plus Gaussian noise, with 3D plotting (mpl_toolkits Axes3D). Original · 2020-08-15 19:23:46 · 118 views · 0 comments
Machine Learning -- Collaborative Filtering
Truncated code preview: the collaborative-filtering cost function (scipy.io, scipy.optimize, pandas). Original · 2020-09-26 17:16:58 · 188 views · 0 comments
Machine Learning -- Anomaly Detection
Truncated code preview: visualizing the sample data before fitting the detector (NumPy, Matplotlib, scipy.io). Original · 2020-09-25 15:55:29 · 136 views · 0 comments
Machine Learning -- PCA Principal Component Analysis--Second_Chapter
Figures: the top 36 principal-component images, the original images, and the images rebuilt after reducing to the top 100 principal components. Truncated code preview: plotting helpers. Original · 2020-09-24 10:09:04 · 116 views · 0 comments
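The reconstruction step mentioned above (project onto the top-k principal directions, then map back) can be sketched as a generic PCA project/recover pair; this is not the post's exact code, and the helper name is mine:

```python
import numpy as np

def pca_project_recover(X, k):
    """Project centered data onto the top-k principal directions, then map back."""
    mu = X.mean(axis=0)
    Xc = X - mu
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    U_reduce = Vt[:k].T          # top-k principal directions, shape (n, k)
    Z = Xc @ U_reduce            # compressed representation, shape (m, k)
    X_rec = Z @ U_reduce.T + mu  # approximate reconstruction in the original space
    return Z, X_rec

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 5))
Z, X_rec = pca_project_recover(X, 5)       # k = n: reconstruction is exact
err = np.abs(X - X_rec).max()
Z2, X_rec2 = pca_project_recover(X, 2)     # lossy: keep only 2 of 5 directions
```

When k equals the data dimension the round trip is exact up to float error; smaller k trades reconstruction error for compression, which is what the face-image figures illustrate.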
Machine Learning -- PCA Principal Component Analysis--First_Chapter
Truncated code preview: Feature_normalization subtracts the per-feature mean ahead of PCA. Original · 2020-09-23 22:21:29 · 88 views · 0 comments
Machine Learning -- KMeans--K-Means Clustering
Figures: clustering results for K = 2, 3, 4 and 5, and the elbow method for choosing the number of clusters. Truncated code preview: finding each sample's closest centroid. Original · 2020-09-23 16:42:06 · 132 views · 0 comments
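The elbow method mentioned above plots the within-cluster sum of squared errors against K and looks for the bend. A small sketch using a plain Lloyd's loop (the helper name and toy data are mine):

```python
import numpy as np

def kmeans_sse(X, k, iters=20, seed=0):
    """Run a plain Lloyd's loop and return the within-cluster sum of squared errors."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]   # init from data points
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)  # (m, k) distances
        labels = d.argmin(axis=1)                                      # nearest center
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])        # recompute means
    d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return float(d.min(axis=1).sum())

rng = np.random.default_rng(3)
# two well-separated blobs, so the "elbow" sits at K = 2
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
sse = [kmeans_sse(X, k) for k in (1, 2, 3)]
```

On data like this the SSE collapses going from K = 1 to K = 2 and barely moves afterwards, which is exactly the bend the elbow plot reveals.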
Machine Learning -- SVM--Support Vector Machine
Decision boundaries for C = 1 and C = 100: the larger C is, the less misclassification is tolerated, because C is the penalty on misclassified samples. The green line is drawn by sklearn's built-in SVC with kernel="rbf" (its Gaussian kernel); the black line comes from a hand-written Gaussian kernel whose matrix is passed to SVC with kernel="precomputed". Notes on the precomputed kernel: 1. fit receives the kernel matrix of the training set X_train, not X_train itself. 2. predict receives the kernel matrix between X_train and X_val, not the kernel matrix of X_val alone. As above, the green line is sklearn's built-in SVC with ke... Original · 2020-09-22 16:26:40 · 141 views · 0 comments
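The precomputed-kernel notes above boil down to the shapes of the matrices handed to fit and predict. A sketch of a hand-written Gaussian kernel illustrating those shapes; the SVC calls appear only in comments, since the post's training code is not reproduced here:

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """K[i, j] = exp(-||A[i] - B[j]||^2 / (2 sigma^2)), shape (len(A), len(B))."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)  # pairwise squared distances
    return np.exp(-sq / (2 * sigma ** 2))

X_train = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
X_val = np.array([[0.0, 1.0]])

# With sklearn's SVC(kernel="precomputed") the calls would look like:
#   clf.fit(gaussian_kernel(X_train, X_train), y_train)   # train × train Gram matrix
#   clf.predict(gaussian_kernel(X_val, X_train))          # val × train, NOT val × val
K_train = gaussian_kernel(X_train, X_train)
K_val = gaussian_kernel(X_val, X_train)
```

The Gram matrix is symmetric with ones on the diagonal, and the prediction-time matrix has one row per validation sample and one column per training sample — the two shape rules the post warns about.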
Machine Learning -- Neural Network Second_Chapter--Backpropagation
Truncated code preview: Transform_ymat builds the one-hot label matrix with nested loops (scipy.io, scipy.optimize). Original · 2020-09-19 22:33:53 · 244 views · 0 comments
Machine Learning -- Regularized Linear Regression--Second Chapter
Truncated code preview: polyFeatures builds the polynomial feature matrix (NumPy, Matplotlib, scipy.io, scipy.optimize). Original · 2020-09-14 21:04:13 · 131 views · 0 comments
Machine Learning -- Regularized Linear Regression--First_Chapter
Truncated code preview: Pic_plot draws a scatter plot of the training data. Original · 2020-09-14 10:28:04 · 154 views · 0 comments
Machine Learning -- Neural Network First_Chapter--Forward Propagation
Truncated code preview: Sigmoid, a Predict that takes the argmax over the outputs and adds 1 (labels are 1-based), and an accuracy counter. Original · 2020-09-11 10:55:06 · 138 views · 0 comments
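The forward-propagation helpers in the preview (Sigmoid, an argmax-plus-one Predict for 1-based labels) fit together roughly like this; the layer sizes below are made up for illustration, not taken from the post:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_predict(X, Theta1, Theta2):
    """Two-layer forward pass; labels are 1-based, hence the +1 on argmax."""
    m = X.shape[0]
    a1 = np.column_stack([np.ones(m), X])   # add the bias unit to the input
    a2 = sigmoid(a1 @ Theta1.T)             # hidden-layer activations
    a2 = np.column_stack([np.ones(m), a2])  # bias unit for the hidden layer
    a3 = sigmoid(a2 @ Theta2.T)             # output layer: one column per class
    return np.argmax(a3, axis=1) + 1        # class indices start at 1

rng = np.random.default_rng(4)
Theta1 = rng.normal(size=(5, 3))   # 5 hidden units, 2 inputs + bias
Theta2 = rng.normal(size=(4, 6))   # 4 classes, 5 hidden units + bias
pred = forward_predict(rng.normal(size=(10, 2)), Theta1, Theta2)
```

Accuracy is then just the fraction of predictions equal to the true labels, which is what the preview's counting loop computes.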
Machine Learning -- Logistic Regression + Multiclass-Third_Chapter
Truncated code preview: plotting helpers for one-vs-all logistic regression (scipy.io, scipy.optimize). Original · 2020-09-10 21:57:01 · 136 views · 0 comments
Machine Learning -- Logistic Regression + Regularization-Second_Chapter
Truncated code preview: regularized logistic regression using sklearn's PolynomialFeatures and StandardScaler (scipy.optimize, pandas). Original · 2020-08-24 01:32:18 · 135 views · 0 comments
Machine Learning -- Logistic Regression-First_Chapter
Logistic regression. Truncated code preview: X_plusOne prepends an intercept column; a linear model X·theta follows. Original · 2020-08-21 23:35:01 · 184 views · 0 comments
Machine Learning -- Linear Regression and Gradient Descent--Second_Chapter
Multivariate linear regression by gradient descent. Truncated code preview: autoNorm standardizes every feature before the descent. Original · 2020-08-18 11:56:10 · 117 views · 0 comments
Machine Learning -- Linear Regression and Gradient Descent--First_Chapter
The scatter plot shows the original data and the straight line shows the fit (linear model y = theta0 + theta1*x). Gradient descent starts with theta0, theta1 initialized to 1, 1, and the final theta0 and theta1 are taken as the model parameters. Further figures: the loss against theta0, theta1 at each step, and the loss as a function of theta0, theta1 rendered as a 3D surface (mpl_toolkits Axes3D). Original · 2020-08-17 01:29:41 · 156 views · 0 comments
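The fit described above (y = theta0 + theta1*x, both parameters initialized to 1, updated simultaneously) can be sketched end to end; the data, true coefficients and learning rate here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
x = np.linspace(0, 2, 60)
y = 4.0 + 2.5 * x + 0.05 * rng.normal(size=60)   # ground truth: theta = (4, 2.5)

theta0, theta1 = 1.0, 1.0    # the post initializes both parameters to 1
alpha, m = 0.1, len(x)
for _ in range(3000):
    err = theta0 + theta1 * x - y
    g0 = err.mean()                  # dJ/dtheta0 for the mean-squared-error cost
    g1 = (err * x).mean()            # dJ/dtheta1
    theta0, theta1 = theta0 - alpha * g0, theta1 - alpha * g1  # simultaneous update
```

Logging the loss after each update reproduces the post's first figure (loss falling with the iteration count), and evaluating it on a grid of (theta0, theta1) pairs gives the 3D bowl shown in the last one.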