[飞桨 Machine Learning] Six Common Data Dimensionality Reduction Methods

In fact, in high-dimensional settings the sparsity of data samples and the difficulty of computing distances are serious obstacles faced by all machine learning methods, a phenomenon known as the "curse of dimensionality". An important way to alleviate it is dimension reduction (also called dimensionality reduction): a mathematical transformation maps the original high-dimensional attribute space into a low-dimensional "subspace", in which sample density is much higher and distance computation becomes much easier. Why does dimensionality reduction work? Because in many cases, although the observed or collected samples are high-dimensional, the distribution relevant to the learning task often occupies only a low-dimensional "embedding" within that high-dimensional space. A classic illustration shows sample points from the original high-dimensional space being far easier to learn from within such a low-dimensional embedded subspace.

Common dimensionality reduction methods include PCA, LDA, MDS, Isomap, SNE, t-SNE, and autoencoders (AutoEncoder).

1. MDS: Multidimensional Scaling

MDS requires that the pairwise distances between samples in the original space be preserved in the low-dimensional space. For effective dimensionality reduction, it is usually enough that the distances after the reduction approximate the original distances as closely as possible. Classical MDS achieves this by double-centering the squared distance matrix and eigendecomposing the result, as sketched below.
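Concretely, classical MDS builds the inner-product matrix $B$ from the squared distances,

$$b_{ij} = -\frac{1}{2}\left(d_{ij}^2 - \mathrm{dist}_{i\cdot}^2 - \mathrm{dist}_{\cdot j}^2 + \mathrm{dist}_{\cdot\cdot}^2\right),$$

where $\mathrm{dist}_{i\cdot}^2$, $\mathrm{dist}_{\cdot j}^2$ and $\mathrm{dist}_{\cdot\cdot}^2$ are the row, column, and grand means of the squared distances. The embedding is then $Z = V_n \Lambda_n^{1/2}$, built from the $n$ largest eigenvalues $\Lambda_n$ and corresponding eigenvectors $V_n$ of $B$; this is exactly what `cal_B` and `MDS` below compute.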

import numpy as np
from sklearn import metrics


def calculate_distance(x, y):
    # Euclidean distance between two vectors
    d = np.sqrt(np.sum((x - y) ** 2))
    return d


# Pairwise Euclidean distances between the rows of x and the rows of y;
# entry (i, j) of the result is the distance between row i of x and row j of y
def calculate_distance_matrix(x, y):
    d = metrics.pairwise_distances(x, y)
    return d


def cal_B(D):
    (n1, n2) = D.shape
    DD = np.square(D)                    # square every element of D
    Di = np.sum(DD, axis=1) / n1         # row means: dist(i.)^2
    Dj = np.sum(DD, axis=0) / n1         # column means: dist(.j)^2
    Dij = np.sum(DD) / (n1 ** 2)         # grand mean: dist(..)^2
    B = np.zeros((n1, n1))
    for i in range(n1):
        for j in range(n2):
            B[i, j] = (Dij + DD[i, j] - Di[i] - Dj[j]) / (-2)   # b(ij)
    return B


def MDS(data, n=2):
    D = calculate_distance_matrix(data, data)
    B = cal_B(D)
    Be, Bv = np.linalg.eigh(B)             # Be: eigenvalues of B; Bv: orthonormal eigenvectors
    Be_sort = np.argsort(-Be)
    Be = Be[Be_sort]                       # eigenvalues, largest first
    Bv = Bv[:, Be_sort]                    # eigenvectors in the same order
    Bez = np.diag(Be[0:n])                 # diagonal matrix of the top-n eigenvalues
    Bvz = Bv[:, 0:n]                       # top-n eigenvectors
    Z = np.dot(np.sqrt(Bez), Bvz.T).T
    return Z

2. Isomap: Isometric Mapping

On a manifold (a space that is locally Euclidean), the meaningful distance between two points is not the straight-line Euclidean distance. Instead, the locally-Euclidean property is exploited: the distance between two points is approximated by the total length of a chain of segments passing through neighboring points. In this way the manifold is "unrolled" into a low-dimensional space.

Isomap is derived from MDS. Where MDS keeps the pairwise sample distances unchanged after reduction, Isomap introduces a neighborhood graph: each sample is connected only to its neighbors, whose distances can be computed directly, while the distance between far-apart points is obtained from the shortest path between them. A distance-preserving reduction is then performed on these geodesic distances.

The computation proceeds as follows:

  1. Choose the number of neighbors and compute the neighborhood distance matrix, setting the distance to any point outside the neighborhood to (effectively) infinity;

  2. Compute the shortest path between every pair of points (e.g. with Floyd-Warshall), turning the neighborhood matrix into a shortest-path matrix;

  3. Feed the result into MDS; its output is the Isomap embedding.

import numpy as np
import matplotlib.pyplot as plt
from sklearn import metrics, datasets


def floyd(D, n_neighbors=15):
    # Keep only each point's k nearest neighbors, then run Floyd-Warshall
    # to turn the neighborhood matrix into a shortest-path matrix
    Max = np.max(D) * 1000               # stand-in for infinity
    n1, n2 = D.shape
    k = n_neighbors
    D1 = np.ones((n1, n1)) * Max
    D_arg = np.argsort(D, axis=1)
    for i in range(n1):
        D1[i, D_arg[i, 0:k + 1]] = D[i, D_arg[i, 0:k + 1]]
    for t in range(n1):
        for i in range(n1):
            for j in range(n1):
                if D1[i, t] + D1[t, j] < D1[i, j]:
                    D1[i, j] = D1[i, t] + D1[t, j]
    return D1


# calculate_distance, calculate_distance_matrix, cal_B and MDS are
# the same functions defined in the MDS section above


def Isomap(data, n=2, n_neighbors=30):
    D = calculate_distance_matrix(data, data)
    D_floyd = floyd(D, n_neighbors)      # pass n_neighbors through to floyd
    B = cal_B(D_floyd)
    Be, Bv = np.linalg.eigh(B)
    Be_sort = np.argsort(-Be)
    Be = Be[Be_sort]
    Bv = Bv[:, Be_sort]
    Bez = np.diag(Be[0:n])
    Bvz = Bv[:, 0:n]
    Z = np.dot(np.sqrt(Bez), Bvz.T).T
    return Z


def generate_curve_data():
    xx, target = datasets.make_s_curve(400, random_state=9)
    return xx, target


if __name__ == '__main__':
    data, target = generate_curve_data()
    Z_Isomap = Isomap(data, n=2)
    Z_MDS = MDS(data)
    figure = plt.figure()
    plt.suptitle('ISOMAP COMPARE TO MDS')
    plt.subplot(1, 2, 1)
    plt.title('ISOMAP')
    plt.scatter(Z_Isomap[:, 0], Z_Isomap[:, 1], c=target, s=60)
    plt.subplot(1, 2, 2)
    plt.title('MDS')
    plt.scatter(Z_MDS[:, 0], Z_MDS[:, 1], c=target, s=60)
    plt.show()


3. PCA: Principal Component Analysis

PCA is a linear transformation: it maps the data into a new coordinate system in which the direction of greatest variance of any projection lies along the first coordinate (the first principal component), the second greatest variance along the second coordinate, and so on. It is commonly used to reduce a dataset's dimensionality while keeping the features that contribute most to its variance.
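In matrix form: center the data, eigendecompose the covariance matrix, and project onto the top-$n$ eigenvectors:

$$C = \frac{1}{m-1}\sum_{i=1}^{m}(x_i - \mu)(x_i - \mu)^T, \qquad C\,w_k = \lambda_k w_k, \qquad Y = (X - \mu)\,W_n,$$

where $W_n$ stacks the eigenvectors belonging to the $n$ largest eigenvalues $\lambda_k$. The implementation below follows these steps.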

import numpy as np


def pca(data, n):
    data = np.array(data)

    # mean of each feature
    mean_vector = np.mean(data, axis=0)

    # covariance matrix of the centered data
    cov_mat = np.cov(data - mean_vector, rowvar=0)

    # eigenvalues and eigenvectors (eigh, since cov_mat is symmetric)
    fvalue, fvector = np.linalg.eigh(cov_mat)

    # indices of the eigenvalues, largest first
    fvaluesort = np.argsort(-fvalue)

    # keep the indices of the n largest eigenvalues
    fValueTopN = fvaluesort[:n]

    # eigenvectors of the n largest eigenvalues
    newdata = fvector[:, fValueTopN]

    # project the centered data onto the principal components
    new = np.dot(data - mean_vector, newdata)

    return new

4. LDA: Linear Discriminant Analysis

Like PCA, LDA is a linear dimensionality reduction algorithm. But where PCA simply picks the directions of greatest variance, LDA is supervised (it uses class labels), so it takes the classes into account and seeks projections under which the samples are as separable as possible. It chooses a projection hyperplane in the k-dimensional space such that projections of the same class land as close together as possible, while projections of different classes land as far apart as possible, thereby explicitly modeling the differences between the data classes.

The idea is this: for a multi-class classification problem, we want to map the samples into a low-dimensional space (say, one dimension) such that after the mapping the classes are "as far apart as possible" while points within the same class are "as close together as possible"; the better the separation, the easier the classes are to distinguish. LDA therefore computes the within-class scatter $S_w$ and the between-class scatter $S_b$, and looks for the projection vector $w$ that makes the ratio $S_b / S_w$ as large as possible, as formalized below.
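For the two-class case this is the Fisher criterion:

$$J(w) = \frac{w^T S_b\, w}{w^T S_w\, w}, \qquad S_b = (\mu_0 - \mu_1)(\mu_0 - \mu_1)^T, \qquad S_w = \Sigma_0 + \Sigma_1,$$

whose maximizer is $w \propto S_w^{-1}(\mu_0 - \mu_1)$.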

Assume a dataset $D=\{(x_1,y_1),(x_2,y_2),\dots,(x_m,y_m)\}$ in which each sample $x_i$ is an n-dimensional vector and $y_i \in \{0,1\}$. Define $N_j$ ($j=0,1$) as the number of samples in class $j$, $X_j$ as the set of class-$j$ samples, $\mu_j$ as the mean vector of class $j$, and $\Sigma_j$ as the scatter matrix of class $j$ (strictly speaking, its covariance matrix without the normalizing denominator).

The expression for $\mu_j$ is:

$$\mu_j = \frac{1}{N_j}\sum_{x \in X_j} x$$

The expression for $\Sigma_j$ is:

$$\Sigma_j = \sum_{x \in X_j}(x - \mu_j)(x - \mu_j)^T$$

import numpy as np
import csv
import matplotlib.pyplot as plt


def loadDataset(filename):
    # Split the Pima CSV into the two classes according to the label in
    # the last column; the first column (a row index) is dropped
    data1, data2 = [], []
    with open(filename, 'r') as f:
        lines = csv.reader(f)
        data = list(lines)
    for i in range(len(data)):
        del(data[i][0])
        for j in range(len(data[i])):
            data[i][j] = float(data[i][j])
        if data[i][-1]:
            del(data[i][-1])
            data1.append(data[i])
        else:
            del(data[i][-1])
            data2.append(data[i])

    return np.array(data1), np.array(data2)


def lda_num2(data1, data2):
    # class mean vectors
    mu0 = data2.mean(0)
    mu1 = data1.mean(0)
    print(mu0)
    print(mu1)

    # within-class scatter S_w: sum of outer products of centered samples
    d = mu0.shape[0]
    s_w = np.zeros((d, d))
    for i in range(len(data2)):
        s_w += np.outer(data2[i] - mu0, data2[i] - mu0)
    for i in range(len(data1)):
        s_w += np.outer(data1[i] - mu1, data1[i] - mu1)
    print(s_w)

    # LDA direction w = S_w^{-1}(mu0 - mu1); a two-class problem has a
    # single discriminant direction, so the projection is one-dimensional
    w = np.dot(np.linalg.pinv(s_w), mu0 - mu1)

    new_data1 = np.dot(data1, w)
    new_data2 = np.dot(data2, w)

    return new_data1, new_data2


if __name__ == '__main__':
    data1, data2 = loadDataset("Pima.csv")

    newdata1, newdata2 = lda_num2(data1, data2)

    print(newdata1)
    print(newdata2)

    # plot the two classes' 1-D projections at different heights
    plt.figure()
    plt.scatter(newdata1, np.zeros_like(newdata1), label='class 1')
    plt.scatter(newdata2, np.ones_like(newdata2), label='class 0')
    plt.legend()
    plt.show()

5. SNE: Stochastic Neighbor Embedding

SNE maps data points to probability distributions through affinities, in two main steps:

  • SNE constructs a probability distribution over pairs of high-dimensional objects, such that similar objects have a high probability of being picked and dissimilar objects a low one.
  • SNE then constructs a probability distribution over the same points in the low-dimensional space, and makes the two distributions as similar as possible.

Note that SNE is unsupervised dimensionality reduction. Unlike k-means, it cannot learn something from training and then apply it to other data (k-means learns k centroids that can be reused on other datasets, whereas t-SNE can only operate on one dataset in isolation: it offers fit_transform but no reusable fit), as the sklearn sketch below illustrates.
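A minimal illustration with scikit-learn's TSNE (a sketch: the random matrix here is only a stand-in for real data):

import numpy as np
from sklearn.manifold import TSNE

# t-SNE embeds one dataset at a time: there is fit_transform, but no
# transform() for projecting new, unseen samples afterwards
X = np.random.rand(100, 50)   # 100 samples, 50 features
Y = TSNE(n_components=2, perplexity=30.0).fit_transform(X)
print(Y.shape)                # (100, 2)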

SNE first converts Euclidean distances into conditional probabilities that express the similarity between points. Concretely, given N high-dimensional data points x1, …, xN (note that N is the number of points, not the dimensionality), SNE computes a probability $p_{j|i}$ proportional to the similarity between xi and xj (a probability we construct ourselves): centered on $x_i$, it is the conditional probability of picking $x_j$ as a neighbor under a Gaussian:
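$$p_{j|i} = \frac{\exp\!\left(-\lVert x_i - x_j\rVert^2 / 2\sigma_i^2\right)}{\sum_{k \neq i} \exp\!\left(-\lVert x_i - x_k\rVert^2 / 2\sigma_i^2\right)}$$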

Here $\sigma_i$ is a parameter that takes a different value for each point $x_i$; how to set it is discussed below. We also set $p_{i|i} = 0$, since only pairwise similarities are of interest.

For the low-dimensional counterparts $y_i$, we fix the Gaussian's standard deviation to $1/\sqrt{2}$ (so that $2\sigma^2 = 1$), which gives the similarity:
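$$q_{j|i} = \frac{\exp\!\left(-\lVert y_i - y_j\rVert^2\right)}{\sum_{k \neq i} \exp\!\left(-\lVert y_i - y_k\rVert^2\right)}$$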

Likewise, we set $q_{i|i} = 0$.

If the reduction is good and local structure is fully preserved, then $p_{j|i} = q_{j|i}$. We therefore minimize the distance between the two distributions, the KL divergence (Kullback-Leibler divergence), giving the cost function:
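$$C = \sum_i KL(P_i \,\|\, Q_i) = \sum_i \sum_j p_{j|i} \log \frac{p_{j|i}}{q_{j|i}}$$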

Each point has its own $\sigma_i$, and the entropy of $P_i$ grows as $\sigma_i$ grows. SNE uses the notion of perplexity, together with a binary search, to find the best $\sigma_i$, where perplexity is defined as:
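$$Perp(P_i) = 2^{H(P_i)}$$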

where $H(P_i)$ is the entropy of $P_i$:
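$$H(P_i) = -\sum_j p_{j|i} \log_2 p_{j|i}$$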

Perplexity can be interpreted as the effective number of neighbors around a point. SNE is fairly robust to the choice of perplexity; values between 5 and 50 are typical. Once it is fixed, a binary search finds the matching $\sigma_i$.

For initialization, a Gaussian with a small $\sigma$ can be used. To speed up optimization and avoid poor local minima, the gradient step needs a relatively large momentum term: in addition to the current gradient, the parameter update includes an exponentially decaying sum of previous gradients, as follows:
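$$Y^{(t)} = Y^{(t-1)} + \eta \frac{\partial C}{\partial Y} + \alpha(t)\left(Y^{(t-1)} - Y^{(t-2)}\right)$$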

Here $Y^{(t)}$ denotes the solution after t iterations, $\eta$ the learning rate, and $\alpha(t)$ the momentum at iteration t.

In addition, during the early phase of optimization, some Gaussian noise can be injected at each iteration and then gradually annealed away, much as in simulated annealing, to avoid poor local minima. Consequently, SNE needs several optimization runs to tune its hyperparameters: the noise level, the learning rate, when to start decaying it, the momentum schedule, and so on.

6. t-SNE: t-distributed Stochastic Neighbor Embedding

Although SNE produces good visualizations, it is hard to optimize and suffers from the "crowding problem". Van der Maaten and Hinton later proposed t-SNE, which differs from SNE mainly as follows:

  • it uses a symmetric version of SNE, which simplifies the gradient;
  • in the low-dimensional space, it replaces the Gaussian with a Student-t distribution to express pairwise similarity.

By using the heavier-tailed t distribution in the low-dimensional space, t-SNE alleviates both the crowding problem and SNE's optimization problems.

[Figure: comparison of the Gaussian and Student-t distributions]

Comparing the Gaussian and t distributions (figure above; code in probability/distribution.md), the t distribution is less affected by outliers and its fit is more reasonable, capturing the overall character of the data better.

With the t distribution, the low-dimensional similarity q becomes:
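$$q_{ij} = \frac{\left(1 + \lVert y_i - y_j\rVert^2\right)^{-1}}{\sum_{k \neq l}\left(1 + \lVert y_k - y_l\rVert^2\right)^{-1}}$$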

where $y_i = x_i / (\sqrt{2}\sigma)$.

In addition, the t distribution is an infinite mixture of Gaussians, and evaluating its density involves no exponential, which is computationally much more convenient. The gradient of the objective becomes:
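$$\frac{\partial C}{\partial y_i} = 4 \sum_j \left(p_{ij} - q_{ij}\right)\left(y_i - y_j\right)\left(1 + \lVert y_i - y_j\rVert^2\right)^{-1}$$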

The effectiveness of t-SNE can also be seen in the figure above (horizontal axis: distance; vertical axis: similarity): for pairs with high similarity, the t distribution needs a slightly smaller distance in the low-dimensional space, while for pairs with low similarity it needs a much larger distance. This is exactly what we want: points within the same cluster (small distances) are pulled together more tightly, and points in different clusters (large distances) are pushed further apart.

To summarize, the t-SNE gradient update has two major advantages:

  • For dissimilar points placed at a small distance, it produces a large gradient that repels them.
  • The repulsion does not grow without bound (thanks to the denominator in the gradient), so dissimilar points are not pushed infinitely far apart.

[Animation: the t-SNE optimization process — http://www.datakit.cn/images/machinelearning/t-sne_optimise.gif]


import numpy as np


def cal_pairwise_dist(x):
    '''Pairwise squared Euclidean distances for the rows of matrix x,
    using (a-b)^2 = a^2 + b^2 - 2*a*b
    '''
    sum_x = np.sum(np.square(x), 1)
    dist = np.add(np.add(-2 * np.dot(x, x.T), sum_x).T, sum_x)
    return dist


def cal_perplexity(dist, idx=0, beta=1.0):
    '''Compute the entropy and probabilities for one row of distances.
    dist is a distance vector, idx is the position of the point's distance
    to itself, and beta = 1/(2*sigma^2) is the Gaussian precision.
    perp is the entropy (in nats), i.e. the log of the perplexity,
    which is cheaper to compute.
    '''
    prob = np.exp(-dist * beta)
    # the probability of a point picking itself is 0
    prob[idx] = 0
    sum_prob = np.sum(prob)
    perp = np.log(sum_prob) + beta * np.sum(dist * prob) / sum_prob
    prob /= sum_prob
    return perp, prob


def search_prob(x, tol=1e-5, perplexity=30.0):
    '''Binary-search each beta and compute the pairwise probabilities
    '''

    # initialize parameters
    print("Computing pairwise distances...")
    (n, d) = x.shape
    dist = cal_pairwise_dist(x)
    pair_prob = np.zeros((n, n))
    beta = np.ones((n, 1))
    # work in log space, to match cal_perplexity
    base_perp = np.log(perplexity)

    for i in range(n):
        if i % 500 == 0:
            print("Computing pair_prob for point %s of %s ..." % (i, n))

        betamin = -np.inf
        betamax = np.inf
        perp, this_prob = cal_perplexity(dist[i], i, beta[i])

        # binary search for the beta (i.e. sigma) matching the target perplexity
        perp_diff = perp - base_perp
        tries = 0
        while np.abs(perp_diff) > tol and tries < 50:
            if perp_diff > 0:
                betamin = beta[i].copy()
                if betamax == np.inf or betamax == -np.inf:
                    beta[i] = beta[i] * 2
                else:
                    beta[i] = (beta[i] + betamax) / 2
            else:
                betamax = beta[i].copy()
                if betamin == np.inf or betamin == -np.inf:
                    beta[i] = beta[i] / 2
                else:
                    beta[i] = (beta[i] + betamin) / 2

            # recompute perp and prob with the updated beta
            perp, this_prob = cal_perplexity(dist[i], i, beta[i])
            perp_diff = perp - base_perp
            tries = tries + 1
        # record this row of probabilities
        pair_prob[i, :] = this_prob
    print("Mean value of sigma: ", np.mean(np.sqrt(1 / beta)))
    return pair_prob


def pca(x, no_dims=50):
    '''Use PCA to pre-reduce the dimensionality before t-SNE
    '''
    print("Preprocessing the data using PCA...")
    (n, d) = x.shape
    x = x - np.tile(np.mean(x, 0), (n, 1))
    l, M = np.linalg.eig(np.dot(x.T, x))
    y = np.dot(x, M[:, 0:no_dims])
    return y


def tsne(x, no_dims=2, initial_dims=50, perplexity=30.0, max_iter=1000):
    """Runs t-SNE on the dataset in the NxD array x
    to reduce its dimensionality to no_dims dimensions.
    The syntax of the function is Y = tsne(x, no_dims, perplexity),
    where x is an NxD NumPy array.
    """

    # Check inputs
    if isinstance(no_dims, float):
        print("Error: number of dimensions should be an integer.")
        return -1

    # initialize parameters and variables
    x = pca(x, initial_dims).real
    (n, d) = x.shape
    initial_momentum = 0.5
    final_momentum = 0.8
    eta = 500
    min_gain = 0.01
    y = np.random.randn(n, no_dims)
    dy = np.zeros((n, no_dims))
    iy = np.zeros((n, no_dims))
    gains = np.ones((n, no_dims))

    # symmetrize the conditional probabilities
    P = search_prob(x, 1e-5, perplexity)
    P = P + np.transpose(P)
    P = P / np.sum(P)
    # early exaggeration
    P = P * 4
    P = np.maximum(P, 1e-12)

    # Run iterations
    for iter in range(max_iter):
        # Compute pairwise affinities under the Student-t kernel
        sum_y = np.sum(np.square(y), 1)
        num = 1 / (1 + np.add(np.add(-2 * np.dot(y, y.T), sum_y).T, sum_y))
        num[range(n), range(n)] = 0
        Q = num / np.sum(num)
        Q = np.maximum(Q, 1e-12)

        # Compute gradient
        PQ = P - Q
        for i in range(n):
            dy[i, :] = np.sum(np.tile(PQ[:, i] * num[:, i], (no_dims, 1)).T * (y[i, :] - y), 0)

        # Perform the update
        if iter < 20:
            momentum = initial_momentum
        else:
            momentum = final_momentum
        gains = (gains + 0.2) * ((dy > 0) != (iy > 0)) + (gains * 0.8) * ((dy > 0) == (iy > 0))
        gains[gains < min_gain] = min_gain
        iy = momentum * iy - eta * (gains * dy)
        y = y + iy
        y = y - np.tile(np.mean(y, 0), (n, 1))
        # Compute current value of cost function
        if (iter + 1) % 100 == 0:
            if iter > 100:
                C = np.sum(P * np.log(P / Q))
            else:
                C = np.sum(P / 4 * np.log(P / 4 / Q))
            print("Iteration ", (iter + 1), ": error is ", C)
        # Stop lying about P-values (end of early exaggeration)
        if iter == 100:
            P = P / 4
    print("finished training!")
    return y


if __name__ == "__main__":
    # Run Y = tsne(X, no_dims, perplexity) to perform t-SNE on your dataset.
    X = np.loadtxt("mnist2500_X.txt")
    labels = np.loadtxt("mnist2500_labels.txt")
    Y = tsne(X, 2, 50, 20.0)
    from matplotlib import pyplot as plt

    plt.scatter(Y[:, 0], Y[:, 1], 20, labels)
    plt.show()

%matplotlib inline
import numpy as np
import csv
import matplotlib.pyplot as plt
from sklearn import metrics


def loadDataset(filename):
    # Load the Pima CSV: drop the index column, convert values to float,
    # and split off the label in the last column
    label = []
    with open(filename, 'r') as f:
        lines = csv.reader(f)
        data = list(lines)
    for i in range(len(data)):
        del(data[i][0])
        for j in range(len(data[i])):
            data[i][j] = float(data[i][j])
        if data[i][-1]:
            label.append(data[i][-1])
        else:
            label.append(-1)
        del(data[i][-1])
    return data, label


# calculate_distance, calculate_distance_matrix, cal_B and MDS are
# the same functions defined in the MDS section above


def view(data):
    # scatter plot of a 2-D embedding
    x = [i[0] for i in data]
    y = [i[1] for i in data]

    plt.figure()
    plt.scatter(x, y)
    plt.show()


if __name__ == '__main__':
    data, labels = loadDataset("data/data43561/Pima.csv")

    newdata2 = MDS(data, 2)

    view(newdata2)

# PCA applied to the Pima dataset, reusing loadDataset and view from the
# MDS demo above and pca from section 3
if __name__ == '__main__':
    data, labels = loadDataset("data/data43561/Pima.csv")

    newdata = pca(data, 2)

    view(newdata)

# Running the t-SNE implementation from section 6 on 2,500 MNIST digits
import numpy as np
from matplotlib import pyplot as plt

if __name__ == "__main__":
    X = np.loadtxt("data/data43965/mnist2500_X.txt")
    labels = np.loadtxt("data/data43965/labels.txt")
    Y = tsne(X, 2, 50, 20.0)

    plt.scatter(Y[:, 0], Y[:, 1], 20, labels)
    plt.show()

Preprocessing the data using PCA...
Computing pairwise distances...
Computing pair_prob for point 0 of 2500 ...
Computing pair_prob for point 500 of 2500 ...
Computing pair_prob for point 1000 of 2500 ...
Computing pair_prob for point 1500 of 2500 ...
Computing pair_prob for point 2000 of 2500 ...
Mean value of sigma:  2.386596621341598
Iteration  100 : error is  2.6004845459938735
Iteration  200 : error is  1.3510884241524586
Iteration  300 : error is  1.1628466510856448
Iteration  400 : error is  1.0959801071394555
Iteration  500 : error is  1.0622782112741476
Iteration  600 : error is  1.0426450964779017
Iteration  700 : error is  1.0300298579821003
Iteration  800 : error is  1.0213631772484943
Iteration  900 : error is  1.0150281009030835
Iteration  1000 : error is  1.0102834422886886
finished training!

To run the code, visit: https://aistudio.baidu.com/aistudio/projectdetail/628802?shared=1

If you found this useful, please like, save, and comment!
