Coursera Machine Learning Week 8 Programming Assignment: K-means Clustering and Principal Component Analysis

In this assignment we implement K-means clustering and principal component analysis (PCA).

1 K-means Clustering

First we implement K-means clustering on a 2D dataset; then we apply it to image compression, clustering the colors of an image down to a palette of only 16 colors.

1.1 Implementing K-means

The K-means algorithm proceeds as follows:

  1. Randomly initialize K centroids.
  2. For a fixed number of iterations, run the two inner steps below on each iteration (the full loop is sketched after this list).
  3. Assignment: assign each example $x^{(i)}$ to the cluster of the centroid closest to it among the K centroids.
  4. Update: move each centroid to the mean of all examples $x^{(i)}$ assigned to it.
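Putting these steps together, the outer loop looks roughly like this (a minimal sketch; max_iters is an assumed iteration count, and the runkMeans.m driver that ships with the exercise follows the same structure):

max_iters = 10;                              % assumed iteration budget
centroids = kMeansInitCentroids(X, K);       % step 1: random initialization
for iter = 1:max_iters
  idx = findClosestCentroids(X, centroids);  % step 3: cluster assignment
  centroids = computeCentroids(X, idx, K);   % step 4: centroid update
end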

1.1.1 Finding closest centroids

This part implements step 3 above: we use a vector idx to record which cluster each example belongs to.
We implement this in findClosestCentroids.m.

findClosestCentroids.m

function idx = findClosestCentroids(X, centroids)
%FINDCLOSESTCENTROIDS computes the centroid memberships for every example
%   idx = FINDCLOSESTCENTROIDS (X, centroids) returns the closest centroids
%   in idx for a dataset X where each row is a single example. idx = m x 1 
%   vector of centroid assignments (i.e. each entry in range [1..K])
%

% Set K
K = size(centroids, 1);

% You need to return the following variables correctly.
idx = zeros(size(X,1), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Go over every example, find its closest centroid, and store
%               the index inside idx at the appropriate location.
%               Concretely, idx(i) should contain the index of the centroid
%               closest to example i. Hence, it should be a value in the 
%               range 1..K
%
% Note: You can use a for-loop over the examples to compute this.
%
m = size(X, 1);
for i = 1:m
  bestdis = Inf;                  % smallest squared distance found so far
  for j = 1:K
    tdis = X(i,:) - centroids(j,:);
    dis = tdis * tdis';           % squared Euclidean distance to centroid j
    if (bestdis > dis)
      bestdis = dis;
      idx(i) = j;
    end
  end
end

% =============================================================

end
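As an aside, the assignment step can also be computed without any explicit loop. A vectorized sketch, equivalent to the loops above (bsxfun keeps it compatible with older MATLAB/Octave releases):

% Pairwise squared distances ||x||^2 + ||c||^2 - 2*x*c' for all (x, c) pairs
D = bsxfun(@plus, sum(X.^2, 2), sum(centroids.^2, 2)') - 2 * X * centroids';
[~, idx] = min(D, [], 2);   % index of the nearest centroid for each example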


1.1.2 Computing centroid means

This part implements step 4 above: updating the centroid positions.
We implement it in computeCentroids.m.

computeCentroids.m

function centroids = computeCentroids(X, idx, K)
%COMPUTECENTROIDS returns the new centroids by computing the means of the 
%data points assigned to each centroid.
%   centroids = COMPUTECENTROIDS(X, idx, K) returns the new centroids by 
%   computing the means of the data points assigned to each centroid. It is
%   given a dataset X where each row is a single data point, a vector
%   idx of centroid assignments (i.e. each entry in range [1..K]) for each
%   example, and K, the number of centroids. You should return a matrix
%   centroids, where each row of centroids is the mean of the data points
%   assigned to it.
%

% Useful variables
[m n] = size(X);

% You need to return the following variables correctly.
centroids = zeros(K, n);

% ====================== YOUR CODE HERE ======================
% Instructions: Go over every centroid and compute mean of all points that
%               belong to it. Concretely, the row vector centroids(i, :)
%               should contain the mean of the data points assigned to
%               centroid i.
%
% Note: You can use a for-loop over the centroids to compute this.
%

% Accumulate, for each centroid, the sum and the count of its assigned examples
num = zeros(K, 1);
for i = 1:m
  num(idx(i)) = num(idx(i)) + 1;
  centroids(idx(i),:) = centroids(idx(i),:) + X(i,:);
end
% Divide each sum by its count to get the mean (assumes no cluster is empty)
for i = 1:K
  centroids(i,:) = centroids(i,:) / num(i);
end

% =============================================================

end
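An equivalent sketch that loops over the centroids instead of the examples, using logical indexing (like the version above, it assumes every centroid has at least one assigned example):

for k = 1:K
  members = (idx == k);                      % examples assigned to centroid k
  centroids(k, :) = mean(X(members, :), 1);  % mean of those rows
end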

1.2 K-means on example dataset

With the two functions above in place, we can run K-means on the example dataset and trace how the centroids move.
[Figure: K-means iterations on the example dataset, showing the trajectory of each centroid.]

1.3 Random initialization

This part implements the choice of the initial centroids: we randomly sample K examples from the training set and use their positions as the K initial centroids.
The exercise already provides the code for this random selection.

kMeansInitCentroids.m

function centroids = kMeansInitCentroids(X, K)
%KMEANSINITCENTROIDS This function initializes K centroids that are to be 
%used in K-Means on the dataset X
%   centroids = KMEANSINITCENTROIDS(X, K) returns K initial centroids to be
%   used with the K-Means on the dataset X
%

% You should return this values correctly
centroids = zeros(K, size(X, 2));

% ====================== YOUR CODE HERE ======================
% Instructions: You should set centroids to randomly chosen examples from
%               the dataset X
%
% Randomly permute the example indices, then take the first K examples
randidx = randperm(size(X, 1));
centroids = X(randidx(1:K), :);

% =============================================================

end
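A minimal usage sketch tying initialization to the main loop (runkMeans is the driver provided with the ex7 starter code; its signature here, including the final plot-progress flag, is assumed from that code):

K = 3;
max_iters = 10;
initial_centroids = kMeansInitCentroids(X, K);
[centroids, idx] = runkMeans(X, initial_centroids, max_iters, true);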


1.4 Image compression with K-means

In this part we apply the K-means algorithm to image compression. The result looks like this:
[Figure: the original image next to its compressed, 16-color reconstruction.]
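For reference, a condensed sketch of the compression pipeline, following the structure of the course's ex7.m script (the file name bird_small.png and the runkMeans driver come from the starter code):

A = double(imread('bird_small.png')) / 255;    % RGB values scaled to [0, 1]
img_size = size(A);
X = reshape(A, img_size(1) * img_size(2), 3);  % one row per pixel
K = 16; max_iters = 10;
initial_centroids = kMeansInitCentroids(X, K);
[centroids, idx] = runkMeans(X, initial_centroids, max_iters);
% Replace every pixel by the centroid of its cluster, then restore the shape
idx = findClosestCentroids(X, centroids);
X_recovered = reshape(centroids(idx, :), img_size(1), img_size(2), 3);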

2 Principal Component Analysis

This part implements dimensionality reduction with PCA. We first reduce a 2D dataset to 1D, and then apply the same procedure to a larger dataset.

2.1 Example Dataset

[Figure: the 2D example dataset used to develop PCA.]

2.2 Implementing PCA

In this part we implement the PCA algorithm itself. Before running it, we must apply feature scaling and mean normalization to the data.
With the data normalized, we implement PCA in pca.m.
We first compute the covariance matrix:

$$\Sigma = \frac{1}{m} X^T X$$

A single call to the svd function then gives us the principal components.
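As a quick sketch of that ordering (featureNormalize.m is a helper provided with the exercise that subtracts each feature's mean and divides by its standard deviation):

[X_norm, mu, sigma] = featureNormalize(X);  % mean normalization + scaling
[U, S] = pca(X_norm);                       % PCA on the normalized data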

pca.m

function [U, S] = pca(X)
%PCA Run principal component analysis on the dataset X
%   [U, S, X] = pca(X) computes eigenvectors of the covariance matrix of X
%   Returns the eigenvectors U, the eigenvalues (on diagonal) in S
%

% Useful values
[m, n] = size(X);

% You need to return the following variables correctly.
U = zeros(n);
S = zeros(n);

% ====================== YOUR CODE HERE ======================
% Instructions: You should first compute the covariance matrix. Then, you
%               should use the "svd" function to compute the eigenvectors
%               and eigenvalues of the covariance matrix. 
%
% Note: When computing the covariance matrix, remember to divide by m (the
%       number of examples).
%
Sigma = (1 / m) * X' * X;   % covariance matrix of the (normalized) inputs
[U, S] = svd(Sigma);        % columns of U are the principal components

% =========================================================================

end

2.3 Dimensionality Reduction with PCA

In this part we project each example $x^{(i)}$ down to a lower-dimensional $z^{(i)}$, performing the actual dimensionality reduction.

2.3.1 Projecting the data onto the principal components

projectData.m

function Z = projectData(X, U, K)
%PROJECTDATA Computes the reduced data representation when projecting only 
%on to the top k eigenvectors
%   Z = projectData(X, U, K) computes the projection of 
%   the normalized inputs X into the reduced dimensional space spanned by
%   the first K columns of U. It returns the projected examples in Z.
%

% You need to return the following variables correctly.
Z = zeros(size(X, 1), K);

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the projection of the data using only the top K 
%               eigenvectors in U (first K columns). 
%               For the i-th example X(i,:), the projection on to the k-th 
%               eigenvector is given as follows:
%                    x = X(i, :)';
%                    projection_k = x' * U(:, k);
%

% Project every example onto the first K principal components at once
Z = X * U(:, 1:K);

% =============================================================

end
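For example, reducing the normalized 2D dataset down to a single dimension (X_norm and U come from the normalization and pca steps above):

K = 1;
Z = projectData(X_norm, U, K);   % each example becomes a single number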

2.3.2 Reconstructing an approximation of the data

recoverData.m

function X_rec = recoverData(Z, U, K)
%RECOVERDATA Recovers an approximation of the original data when using the 
%projected data
%   X_rec = RECOVERDATA(Z, U, K) recovers an approximation the 
%   original data that has been reduced to K dimensions. It returns the
%   approximate reconstruction in X_rec.
%

% You need to return the following variables correctly.
X_rec = zeros(size(Z, 1), size(U, 1));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the approximation of the data by projecting back
%               onto the original space using the top K eigenvectors in U.
%
%               For the i-th example Z(i,:), the (approximate)
%               recovered data for dimension j is given as follows:
%                    v = Z(i, :)';
%                    recovered_j = v' * U(j, 1:K)';
%
%               Notice that U(j, 1:K) is a row vector.
%               

% Map each projected example back into the original n-dimensional space
X_rec = Z * U(:, 1:K)';

% =============================================================

end
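Projecting back gives the closest approximation of the original (normalized) data that lies in the K-dimensional subspace:

X_rec = recoverData(Z, U, K);   % each row approximates the matching row of X_norm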

That covers everything this assignment asks us to implement by hand.
