This assignment implements K-means clustering and principal component analysis (PCA).
1 K-means Clustering
First we implement K-means clustering on a 2-D dataset; then we apply it to image compression, clustering an image with 256 colors down to an image with only 16 colors.
1.1 Implementing K-means
The steps to implement K-means clustering are:
- Randomly initialize K centroids.
- For a chosen number of iterations, run the following two inner loops (see the sketch after this list):
- Assign each example $x^{(i)}$ to the closest of the K centroids.
- Update each centroid's position to the mean of all examples $x^{(i)}$ assigned to it.
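Before implementing each piece, here is a minimal sketch of the outer loop, assuming the helper functions findClosestCentroids, computeCentroids, and kMeansInitCentroids implemented below (max_iters = 10 is an arbitrary illustrative choice):

% Minimal K-means outer loop (sketch).
max_iters = 10;
centroids = kMeansInitCentroids(X, K);           % step 1: random initialization
for iter = 1:max_iters
    idx = findClosestCentroids(X, centroids);    % assignment step
    centroids = computeCentroids(X, idx, K);     % update step
endfor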
1.1.1 Finding closest centroids
This part implements the third step above; we need a vector to record which cluster each example belongs to.
We implement this in findClosestCentroids.m:
findClosestCentroids.m
function idx = findClosestCentroids(X, centroids)
%FINDCLOSESTCENTROIDS computes the centroid memberships for every example
%   idx = FINDCLOSESTCENTROIDS (X, centroids) returns the closest centroids
%   in idx for a dataset X where each row is a single example. idx = m x 1
%   vector of centroid assignments (i.e. each entry in range [1..K])
%

% Set K
K = size(centroids, 1);

% You need to return the following variables correctly.
idx = zeros(size(X,1), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Go over every example, find its closest centroid, and store
%               the index inside idx at the appropriate location.
%               Concretely, idx(i) should contain the index of the centroid
%               closest to example i. Hence, it should be a value in the
%               range 1..K
%
% Note: You can use a for-loop over the examples to compute this.
%

m = size(X,1);
for i = 1:m
    bestdis = Inf;                       % smallest squared distance seen so far
    for j = 1:K
        tdis = X(i,:) - centroids(j,:);
        dis = tdis * tdis';              % squared Euclidean distance
        if (bestdis > dis)
            bestdis = dis;
            idx(i) = j;
        endif
    endfor
endfor

% =============================================================

end
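For reference, the same assignment step can be done without the double loop. This vectorized sketch is an alternative, not the form the exercise asks for:

% Vectorized alternative (sketch): dists(i,j) is the squared distance
% from example i to centroid j; idx takes the per-row argmin.
% Uses ||x - c||^2 = ||x||^2 + ||c||^2 - 2*x*c'.
dists = sum(X.^2, 2) + sum(centroids.^2, 2)' - 2 * X * centroids';
[~, idx] = min(dists, [], 2);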
1.1.2 Computing centroid means
This part implements the fourth step above: updating each centroid's position.
We implement it in computeCentroids.m:
computeCentroids.m
function centroids = computeCentroids(X, idx, K)
%COMPUTECENTROIDS returns the new centroids by computing the means of the
%data points assigned to each centroid.
%   centroids = COMPUTECENTROIDS(X, idx, K) returns the new centroids by
%   computing the means of the data points assigned to each centroid. It is
%   given a dataset X where each row is a single data point, a vector
%   idx of centroid assignments (i.e. each entry in range [1..K]) for each
%   example, and K, the number of centroids. You should return a matrix
%   centroids, where each row of centroids is the mean of the data points
%   assigned to it.
%

% Useful variables
[m n] = size(X);

% You need to return the following variables correctly.
centroids = zeros(K, n);

% ====================== YOUR CODE HERE ======================
% Instructions: Go over every centroid and compute mean of all points that
%               belong to it. Concretely, the row vector centroids(i, :)
%               should contain the mean of the data points assigned to
%               centroid i.
%
% Note: You can use a for-loop over the centroids to compute this.
%

% Accumulate the sum of the points in each cluster and the cluster sizes.
num = zeros(K, 1);
for i = 1:m
    num(idx(i)) += 1;
    centroids(idx(i),:) += X(i,:);
endfor

% Divide each sum by the cluster size to get the mean. Guard against
% empty clusters, which would otherwise produce NaN rows.
for i = 1:K
    if (num(i) > 0)
        centroids(i,:) /= num(i);
    endif
endfor

% =============================================================

end
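Alternatively, the per-cluster mean can be computed with logical indexing; a short sketch that loops over the K centroids instead of the m examples:

% Alternative (sketch): average the rows assigned to each centroid.
for k = 1:K
    members = (idx == k);                    % logical mask of cluster k
    if any(members)
        centroids(k,:) = mean(X(members, :), 1);
    endif
endfor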
1.2 K-means on example dataset
After implementing the two steps above, we can run K-means on the example dataset and trace how the centroids move; the exercise's figure shows these trajectories across iterations (figure omitted here).
1.3 Random initialization
This part implements the initialization of the centroids: we randomly select K examples from the training set and use their positions as the K initial centroids.
The exercise already provides the code for this random selection.
kMeansInitCentroids.m
function centroids = kMeansInitCentroids(X, K)
%KMEANSINITCENTROIDS This function initializes K centroids that are to be
%used in K-Means on the dataset X
%   centroids = KMEANSINITCENTROIDS(X, K) returns K initial centroids to be
%   used with the K-Means on the dataset X
%

% You should return these values correctly
centroids = zeros(K, size(X, 2));

% ====================== YOUR CODE HERE ======================
% Instructions: You should set centroids to randomly chosen examples from
%               the dataset X
%

% Permute the example indices, then take the first K: this picks K
% distinct examples at random.
randidx = randperm(size(X,1));
centroids = X(randidx(1:K),:);

% =============================================================

end
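A typical call sequence then looks like this (runkMeans is the driver provided with the exercise; K = 3 and max_iters = 10 match its demo settings, and the final true flag asks it to plot progress):

% Example usage (sketch) with the course-provided runkMeans driver.
K = 3;
max_iters = 10;
initial_centroids = kMeansInitCentroids(X, K);
[centroids, idx] = runkMeans(X, initial_centroids, max_iters, true);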
1.4 Image compression with K-means
In this part we use K-means to compress an image: each pixel's RGB value is treated as a 3-dimensional example, the pixels are clustered into 16 colors, and each pixel is then stored as the index of its centroid color.
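A sketch of the whole compression pipeline, following the exercise's setup (the filename bird_small.png and the choices K = 16, max_iters = 10 come from the exercise script; adjust them for your own copy):

% Image compression with K-means (sketch).
A = double(imread('bird_small.png')) / 255;    % scale RGB values to [0, 1]
img_size = size(A);
X = reshape(A, img_size(1) * img_size(2), 3);  % one row per pixel

K = 16;
max_iters = 10;
initial_centroids = kMeansInitCentroids(X, K);
[centroids, idx] = runkMeans(X, initial_centroids, max_iters);

% Replace each pixel by its centroid's color, then restore the image shape.
X_recovered = centroids(idx, :);
X_recovered = reshape(X_recovered, img_size(1), img_size(2), 3);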
The original and the compressed result appear side by side in the exercise's output figure (omitted here).
2 Principal Component Analysis
This part implements dimensionality reduction with PCA. We first reduce a 2-D dataset to 1-D, then apply PCA to a larger dataset.
2.1 Example Dataset
2.2 Implementing PCA
This part implements the PCA algorithm itself. Before running PCA, we must apply feature scaling and mean normalization to the data.
With the data normalized, we implement PCA in pca.m.
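A minimal sketch of that normalization (the exercise ships featureNormalize.m, which does essentially this):

% Mean normalization and feature scaling (sketch).
mu = mean(X);               % per-feature means, 1 x n
X_norm = X - mu;            % broadcasting subtracts mu from every row
sigma = std(X_norm);        % per-feature standard deviations
X_norm = X_norm ./ sigma;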
Next we compute the covariance matrix:
$$\Sigma = \frac{1}{m}X^TX$$
Then a single call to the svd function completes the implementation.
pca.m
function [U, S] = pca(X)
%PCA Run principal component analysis on the dataset X
%   [U, S, X] = pca(X) computes eigenvectors of the covariance matrix of X
%   Returns the eigenvectors U, the eigenvalues (on diagonal) in S
%

% Useful values
[m, n] = size(X);

% You need to return the following variables correctly.
U = zeros(n);
S = zeros(n);

% ====================== YOUR CODE HERE ======================
% Instructions: You should first compute the covariance matrix. Then, you
%               should use the "svd" function to compute the eigenvectors
%               and eigenvalues of the covariance matrix.
%
% Note: When computing the covariance matrix, remember to divide by m (the
%       number of examples).
%

Sigma = (1/m) * X' * X;    % covariance matrix of the normalized data
[U, S] = svd(Sigma);       % columns of U are the principal components

% =========================================================================

end
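A quick usage sketch; the diagonal of S also tells us how much variance the top K components retain (this check is an illustrative addition, not part of the assignment):

% Run PCA on the normalized data and inspect the retained variance (sketch).
[X_norm, mu, sigma] = featureNormalize(X);
[U, S] = pca(X_norm);

s = diag(S);                        % eigenvalues of Sigma
K = 1;
retained = sum(s(1:K)) / sum(s);    % fraction of variance kept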
2.3 Dimensionality Reduction with PCA
In this part we map each $x^{(i)}$ to a lower-dimensional $z^{(i)}$, performing the dimensionality reduction.
2.3.1 Projecting the data onto the principal components
projectData.m
function Z = projectData(X, U, K)
%PROJECTDATA Computes the reduced data representation when projecting only
%on to the top k eigenvectors
%   Z = projectData(X, U, K) computes the projection of
%   the normalized inputs X into the reduced dimensional space spanned by
%   the first K columns of U. It returns the projected examples in Z.
%

% You need to return the following variables correctly.
Z = zeros(size(X, 1), K);

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the projection of the data using only the top K
%               eigenvectors in U (first K columns).
%               For the i-th example X(i,:), the projection on to the k-th
%               eigenvector is given as follows:
%                    x = X(i, :)';
%                    projection_k = x' * U(:, k);
%

Z = X * U(:, 1:K);    % project every example onto the top K components

% =============================================================

end
2.3.2 Reconstructing an approximation of the data
recoverData.m
function X_rec = recoverData(Z, U, K)
%RECOVERDATA Recovers an approximation of the original data when using the
%projected data
%   X_rec = RECOVERDATA(Z, U, K) recovers an approximation of the
%   original data that has been reduced to K dimensions. It returns the
%   approximate reconstruction in X_rec.
%

% You need to return the following variables correctly.
X_rec = zeros(size(Z, 1), size(U, 1));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the approximation of the data by projecting back
%               onto the original space using the top K eigenvectors in U.
%
%               For the i-th example Z(i,:), the (approximate)
%               recovered data for dimension j is given as follows:
%                    v = Z(i, :)';
%                    recovered_j = v' * U(j, 1:K)';
%
%               Notice that U(j, 1:K) is a row vector.
%

X_rec = Z * U(:, 1:K)';    % map back to the original n-dimensional space

% =============================================================

end
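Putting projection and reconstruction together, a roundtrip on the normalized data looks like this (K = 1 matches the 2-D example; the error line is an illustrative addition):

% Project to K dimensions, then reconstruct the approximation (sketch).
K = 1;
Z = projectData(X_norm, U, K);
X_rec = recoverData(Z, U, K);

% Average squared reconstruction error: the information PCA discarded.
err = mean(sum((X_norm - X_rec).^2, 2));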
These are the parts of this assignment that we implement by hand.