Week 8-2: Principal Component Analysis

Question 1

Consider the following 2D dataset:

[Figure 8-1: scatter plot of the 2D dataset]

Which of the following figures correspond to possible values that PCA may return for $u^{(1)}$ (the first eigenvector / first principal component)? Check all that apply (you may have to check more than one figure).

  • [Figure 8-2]
  • [Figure 8-3]
  • [Figure 8-4]
  • [Figure 8-5]

* Answer: 1, 2 *
* PCA minimizes the projection error: we want the direction onto which the projected distances are smallest, which is what figures 1 and 2 show. Either the positive or the negative direction is acceptable (see the sketch below). *
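As a sanity check, here is a minimal NumPy sketch (all data hypothetical) showing why both signs are valid answers: $u^{(1)}$ and $-u^{(1)}$ span the same line, so they give identical projection error.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 2D dataset elongated along the y = x direction
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = rng.normal(size=(100, 2)) * np.array([3.0, 0.5]) @ R.T

X = X - X.mean(axis=0)                     # mean normalization
_, _, Vt = np.linalg.svd(X, full_matrices=False)
u1 = Vt[0]                                 # first principal component (unit vector)

# The projection error is the same for u1 and -u1, so PCA may return either sign.
proj_err = lambda u: np.sum((X - np.outer(X @ u, u)) ** 2)
print(u1, proj_err(u1), proj_err(-u1))     # identical errors
```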


Question 2

Which of the following is a reasonable way to select the number of principal components $k$?
(Recall that $n$ is the dimensionality of the input data and $m$ is the number of input examples.)

  • Choose $k$ to be the smallest value so that at least 99% of the variance is retained.
  • Choose the value of $k$ that minimizes the approximation error $\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)} - x_{\mathrm{approx}}^{(i)}\|^2$.
  • Choose $k$ to be the smallest value so that at least 1% of the variance is retained.
  • Choose $k$ to be 99% of $n$ (i.e., $k = 0.99n$, rounded to the nearest integer).
* Answer: 1 *
* Option 1 is correct: pick the smallest $k$ for which at least 99% of the variance is retained (see the sketch below). Minimizing the approximation error would always select $k = n$, since the error is zero there, which defeats the purpose of dimensionality reduction. *
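A sketch of this selection rule using the singular values of the mean-normalized data (the data here is hypothetical; the squared singular values are proportional to the variance captured per component):

```python
import numpy as np

def choose_k(X, retain=0.99):
    """Smallest k retaining at least `retain` of the variance."""
    X = X - X.mean(axis=0)                 # mean normalization
    _, S, _ = np.linalg.svd(X, full_matrices=False)
    var = S ** 2                           # variance captured per component
    cum = np.cumsum(var) / var.sum()       # cumulative fraction of variance
    return int(np.searchsorted(cum, retain) + 1)

X = np.random.default_rng(1).normal(size=(200, 10)) * np.arange(1, 11)
print(choose_k(X))                         # smallest k with >= 99% variance retained
```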


Question 3

Suppose someone tells you that they ran PCA in such a way that "95% of the variance was retained." What is an equivalent statement to this?

  • $\frac{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)}\|^2}{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)} - x_{\mathrm{approx}}^{(i)}\|^2} \geq 0.05$
  • $\frac{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)}\|^2}{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)} - x_{\mathrm{approx}}^{(i)}\|^2} \leq 0.05$
  • $\frac{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)} - x_{\mathrm{approx}}^{(i)}\|^2}{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)}\|^2} \leq 0.05$
  • $\frac{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)}\|^2}{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)} - x_{\mathrm{approx}}^{(i)}\|^2} \leq 0.95$

* Answer: 3 *
* "95% of the variance is retained" means the average squared projection error is at most 5% of the total variation in the data, which is exactly option 3 (verified numerically below). *
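A numeric sanity check (hypothetical data) that the ratio in option 3 equals one minus the fraction of variance retained, so retaining 95% of the variance is the same as the ratio being at most 0.05:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 6)) * np.array([5, 3, 2, 1, 0.5, 0.1])
X = X - X.mean(axis=0)

_, S, Vt = np.linalg.svd(X, full_matrices=False)
k = 3
Z = X @ Vt[:k].T                      # project onto the first k components
X_approx = Z @ Vt[:k]                 # reconstruct from the projection

ratio = np.sum((X - X_approx) ** 2) / np.sum(X ** 2)   # option 3's quantity
retained = np.sum(S[:k] ** 2) / np.sum(S ** 2)         # fraction of variance retained
print(ratio, 1 - retained)            # equal up to floating-point error
```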


Question 4

Which of the following statements are true? Check all that apply.

  • Given only $z^{(i)}$ and $U_{\mathrm{reduce}}$, there is no way to reconstruct any reasonable approximation to $x^{(i)}$.
  • Even if all the input features are on very similar scales, we should still perform mean normalization (so that each feature has zero mean) before running PCA.
  • PCA is susceptible to local optima; trying multiple random initializations may help.
  • Given input data $x \in \mathbb{R}^n$, it makes sense to run PCA only with values of $k$ that satisfy $k \leq n$. (In particular, running it with $k = n$ is possible but not helpful, and $k > n$ does not make sense.)

* Answer: 2, 4 *
* Option 1 is false: $x_{\mathrm{approx}}^{(i)} = U_{\mathrm{reduce}}\, z^{(i)}$ reconstructs a reasonable approximation (see the sketch below). Option 3 is false: PCA is solved exactly via eigendecomposition/SVD, so there are no local optima. *
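A sketch of the reconstruction that option 1 claims is impossible, using the course's $U_{\mathrm{reduce}}$/$z$ notation on hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 5)) * np.array([4, 2, 1, 0.2, 0.1])
Xn = X - X.mean(axis=0)                 # mean normalization (true even at similar scales)

_, _, Vt = np.linalg.svd(Xn, full_matrices=False)
k = 2
Ureduce = Vt[:k].T                      # n x k matrix of the top k components

Z = Xn @ Ureduce                        # z(i): compressed k-dimensional representation
Xn_approx = Z @ Ureduce.T               # x_approx(i) = Ureduce @ z(i), stacked as rows
print(np.mean((Xn - Xn_approx) ** 2))   # small: a reasonable reconstruction exists
```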


Question 5

Which of the following are recommended applications of PCA? Select all that apply.

  • As a replacement for (or alternative to) linear regression: For most learning applications, PCA and linear regression give substantially similar results.
  • Data compression: Reduce the dimension of your data, so that it takes up less memory / disk space.
  • Data visualization: To take 2D data, and find a different way of plotting it in 2D (using $k = 2$).
  • Data compression: Reduce the dimension of your input data $x^{(i)}$, which will be used in a supervised learning algorithm (i.e., use PCA so that your supervised learning algorithm runs faster).

* Answer: 2, 4 *
* Options 1 and 3 are not recommended: PCA is not a substitute for linear regression (it ignores the labels $y$), and visualization with $k = 2$ only makes sense when the original data has more than 2 dimensions. When using PCA to speed up supervised learning (option 4), fit the mapping on the training set only (see the sketch below). *
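A minimal sketch of option 4's workflow, assuming the usual convention that the PCA mapping is learned from the training set and then reused unchanged on cross-validation and test data (function names here are illustrative):

```python
import numpy as np

def fit_pca(X_train, k):
    """Learn the PCA mapping (mean and Ureduce) on the training set only."""
    mu = X_train.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
    return mu, Vt[:k].T

def apply_pca(X, mu, Ureduce):
    """Reuse the same mapping for cross-validation / test data."""
    return (X - mu) @ Ureduce

rng = np.random.default_rng(4)
X_train, X_test = rng.normal(size=(80, 20)), rng.normal(size=(20, 20))
mu, Ureduce = fit_pca(X_train, k=5)
Z_train = apply_pca(X_train, mu, Ureduce)   # lower-dimensional training inputs
Z_test = apply_pca(X_test, mu, Ureduce)     # same mapping, never refit on test data
print(Z_train.shape, Z_test.shape)          # (80, 5) (20, 5)
```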
