
Machine Learning week 8 quiz: Principal Component Analysis


Principal Component Analysis

5 questions

1.

Consider the following 2D dataset:

Which of the following figures correspond to possible values that PCA may return for $u^{(1)}$ (the first eigenvector / first principal component)? Check all that apply (you may have to check more than one figure).

2.

Which of the following is a reasonable way to select the number of principal components $k$?

(Recall that $n$ is the dimensionality of the input data and $m$ is the number of input examples.)

Choose $k$ to be 99% of $n$ (i.e., $k = 0.99n$, rounded to the nearest integer).

Choose $k$ to be the smallest value so that at least 1% of the variance is retained.

Choose the value of $k$ that minimizes the approximation error $\frac{1}{m}\sum_{i=1}^{m} \lVert x^{(i)} - x^{(i)}_{\mathrm{approx}} \rVert^2$.

Choose $k$ to be the smallest value so that at least 99% of the variance is retained.
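
Not part of the original quiz, but for reference: the "variance retained" criterion is usually evaluated from the singular values of the data's covariance matrix. Below is a minimal NumPy sketch of choosing the smallest $k$ that retains a given fraction of the variance; the helper name `choose_k` is illustrative, and `X` is assumed to be an already mean-normalized $m \times n$ data matrix.

```python
import numpy as np

def choose_k(X, variance_to_retain=0.99):
    """Smallest k such that the top-k components retain the given
    fraction of the variance. X: (m, n), already mean-normalized."""
    m = X.shape[0]
    Sigma = (X.T @ X) / m                 # n x n covariance matrix
    _, S, _ = np.linalg.svd(Sigma)        # singular values, largest first
    retained = np.cumsum(S) / np.sum(S)   # cumulative fraction of variance
    # first index where the cumulative fraction reaches the threshold
    return int(np.searchsorted(retained, variance_to_retain) + 1)
```

With this definition, `choose_k(X, 0.99)` returns the smallest $k$ retaining at least 99% of the variance.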

3.

Suppose someone tells you that they ran PCA in such a way that "95% of the variance was retained." What is an equivalent statement to this?

$\dfrac{\frac{1}{m}\sum_{i=1}^{m}\lVert x^{(i)} - x^{(i)}_{\mathrm{approx}}\rVert^2}{\frac{1}{m}\sum_{i=1}^{m}\lVert x^{(i)}\rVert^2} \geq 0.95$

$\dfrac{\frac{1}{m}\sum_{i=1}^{m}\lVert x^{(i)} - x^{(i)}_{\mathrm{approx}}\rVert^2}{\frac{1}{m}\sum_{i=1}^{m}\lVert x^{(i)}\rVert^2} \leq 0.05$

$\dfrac{\frac{1}{m}\sum_{i=1}^{m}\lVert x^{(i)} - x^{(i)}_{\mathrm{approx}}\rVert^2}{\frac{1}{m}\sum_{i=1}^{m}\lVert x^{(i)}\rVert^2} \geq 0.05$

$\dfrac{\frac{1}{m}\sum_{i=1}^{m}\lVert x^{(i)} - x^{(i)}_{\mathrm{approx}}\rVert^2}{\frac{1}{m}\sum_{i=1}^{m}\lVert x^{(i)}\rVert^2} \leq 0.95$
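
Added note (not in the original quiz): in the course's notation, "retaining a fraction of the variance" relates the average squared projection error to the total variation in the data, and can equivalently be read off the singular values $S_{ii}$ of the covariance matrix:

$$\frac{\frac{1}{m}\sum_{i=1}^{m}\lVert x^{(i)} - x^{(i)}_{\mathrm{approx}}\rVert^2}{\frac{1}{m}\sum_{i=1}^{m}\lVert x^{(i)}\rVert^2} \;=\; 1 - \frac{\sum_{i=1}^{k} S_{ii}}{\sum_{i=1}^{n} S_{ii}},$$

so "95% of the variance retained" means this ratio is at most $0.05$.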

4.

Which of the following statements are true? Check all that apply.

Feature scaling is not useful for PCA, since the eigenvector calculation (such as using Octave's svd(Sigma) routine) takes care of this automatically.

Given an input $x \in \mathbb{R}^n$, PCA compresses it to a lower-dimensional vector $z \in \mathbb{R}^k$.

PCA can be used only to reduce the dimensionality of data by 1 (such as 3D to 2D, or 2D to 1D).

If the input features are on very different scales, it is a good idea to perform feature scaling before applying PCA.
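
For reference (not in the original post), here is a minimal NumPy sketch of the pipeline the true statements above describe: scale the features, compute the principal components, and compress each $x \in \mathbb{R}^n$ to $z \in \mathbb{R}^k$. The helper name `pca_project` and its return values are illustrative.

```python
import numpy as np

def pca_project(X, k):
    """Feature-scale X (m x n), run PCA, and project onto the top-k
    components, returning the compressed scores Z (m x k)."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)                 # assumed nonzero for every feature
    X_norm = (X - mu) / sigma             # mean normalization + feature scaling
    Cov = (X_norm.T @ X_norm) / X.shape[0]
    U, S, _ = np.linalg.svd(Cov)          # columns of U are the principal directions
    U_reduce = U[:, :k]                   # n x k
    Z = X_norm @ U_reduce                 # compressed representation (m x k)
    X_approx = Z @ U_reduce.T             # reconstruction in the scaled space
    return Z, X_approx
```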

5.

Which of the following are recommended applications of PCA? Select all that apply.

Data visualization: Reduce data to 2D (or 3D) so that it can be plotted.

To get more features to feed into a learning algorithm.

Clustering: To automatically group examples into coherent groups.

Data compression: Reduce the dimension of your input data $x^{(i)}$, which will be used in a supervised learning algorithm (i.e., use PCA so that your supervised learning algorithm runs faster).
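
As an illustrative example (not part of the original quiz), the visualization use case above can be sketched with scikit-learn and matplotlib, assuming both are installed; the Iris data merely stands in for your own dataset.

```python
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = load_iris().data                              # (150, 4) example data
X_scaled = StandardScaler().fit_transform(X)      # feature scaling first
Z = PCA(n_components=2).fit_transform(X_scaled)   # reduce to 2D for plotting

plt.scatter(Z[:, 0], Z[:, 1], s=12)
plt.xlabel("first principal component")
plt.ylabel("second principal component")
plt.title("Iris data projected onto its first two principal components")
plt.show()
```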
