

Deep learning: 11 (a PCA and whitening exercise on 2-D data)



  Preface:

  This post is an exercise in applying PCA, PCA whitening, and ZCA whitening to 2-D data. The dataset consists of 45 data points, each of which is 2-dimensional. The reference material is Exercise:PCA in 2D. Combined with the theory covered in the earlier post Deep learning: 10 (PCA and whitening), it should deepen your understanding of what PCA and whitening do.


  Some MATLAB functions:

  scatter:

  scatter(X, Y, S, C, type);
  S – controls marker size. A vector the same length as X and Y sets each point's size individually; a scalar (or omitting S) gives every point the same size.
  C – controls marker color. A vector the same length as X and Y maps each value linearly onto the current colormap; an m-by-3 matrix of RGB rows sets each point's color directly ([0 0 0] is black, [1 1 1] is white); omitting C gives every point the same color.
  type – marker style: pass 'filled' for filled markers; by default the markers are drawn as hollow circles.

  plot:

  plot can also draw line segments. For example, plot([1 2], [0 4]) draws the segment from (1, 0) to (2, 4); note that the coordinates pair up across the two input vectors, not within each one.


  實(shí)驗(yàn)過程:

  一、首先download這些二維數(shù)據(jù),因?yàn)閿?shù)據(jù)是以文本方式保存的,所以load的時(shí)候是以ascii碼讀入的。然后對(duì)輸入樣本進(jìn)行協(xié)方差矩陣計(jì)算,并計(jì)算出該矩陣的SVD分解,得到其特征值向量,在原數(shù)據(jù)點(diǎn)上畫出2條主方向,如下圖所示:

[figure: raw data with the two principal directions overlaid]
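For readers outside MATLAB, the covariance-and-SVD step can be sketched in NumPy. The 2 x 45 matrix below is random stand-in data, not the exercise's pcaData.txt:

```python
import numpy as np

# Hypothetical 2 x 45 data standing in for pcaData.txt (not the exercise's values).
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 45))
x[1] += 0.8 * x[0]                       # correlate the two dimensions
x -= x.mean(axis=1, keepdims=True)       # preprocessing: zero mean

n, m = x.shape
sigma = (x @ x.T) / m                    # 2 x 2 covariance matrix
u, s, _ = np.linalg.svd(sigma)           # columns of u are the principal directions

# u is orthonormal: the two principal directions are unit length and perpendicular.
print(np.allclose(u.T @ u, np.eye(n)))   # True
```

The columns of u here play the same role as the two lines drawn over the scatter plot in the MATLAB code.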

  二、將經(jīng)過PCA降維后的新數(shù)據(jù)在坐標(biāo)中顯示出來,如下圖所示:

[figure: xRot, the data rotated into the eigenbasis]
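A quick NumPy check (again on random stand-in data, not pcaData.txt) shows why the rotated scatter looks axis-aligned: projecting onto the eigenbasis diagonalizes the covariance.

```python
import numpy as np

# Hypothetical correlated 2 x 45 data (stand-in for pcaData.txt).
rng = np.random.default_rng(1)
x = rng.standard_normal((2, 45))
x[1] += 0.8 * x[0]
x -= x.mean(axis=1, keepdims=True)

m = x.shape[1]
sigma = (x @ x.T) / m
u, s, _ = np.linalg.svd(sigma)

xRot = u.T @ x                           # rotate the data into the eigenbasis

# After rotation the covariance is diagonal: the two coordinates are
# decorrelated, with the eigenvalues s on the diagonal.
cov_rot = (xRot @ xRot.T) / m
print(np.allclose(cov_rot, np.diag(s)))  # True
```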

  3. Reduce the rotated data to one dimension, then use it to reconstruct an approximation of the original data; the result is shown below:

[figure: xHat, the data reconstructed from one dimension]
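The reduce-then-reconstruct step can be sketched the same way (stand-in data again). The reconstruction collapses every point onto the first principal direction, and the mean squared error it leaves behind is exactly the discarded eigenvalue:

```python
import numpy as np

# Hypothetical correlated 2 x 45 data (stand-in for pcaData.txt).
rng = np.random.default_rng(2)
x = rng.standard_normal((2, 45))
x[1] += 0.8 * x[0]
x -= x.mean(axis=1, keepdims=True)

m = x.shape[1]
u, s, _ = np.linalg.svd((x @ x.T) / m)

k = 1                                    # keep only the first principal direction
xTilde = u[:, :k].T @ x                  # 1 x 45 reduced representation
xHat = u[:, :k] @ xTilde                 # project back into the original 2-D space

# The mean squared reconstruction error equals the eigenvalue we threw away.
err = np.sum((x - xHat) ** 2) / m
print(np.isclose(err, s[1]))             # True
```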

  4. The distribution of the data after PCA whitening:

[figure: the PCA-whitened data]

  5. The distribution of the data after ZCA whitening:

[figure: the ZCA-whitened data]

  PCA whitening and ZCA whitening both produce data with equal (unit) variance in every dimension; the difference lies in the axes of the result. PCA whitening leaves the data in the rotated eigenbasis, while ZCA whitening rotates it back into the original coordinate frame, so its output stays as close as possible to the input.
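This can be verified numerically (a NumPy sketch on random stand-in data, not pcaData.txt): both whitenings yield near-identity covariance, and ZCA is simply PCA whitening followed by rotating back with U.

```python
import numpy as np

# Hypothetical correlated 2 x 45 data (stand-in for pcaData.txt).
rng = np.random.default_rng(3)
x = rng.standard_normal((2, 45))
x[1] += 0.8 * x[0]
x -= x.mean(axis=1, keepdims=True)

m = x.shape[1]
u, s, _ = np.linalg.svd((x @ x.T) / m)

epsilon = 1e-5
scale = np.diag(1.0 / np.sqrt(s + epsilon))
xPCAWhite = scale @ u.T @ x              # whiten in the eigenbasis
xZCAWhite = u @ xPCAWhite                # rotate back to the original axes

# Both results have (approximately) identity covariance ...
print(np.allclose((xPCAWhite @ xPCAWhite.T) / m, np.eye(2), atol=1e-3))
print(np.allclose((xZCAWhite @ xZCAWhite.T) / m, np.eye(2), atol=1e-3))
# ... they differ only by the extra rotation u applied in the ZCA case.
```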


  實(shí)驗(yàn)代碼:

close all

%%================================================================
%% Step 0: Load data
%  We have provided the code to load data from pcaData.txt into x.
%  x is a 2 * 45 matrix, where the kth column x(:,k) corresponds to
%  the kth data point. You do not need to change the code below.

x = load('pcaData.txt', '-ascii');
figure(1);
scatter(x(1, :), x(2, :));
title('Raw data');

%%================================================================
%% Step 1a: Implement PCA to obtain U
%  Implement PCA to obtain the rotation matrix U, which is the
%  eigenbasis of sigma.

% -------------------- YOUR CODE HERE --------------------
[n, m] = size(x);
% x = x - repmat(mean(x, 2), 1, m);  % preprocessing: zero mean (the data is already centered)
sigma = (1.0 / m) * x * x';
[u, s, v] = svd(sigma);
% --------------------------------------------------------

hold on
plot([0 u(1,1)], [0 u(2,1)]);  % draw the first principal direction
plot([0 u(1,2)], [0 u(2,2)]);  % draw the second principal direction
scatter(x(1, :), x(2, :));
hold off

%%================================================================
%% Step 1b: Compute xRot, the projection on to the eigenbasis
%  Now, compute xRot by projecting the data on to the basis defined
%  by U. Visualize the points by performing a scatter plot.

% -------------------- YOUR CODE HERE --------------------
xRot = u' * x;
% --------------------------------------------------------

% Visualise the covariance matrix. You should see a line across the
% diagonal against a blue background.
figure(2);
scatter(xRot(1, :), xRot(2, :));
title('xRot');

%%================================================================
%% Step 2: Reduce the number of dimensions from 2 to 1
%  Compute xRot again (this time projecting to 1 dimension).
%  Then, compute xHat by projecting xRot back onto the original axes
%  to see the effect of dimension reduction.

% -------------------- YOUR CODE HERE --------------------
k = 1;  % use k = 1 and project the data onto the first eigenbasis
xHat = u * ([u(:,1), zeros(n, 1)]' * x);
% --------------------------------------------------------

figure(3);
scatter(xHat(1, :), xHat(2, :));
title('xHat');

%%================================================================
%% Step 3: PCA Whitening
%  Compute xPCAWhite and plot the results.

epsilon = 1e-5;
% -------------------- YOUR CODE HERE --------------------
xPCAWhite = diag(1 ./ sqrt(diag(s) + epsilon)) * u' * x;
% --------------------------------------------------------

figure(4);
scatter(xPCAWhite(1, :), xPCAWhite(2, :));
title('xPCAWhite');

%%================================================================
%% Step 4: ZCA Whitening
%  Compute xZCAWhite and plot the results.

% -------------------- YOUR CODE HERE --------------------
xZCAWhite = u * diag(1 ./ sqrt(diag(s) + epsilon)) * u' * x;
% --------------------------------------------------------

figure(5);
scatter(xZCAWhite(1, :), xZCAWhite(2, :));
title('xZCAWhite');

%% Congratulations! When you have reached this point, you are done!
%  You can now move onto the next PCA exercise. :)


  References:

  Exercise:PCA in 2D

  Deep learning: 10 (PCA and whitening)


Reposted from: https://www.cnblogs.com/tornadomeet/archive/2013/03/21/2973631.html
