
Andrew Ng Machine Learning Ex1: Multivariate Regression Part

Published: 2025/4/5
This article on the multivariate-regression part of Andrew Ng's Machine Learning Ex1 was collected and organized by 生活随笔 and is shared here for reference.

Multivariate Linear Regression
Submission status:

Background: predicting house prices.
Dataset: house size, number of bedrooms, and price.

```
Loading data ...
First 10 examples from the dataset:
 x = [2104 3], y = 399900
 x = [1600 3], y = 329900
 x = [2400 3], y = 369000
 x = [1416 2], y = 232000
 x = [3000 4], y = 539900
 x = [1985 4], y = 299900
 x = [1534 3], y = 314900
 x = [1427 3], y = 198999
 x = [1380 3], y = 212000
 x = [1494 3], y = 242500
Program paused. Press enter to continue.
```

Feature Normalization

Subtract the mean of each feature, then divide by its standard deviation.
MATLAB provides functions for both: `mean` for the mean and `std` for the standard deviation.
featureNormalize.m:

```matlab
function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X
%   FEATURENORMALIZE(X) returns a normalized version of X where
%   the mean value of each feature is 0 and the standard deviation
%   is 1. This is often a good preprocessing step to do when
%   working with learning algorithms.

% You need to set these values correctly
X_norm = X;  % house size, number of bedrooms
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));

% ====================== YOUR CODE HERE ======================
% Instructions: First, for each feature dimension, compute the mean
%               of the feature and subtract it from the dataset,
%               storing the mean value in mu. Next, compute the
%               standard deviation of each feature and divide
%               each feature by its standard deviation, storing
%               the standard deviation in sigma.
%
%               Note that X is a matrix where each column is a
%               feature and each row is an example. You need
%               to perform the normalization separately for
%               each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.
%
mu = mean(X);
sigma = std(X);
X_norm = (X - mu) ./ sigma;
% ============================================================

end
```
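For readers working outside MATLAB, the same normalization can be sketched in NumPy (the function name `feature_normalize` is my own; `ddof=1` is used so the standard deviation matches MATLAB's sample `std`):

```python
import numpy as np

def feature_normalize(X):
    """Return (X_norm, mu, sigma): each column centered to mean 0, scaled to std 1."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0, ddof=1)  # ddof=1 = sample std, matching MATLAB's std
    return (X - mu) / sigma, mu, sigma

# First four examples from the dataset above: (size, bedrooms)
X = np.array([[2104.0, 3.0], [1600.0, 3.0], [2400.0, 3.0], [1416.0, 2.0]])
X_norm, mu, sigma = feature_normalize(X)
print(mu)  # column means: [1880., 2.75]
```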

Cost Function for Multivariate Linear Regression

In matrix form:

$$J(\theta)=\frac{1}{2m}(X\theta-y)^T(X\theta-y)$$

where
- $X$ is a matrix of dimension $m \times (n+1)$,
- $\theta$ is a vector of dimension $(n+1) \times 1$,
- $y$ is a column vector of dimension $m \times 1$.

computeCostMulti.m:

```matlab
function J = computeCostMulti(X, y, theta)
%COMPUTECOSTMULTI Compute cost for linear regression with multiple variables
%   J = COMPUTECOSTMULTI(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

J = 1/(2*m) * (X*theta - y)' * (X*theta - y);

% =========================================================================

end
```
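The vectorized cost translates almost one-to-one into NumPy; this is a minimal sketch (the function name `compute_cost_multi` and the toy data are my own, chosen so the expected costs are easy to verify by hand):

```python
import numpy as np

def compute_cost_multi(X, y, theta):
    """Vectorized cost: J = 1/(2m) * (X theta - y)^T (X theta - y)."""
    m = len(y)
    residual = X @ theta - y
    return float(residual @ residual) / (2 * m)

# Quick check on two examples generated from y = 2*x
X = np.array([[1.0, 2.0], [1.0, 3.0]])  # intercept column + one feature
y = np.array([4.0, 6.0])
print(compute_cost_multi(X, y, np.zeros(2)))           # (4^2 + 6^2) / (2*2) = 13.0
print(compute_cost_multi(X, y, np.array([0.0, 2.0])))  # perfect fit -> 0.0
```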

Gradient for Multivariate Linear Regression

Gradient-descent update for $\theta$ (with learning rate $\alpha$):

$$\theta := \theta - \frac{\alpha}{m}X^T(X\theta-y)$$

The code below implements this formula.
gradientDescentMulti.m:

```matlab
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
%   theta = GRADIENTDESCENTMULTI(x, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCostMulti) and gradient here.
    %
    theta = theta - alpha/m * X' * (X*theta - y);
    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCostMulti(X, y, theta);

end

end
```
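The update loop can be sketched in NumPy as well. This is an illustrative translation, not the course code: the function name `gradient_descent_multi` and the toy dataset (points on $y = 2x$, so the exact solution is $\theta = [0, 2]$) are my own choices.

```python
import numpy as np

def gradient_descent_multi(X, y, theta, alpha, num_iters):
    """Repeat theta := theta - (alpha/m) * X^T (X theta - y), recording the cost."""
    m = len(y)
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        theta = theta - (alpha / m) * (X.T @ (X @ theta - y))
        residual = X @ theta - y
        J_history[it] = float(residual @ residual) / (2 * m)  # cost after the step
    return theta, J_history

# Toy data generated from y = 2*x, so the exact solution is theta = [0, 2]
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # intercept column + feature
y = np.array([2.0, 4.0, 6.0])
theta, J_history = gradient_descent_multi(X, y, np.zeros(2), alpha=0.1, num_iters=5000)
print(theta)  # approaches [0, 2]
```

Plotting `J_history` against the iteration number gives exactly the convergence graphs shown below.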

Choosing the learning rate and number of iterations (the values below are fairly reasonable choices):

```matlab
% Choose some alpha value
alpha = 0.09;
num_iters = 50;
```

The resulting convergence plot (figure not preserved).

```matlab
% Choose some alpha value
alpha = 0.12;
num_iters = 50;
```

The resulting convergence plot (figure not preserved).

```matlab
% Choose some alpha value
alpha = 0.15;
num_iters = 50;
```

The resulting convergence plot (figure not preserved).

Part 2: complete code

```matlab
%% ================ Part 2: Gradient Descent ================

% ====================== YOUR CODE HERE ======================
% Instructions: We have provided you with the following starter
%               code that runs gradient descent with a particular
%               learning rate (alpha).
%
%               Your task is to first make sure that your functions -
%               computeCost and gradientDescent already work with
%               this starter code and support multiple variables.
%
%               After that, try running gradient descent with
%               different values of alpha and see which one gives
%               you the best result.
%
%               Finally, you should complete the code at the end
%               to predict the price of a 1650 sq-ft, 3 br house.
%
% Hint: By using the 'hold on' command, you can plot multiple
%       graphs on the same figure.
%
% Hint: At prediction, make sure you do the same feature normalization.
%

fprintf('Running gradient descent ...\n');

% Choose some alpha value
alpha = 0.12;
num_iters = 50;

% Init Theta and Run Gradient Descent
theta = zeros(3, 1);
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);

% Plot the convergence graph
figure;
hold on;
plot(1:numel(J_history), J_history, '-g', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');

% Display gradient descent's result
fprintf('Theta computed from gradient descent: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
% Recall that the first column of X is all-ones. Thus, it does
% not need to be normalized.
price = 0; % You should change this
price = [1 ([1650 3] - mu) ./ sigma] * theta;

% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using gradient descent):\n $%f\n'], price);

fprintf('Program paused. Press enter to continue.\n');
pause;
```

Prediction code using gradient descent:

```matlab
price = [1 ([1650 3] - mu) ./ sigma] * theta;
```

The result:

```
Running gradient descent ...
Theta computed from gradient descent:
 339842.312379
 106499.134141
 -2521.749858
Predicted price of a 1650 sq-ft, 3 br house (using gradient descent):
 $293411.151468
```

Using the Normal Equation

The normal equation solves for $\theta$ in closed form:

$$\theta=(X^TX)^{-1}X^Ty$$

```matlab
function [theta] = normalEqn(X, y)
%NORMALEQN Computes the closed-form solution to linear regression
%   NORMALEQN(X,y) computes the closed-form solution to linear
%   regression using the normal equations.

theta = zeros(size(X, 2), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the code to compute the closed form solution
%               to linear regression and put the result in theta.
%

% ---------------------- Sample Solution ----------------------

theta = pinv(X'*X) * X' * y;

% -------------------------------------------------------------

% ============================================================

end
```
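The closed-form solution is a one-liner in NumPy too; a minimal sketch (the function name `normal_eqn` and the toy data from $y = 2x$ are my own, and `np.linalg.pinv` plays the role of MATLAB's `pinv`, tolerating a singular $X^TX$):

```python
import numpy as np

def normal_eqn(X, y):
    """Closed-form solution theta = pinv(X^T X) X^T y."""
    return np.linalg.pinv(X.T @ X) @ X.T @ y

# Toy check: data from y = 2*x, so theta should come out as [0, 2]
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # intercept column + feature
y = np.array([2.0, 4.0, 6.0])
theta = normal_eqn(X, y)
print(theta)
```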

Part 3: complete code

```matlab
%% ================ Part 3: Normal Equations ================

fprintf('Solving with normal equations...\n');

% ====================== YOUR CODE HERE ======================
% Instructions: The following code computes the closed form
%               solution for linear regression using the normal
%               equations. You should complete the code in
%               normalEqn.m
%
%               After doing so, you should complete this code
%               to predict the price of a 1650 sq-ft, 3 br house.
%

%% Load Data
data = csvread('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Add intercept term to X
X = [ones(m, 1) X];

% Calculate the parameters from the normal equation
theta = normalEqn(X, y);

% Display normal equation's result
fprintf('Theta computed from the normal equations: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
price = 0; % You should change this
%price = [1 ([1650 3] - mu) ./ sigma] * theta;
price = [1 1650 3] * theta;
% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using normal equations):\n $%f\n'], price);
```

Prediction code:

```matlab
price = [1 1650 3] * theta;
```

No normalization is needed here, since the normal equation was solved on the raw (unnormalized) features.

The result:

```
Solving with normal equations...
Theta computed from the normal equations:
 89597.909544
 139.210674
 -8738.019113
Predicted price of a 1650 sq-ft, 3 br house (using normal equations):
 $293081.464335
```

Comparing the two methods: they produce different $\theta$ values (the gradient-descent $\theta$ applies to the normalized features), yet nearly the same predicted price: $293411.151468 with gradient descent versus $293081.464335 with the normal equation.

Full program output:

```
Normalizing Features ...
Running gradient descent ...
Theta computed from gradient descent:
 339842.312379
 106499.134141
 -2521.749858
Predicted price of a 1650 sq-ft, 3 br house (using gradient descent):
 $293411.151468
Program paused. Press enter to continue.
Solving with normal equations...
Theta computed from the normal equations:
 89597.909544
 139.210674
 -8738.019113
Predicted price of a 1650 sq-ft, 3 br house (using normal equations):
 $293081.464335
```

Note the difference between the two prediction lines:

```matlab
% Gradient descent: the features must be normalized first
price = [1 ([1650 3] - mu) ./ sigma] * theta;
% Normal equation: no normalization needed
price = [1 1650 3] * theta;
```


When the number of features is not large (fewer than about 10,000), the normal equation is usually preferred over gradient descent for computing $\theta$.

Image source: Huang Haiguang (黃海廣), Machine Learning Notes.
