
Andrew Ng Machine Learning Ex1

Published 2025/4/5 by 豆豆

This post covers the Week 2 Linear Regression assignment.
Assignment score

Assignment code
Run the ex1.m script to produce the results; the other functions are called from it.
ex1.m

%% Machine Learning Online Class - Exercise 1: Linear Regression
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the
%  linear exercise. You will need to complete the following functions
%  in this exericse:
%
%     warmUpExercise.m
%     plotData.m
%     gradientDescent.m
%     computeCost.m
%     gradientDescentMulti.m
%     computeCostMulti.m
%     featureNormalize.m
%     normalEqn.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%
% x refers to the population size in 10,000s
% y refers to the profit in $10,000s
%

%% Initialization
clear ; close all; clc

%% ==================== Part 1: Basic Function ====================
% Complete warmUpExercise.m
fprintf('Running warmUpExercise ... \n');
fprintf('5x5 Identity Matrix: \n');
warmUpExercise()

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ======================= Part 2: Plotting =======================
fprintf('Plotting Data ...\n')
data = load('ex1data1.txt');
X = data(:, 1); y = data(:, 2);
m = length(y); % number of training examples

% Plot Data
% Note: You have to complete the code in plotData.m
plotData(X, y);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% =================== Part 3: Cost and Gradient descent ===================

X = [ones(m, 1), data(:,1)]; % Add a column of ones to x
theta = zeros(2, 1);         % initialize fitting parameters

% Some gradient descent settings
iterations = 1500;
alpha = 0.01;

fprintf('\nTesting the cost function ...\n')
% compute and display initial cost
J = computeCost(X, y, theta);
fprintf('With theta = [0 ; 0]\nCost computed = %f\n', J);
fprintf('Expected cost value (approx) 32.07\n');

% further testing of the cost function
J = computeCost(X, y, [-1 ; 2]);
fprintf('\nWith theta = [-1 ; 2]\nCost computed = %f\n', J);
fprintf('Expected cost value (approx) 54.24\n');

fprintf('Program paused. Press enter to continue.\n');
pause;

fprintf('\nRunning Gradient Descent ...\n')
% run gradient descent
theta = gradientDescent(X, y, theta, alpha, iterations);

% print theta to screen
fprintf('Theta found by gradient descent:\n');
fprintf('%f\n', theta);
fprintf('Expected theta values (approx)\n');
fprintf(' -3.6303\n 1.1664\n\n');

% Plot the linear fit
hold on; % keep previous plot visible
plot(X(:,2), X*theta, '-')
legend('Training data', 'Linear regression')
hold off % don't overlay any more plots on this figure

% Predict values for population sizes of 35,000 and 70,000
predict1 = [1, 3.5] * theta;
fprintf('For population = 35,000, we predict a profit of %f\n',...
    predict1*10000);
predict2 = [1, 7] * theta;
fprintf('For population = 70,000, we predict a profit of %f\n',...
    predict2*10000);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ============= Part 4: Visualizing J(theta_0, theta_1) =============
fprintf('Visualizing J(theta_0, theta_1) ...\n')

% Grid over which we will calculate J
theta0_vals = linspace(-10, 10, 100);
theta1_vals = linspace(-1, 4, 100);

% initialize J_vals to a matrix of 0's
J_vals = zeros(length(theta0_vals), length(theta1_vals));

% Fill out J_vals
for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
        t = [theta0_vals(i); theta1_vals(j)];
        J_vals(i,j) = computeCost(X, y, t);
    end
end

% Because of the way meshgrids work in the surf command, we need to
% transpose J_vals before calling surf, or else the axes will be flipped
J_vals = J_vals';
% Surface plot
figure;
surf(theta0_vals, theta1_vals, J_vals)
xlabel('\theta_0'); ylabel('\theta_1');

% Contour plot
figure;
% Plot J_vals as 20 contours spaced logarithmically between 0.01 and 1000
contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 3, 20))
xlabel('\theta_0'); ylabel('\theta_1');
hold on;
plot(theta(1), theta(2), 'rx', 'MarkerSize', 10, 'LineWidth', 2);

The individual functions are implemented as follows.

plotData.m

function plotData(x, y)
%PLOTDATA Plots the data points x and y into a new figure
%   PLOTDATA(x,y) plots the data points and gives the figure axes labels of
%   population and profit.

figure; % open a new figure window

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the training data into a figure using the
%               "figure" and "plot" commands. Set the axes labels using
%               the "xlabel" and "ylabel" commands. Assume the
%               population and revenue data have been passed in
%               as the x and y arguments of this function.
%
% Hint: You can use the 'rx' option with plot to have the markers
%       appear as red crosses. Furthermore, you can make the
%       markers larger by using plot(..., 'rx', 'MarkerSize', 10);

plot(x, y, 'rx', 'MarkerSize', 10);
xlabel('Population of City in 10,000s');
ylabel('Profit in $10,000s');

% ============================================================
end

Result

computeCost.m

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

h = X * theta;       % predictions, (m x 1)
error = (h - y).^2;  % squared residuals
J = sum(error) / (2*m);

% =========================================================================

end

Analysis

Recall univariate linear regression. The cost function $J(\theta)$ is

$$J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$$

where $m$ is the number of training examples, and the hypothesis is

$$h_\theta(x) = \theta_0 + \theta_1 x = \theta^{T} x$$

Here $h_\theta(x)$ is written in row-vector form; the actual MATLAB code uses column vectors.

For example, in h=X*theta the matrix X has dimension ($m\times2$) and the vector theta has dimension ($2\times1$), so h=X*theta yields h with dimension ($m\times1$), which matches the vector y ($m\times1$).

In matrix form, J is computed in a single step: sum((X*theta-y).^2)/(2*m). Note the operator in (X*theta-y).^2: the dot makes ^2 an element-wise square, whereas without the dot it would be a matrix power. Element-wise squaring is what is needed here.
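The vectorized cost can also be sketched in NumPy (a hypothetical equivalent of the MATLAB one-liner; the function and variable names here are mine):

```python
import numpy as np

def compute_cost(X, theta, y):
    """Vectorized cost J(theta) = sum((X @ theta - y)^2) / (2m)."""
    m = len(y)
    residual = X @ theta - y  # (m,) vector of h_theta(x) - y
    return float(residual @ residual) / (2 * m)

# Tiny check on points that lie exactly on y = 1 + 2x: the cost is 0.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])  # column of ones + feature
y = np.array([1.0, 3.0, 5.0])
theta_exact = np.array([1.0, 2.0])
print(compute_cost(X, theta_exact, y))  # → 0.0
print(compute_cost(X, np.zeros(2), y))  # J at theta = [0, 0]
```

At theta = [0, 0] the residual is simply -y, so J = (1 + 9 + 25) / (2 * 3) = 35/6.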

Test results

Testing the cost function …
With theta = [0 ; 0]
Cost computed = 32.072734
Expected cost value (approx) 32.07
With theta = [-1 ; 2]
Cost computed = 54.242455
Expected cost value (approx) 54.24

gradientDescent.m

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    theta = theta - alpha/m * X' * (X*theta - y);

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end

end

Analysis

Recall the gradient descent update. The iteration for $\theta_j$ is given below, with the partial derivative $\frac{\partial J(\theta_0,\theta_1)}{\partial \theta_j}$ (the term following $\alpha$) already written out:

$$\theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}$$

In matrix form: theta = theta - alpha/m * X'*(X*theta-y);

In the expression X'*(X*theta-y), the trailing factor $x_j^{(i)}$ has been pulled to the front and transposed. As before, X has dimension ($m\times2$) and theta ($2\times1$), so h=X*theta is ($m\times1$), matching the dimension of y ($m\times1$).
The transpose X' has dimension ($2\times m$); multiplying it by (X*theta-y), which is ($m\times1$), gives a ($2\times1$) vector, exactly the dimension of theta. This is how the vector theta is updated.
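One way to convince yourself that the matrix form X'*(X*theta-y) really equals the component-wise sum is to compare the two numerically. A NumPy sketch (all names here are mine):

```python
import numpy as np

def grad_vectorized(X, theta, y):
    """(1/m) * X^T (X theta - y): the vectorized gradient of J."""
    m = len(y)
    return X.T @ (X @ theta - y) / m

def grad_loop(X, theta, y):
    """Same gradient via the explicit sum over examples, one theta_j at a time."""
    m, n = X.shape
    g = np.zeros(n)
    for j in range(n):
        for i in range(m):
            g[j] += (X[i] @ theta - y[i]) * X[i, j]
    return g / m

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(5), rng.normal(size=5)])  # ones column + feature
y = rng.normal(size=5)
theta = rng.normal(size=2)
print(np.allclose(grad_vectorized(X, theta, y), grad_loop(X, theta, y)))  # → True
```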

Testing gradient descent

Running Gradient Descent …
Theta found by gradient descent:
-3.630291
1.166362
Expected theta values (approx)
-3.6303
1.1664
For population = 35,000, we predict a profit of 4519.767868
For population = 70,000, we predict a profit of 45342.450129
Program paused. Press enter to continue.
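ex1.m also lists normalEqn.m among the files to complete; a useful cross-check is that gradient descent should converge to the same parameters as the closed-form normal equation $\theta = (X^T X)^{-1} X^T y$. A NumPy sketch on synthetic data (the data and names are my own illustration, not ex1data1.txt; the feature is standardized so alpha = 0.1 converges quickly):

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Repeated vectorized updates theta -= (alpha/m) * X^T (X theta - y)."""
    m = len(y)
    for _ in range(num_iters):
        theta = theta - alpha / m * (X.T @ (X @ theta - y))
    return theta

def normal_eqn(X, y):
    """Closed-form least-squares solution of X^T X theta = X^T y."""
    return np.linalg.solve(X.T @ X, X.T @ y)

rng = np.random.default_rng(1)
x = rng.normal(size=60)                          # standardized feature
y = -3.6 + 1.17 * x + rng.normal(0, 0.5, 60)     # noisy line
X = np.column_stack([np.ones_like(x), x])

theta_gd = gradient_descent(X, y, np.zeros(2), alpha=0.1, num_iters=2000)
theta_ne = normal_eqn(X, y)
print(np.allclose(theta_gd, theta_ne, atol=1e-8))  # → True
```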

Plot of the linear fit

Built-in visualizations from the assignment
Visualizing $J(\theta_0, \theta_1)$ makes the cost function explicit:
using the surf function (surface plot)
using the contour function (contour plot)

Of the two figures, the contour plot makes it easier to see where the cost function $J(\theta)$ attains its minimum.
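The grid evaluation the script performs before calling surf and contour can be sketched in NumPy; on synthetic data, the grid argmin lands near the true parameters. (The function and data here are my own illustration, not ex1data1.txt.)

```python
import numpy as np

def compute_cost(X, theta, y):
    """J(theta) = sum((X theta - y)^2) / (2m)."""
    r = X @ theta - y
    return float(r @ r) / (2 * len(y))

rng = np.random.default_rng(2)
x = rng.normal(size=80)
y = 1.0 + 2.0 * x + rng.normal(0, 0.3, 80)   # true theta ~ (1, 2)
X = np.column_stack([np.ones_like(x), x])

# Same grid as the script: theta0 in [-10, 10], theta1 in [-1, 4]
theta0_vals = np.linspace(-10, 10, 100)
theta1_vals = np.linspace(-1, 4, 100)
J_vals = np.array([[compute_cost(X, np.array([t0, t1]), y)
                    for t1 in theta1_vals] for t0 in theta0_vals])

i, j = np.unravel_index(np.argmin(J_vals), J_vals.shape)
print(theta0_vals[i], theta1_vals[j])  # close to (1, 2)
```

This is the same surface whose minimum the contour plot makes visible; for plotting with matplotlib, J_vals would be transposed before contouring, mirroring the script's note about surf.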

About the dataset
File: ex1data1.txt

6.1101,17.592
5.5277,9.1302
8.5186,13.662
7.0032,11.854
5.8598,6.8233
8.3829,11.886
7.4764,4.3483
8.5781,12
6.4862,6.5987
5.0546,3.8166
5.7107,3.2522
14.164,15.505
5.734,3.1551
8.4084,7.2258
5.6407,0.71618
5.3794,3.5129
6.3654,5.3048
5.1301,0.56077
6.4296,3.6518
7.0708,5.3893
6.1891,3.1386
20.27,21.767
5.4901,4.263
6.3261,5.1875
5.5649,3.0825
18.945,22.638
12.828,13.501
10.957,7.0467
13.176,14.692
22.203,24.147
5.2524,-1.22
6.5894,5.9966
9.2482,12.134
5.8918,1.8495
8.2111,6.5426
7.9334,4.5623
8.0959,4.1164
5.6063,3.3928
12.836,10.117
6.3534,5.4974
5.4069,0.55657
6.8825,3.9115
11.708,5.3854
5.7737,2.4406
7.8247,6.7318
7.0931,1.0463
5.0702,5.1337
5.8014,1.844
11.7,8.0043
5.5416,1.0179
7.5402,6.7504
5.3077,1.8396
7.4239,4.2885
7.6031,4.9981
6.3328,1.4233
6.3589,-1.4211
6.2742,2.4756
5.6397,4.6042
9.3102,3.9624
9.4536,5.4141
8.8254,5.1694
5.1793,-0.74279
21.279,17.929
14.908,12.054
18.959,17.054
7.2182,4.8852
8.2951,5.7442
10.236,7.7754
5.4994,1.0173
20.341,20.992
10.136,6.6799
7.3345,4.0259
6.0062,1.2784
7.2259,3.3411
5.0269,-2.6807
6.5479,0.29678
7.5386,3.8845
5.0365,5.7014
10.274,6.7526
5.1077,2.0576
5.7292,0.47953
5.1884,0.20421
6.3557,0.67861
9.7687,7.5435
6.5159,5.3436
8.5172,4.2415
9.1802,6.7981
6.002,0.92695
5.5204,0.152
5.0594,2.8214
5.7077,1.8451
7.6366,4.2959
5.8707,7.2029
5.3054,1.9869
8.2934,0.14454
13.394,9.0551
5.4369,0.61705
