

Andrew Ng's Coursera ML: Lesson 5 Summary and Homework Answers


Preface

Learn in order to apply, and apply in order to learn: these notes summarize the lesson to consolidate what I studied and to review the newly introduced concepts.

Table of Contents

  • Preface
  • Table of Contents
  • Main Content
    • Model Motivation
    • Decision Boundary
    • Cost Function
    • Multiclass Classification
  • Homework Answers

Main Content

This lesson covers logistic regression for classification.

Model Motivation

Motivating problem: when an email arrives, how can a computer automatically decide whether it is spam, so we spend less time sorting mail?
Example: classifying tumors as benign or malignant from tumor size. If the hypothesis outputs a value >= 0.5, the tumor is classified as malignant.
Because we want the output to satisfy 0 < h(x) < 1, we choose the sigmoid function as the mapping; its S-shaped curve squashes any real input into the interval (0, 1).
The output has a probabilistic interpretation: h(x) is the estimated probability that the example belongs to the positive class.
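For reference, the hypothesis used throughout this lesson is

    h_\theta(x) = g(\theta^T x), \qquad g(z) = \frac{1}{1 + e^{-z}}

so h_\theta(x) = P(y = 1 \mid x; \theta), and predicting y = 1 whenever h_\theta(x) >= 0.5 is the same as predicting y = 1 whenever \theta^T x >= 0.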

Decision Boundary

A closer look at the model: the prediction y = 1 corresponds to a region of raw input values x, which in turn corresponds to the sign and magnitude of the intermediate value z = \theta^T x.

The decision boundary, i.e., the separating hyperplane, is the surface in feature space that divides the positive class from the negative class.
The decision boundary is not necessarily a straight line; for nonlinear problems it can be a curve.
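Concretely, the boundary is the set of points where \theta^T x = 0. With polynomial features, the course's circular example uses

    h_\theta(x) = g(\theta_0 + \theta_1 x_1 + \theta_2 x_2 + \theta_3 x_1^2 + \theta_4 x_2^2)

and with \theta = (-1, 0, 0, 1, 1)^T the model predicts y = 1 exactly when x_1^2 + x_2^2 >= 1: the boundary is a circle of radius 1 centered at the origin.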

Cost Function

To choose good parameters we need a suitable cost function, and it must be convex: plugging the sigmoid into the squared-error cost of linear regression yields a non-convex function with many local minima, so gradient descent could get stuck.

Intuition for the cost when y = 1: the term -log(h_\theta(x)) is zero when the prediction is 1 and grows without bound as the prediction approaches 0.
Intuition for the cost when y = 0: symmetrically, -log(1 - h_\theta(x)) heavily penalizes confident predictions of the wrong class.
Combining the two cases gives the final form of the cost function, shown below.
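Written out, the combined convex cost over m training examples is

    J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta(x^{(i)}) + (1 - y^{(i)}) \log\left(1 - h_\theta(x^{(i)})\right) \right]

which is exactly what costFunction.m in the homework below computes.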

The gradient descent update for this cost has the same form as in linear regression, only with the new hypothesis.
The exercise program does not hand-roll the optimization loop; it uses the built-in optimizer fminunc, which only needs the cost and gradient (see Part 3 of ex2.m below).
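The per-parameter update is

    \theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}

For completeness, here is a minimal batch gradient descent sketch that reuses the costFunction defined in the homework below; the learning rate alpha and the iteration count are illustrative values I chose, not part of the exercise:

alpha = 0.001;                 % learning rate (illustrative, not from the exercise)
num_iters = 400;               % iteration count (illustrative)
theta = zeros(size(X, 2), 1);  % X is assumed to already include the intercept column
for iter = 1:num_iters
    [J, grad] = costFunction(theta, X, y); % cost and gradient from costFunction.m
    theta = theta - alpha * grad;          % simultaneous update of every theta_j
end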

Multiclass Classification

In the multiclass setting there is one decision boundary per class.
Multiclass classification is implemented with the one-vs-all strategy: train n binary classifiers, one per class, and predict the class whose classifier reports the highest probability, as in the sketch below.
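A minimal one-vs-all prediction sketch; the name all_theta and its K x (n+1) shape (one trained binary classifier per row) are my assumptions for illustration, not part of this exercise:

% all_theta: assumed K x (n+1), row k holds the parameters of binary classifier k
% X: m x (n+1) design matrix that already includes the intercept column
probs = sigmoid(X * all_theta'); % m x K matrix; column k = estimated P(y = k | x)
[~, p] = max(probs, [], 2);      % predict the class with the highest probability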

Homework Answers

ex2.m

%% Machine Learning Online Class - Exercise 2: Logistic Regression
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the logistic
%  regression exercise. You will need to complete the following functions
%  in this exercise:
%
%     sigmoid.m
%     costFunction.m
%     predict.m
%     costFunctionReg.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

%% Initialization
clear ; close all; clc

%% Load Data
%  The first two columns contain the exam scores and the third column
%  contains the label.

data = load('ex2data1.txt');
X = data(:, [1, 2]); y = data(:, 3);

%% ==================== Part 1: Plotting ====================
%  We start the exercise by first plotting the data to understand the
%  problem we are working with.

fprintf(['Plotting data with + indicating (y = 1) examples and o ' ...
         'indicating (y = 0) examples.\n']);

plotData(X, y);

% Put some labels
hold on;
% Labels and Legend
xlabel('Exam 1 score')
ylabel('Exam 2 score')

% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============ Part 2: Compute Cost and Gradient ============
%  In this part of the exercise, you will implement the cost and gradient
%  for logistic regression. You need to complete the code in
%  costFunction.m

%  Setup the data matrix appropriately, and add ones for the intercept term
[m, n] = size(X);

% Add intercept term to x and X_test
X = [ones(m, 1) X];

% Initialize fitting parameters
initial_theta = zeros(n + 1, 1);

% Compute and display initial cost and gradient
[cost, grad] = costFunction(initial_theta, X, y);

fprintf('Cost at initial theta (zeros): %f\n', cost);
fprintf('Expected cost (approx): 0.693\n');
fprintf('Gradient at initial theta (zeros): \n');
fprintf(' %f \n', grad);
fprintf('Expected gradients (approx):\n -0.1000\n -12.0092\n -11.2628\n');

% Compute and display cost and gradient with non-zero theta
test_theta = [-24; 0.2; 0.2];
[cost, grad] = costFunction(test_theta, X, y);

fprintf('\nCost at test theta: %f\n', cost);
fprintf('Expected cost (approx): 0.218\n');
fprintf('Gradient at test theta: \n');
fprintf(' %f \n', grad);
fprintf('Expected gradients (approx):\n 0.043\n 2.566\n 2.647\n');

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============= Part 3: Optimizing using fminunc =============
%  In this exercise, you will use a built-in function (fminunc) to find the
%  optimal parameters theta.

%  Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);

%  Run fminunc to obtain the optimal theta
%  This function will return theta and the cost
[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

% Print theta to screen
fprintf('Cost at theta found by fminunc: %f\n', cost);
fprintf('Expected cost (approx): 0.203\n');
fprintf('theta: \n');
fprintf(' %f \n', theta);
fprintf('Expected theta (approx):\n');
fprintf(' -25.161\n 0.206\n 0.201\n');

% Plot Boundary
plotDecisionBoundary(theta, X, y);

% Put some labels
hold on;
% Labels and Legend
xlabel('Exam 1 score')
ylabel('Exam 2 score')

% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============== Part 4: Predict and Accuracies ==============
%  After learning the parameters, you'll want to use it to predict the outcomes
%  on unseen data. In this part, you will use the logistic regression model
%  to predict the probability that a student with score 45 on exam 1 and
%  score 85 on exam 2 will be admitted.
%
%  Furthermore, you will compute the training and test set accuracies of
%  our model.
%
%  Your task is to complete the code in predict.m

%  Predict probability for a student with score 45 on exam 1
%  and score 85 on exam 2
prob = sigmoid([1 45 85] * theta);
fprintf(['For a student with scores 45 and 85, we predict an admission ' ...
         'probability of %f\n'], prob);
fprintf('Expected value: 0.775 +/- 0.002\n\n');

% Compute accuracy on our training set
p = predict(theta, X);

fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);
fprintf('Expected accuracy (approx): 89.0\n');
fprintf('\n');

sigmoid.m

function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z.

% You need to return the following variables correctly
g = zeros(size(z));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
%               vector or scalar).

g = 1 ./ (1 + exp(-z));

% =============================================================

end

costFunction.m

function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%

% Accumulate the cross-entropy cost over all m examples
error = 0;
for i = 1:m
    error = error - y(i)*log(sigmoid(X(i,:)*theta)) - (1-y(i))*log(1-sigmoid(X(i,:)*theta));
end
J = error/m;

% Accumulate the partial derivative for each parameter theta_j
for j = 1:length(theta)
    factor = 0;
    for i = 1:m
        factor = factor + (sigmoid(X(i,:)*theta) - y(i)) * X(i,j);
    end
    grad(j) = factor/m;
end

% =============================================================

end
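The explicit loops above follow the summation formulas directly. An equivalent vectorized version (my sketch, not the submitted answer) computes the same cost and gradient in three lines:

h = sigmoid(X * theta);                          % m x 1 vector of predictions
J = -(y' * log(h) + (1 - y)' * log(1 - h)) / m;  % same cost, no explicit loop
grad = (X' * (h - y)) / m;                       % same gradient vector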

predict.m

function p = predict(theta, X)
%PREDICT Predict whether the label is 0 or 1 using learned logistic
%regression parameters theta
%   p = PREDICT(theta, X) computes the predictions for X using a
%   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)

m = size(X, 1); % Number of training examples

% You need to return the following variables correctly
p = zeros(m, 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters.
%               You should set p to a vector of 0's and 1's
%

p = sigmoid(X*theta) >= 0.5; % >= matches the threshold stated in the docstring

% =============================================================

end

Summary

That concludes the summary of Lesson 5 of Andrew Ng's Coursera ML course and the homework answers; I hope it helps you solve the problems you ran into.
