
Supervised Learning: Logistic Regression and Programming Assignment (1)

Published: 2025/7/14, by 豆豆

I. Logistic Regression: Classification

For classification problems, using linear regression is not appropriate: its output is not confined to [0, 1], and a few extreme examples can shift the fitted line (and hence the 0.5 threshold) badly.

1. Hypothesis function (the logistic, or sigmoid, function):

h_θ(x) = g(θᵀx)

g(z) = 1 / (1 + e^(−z)),  so 0 < g(z) < 1

Note: the value of the hypothesis h is interpreted as the estimated probability that y = 1. The decision boundary can be viewed as the curve where h = 0.5 (equivalently, where θᵀx = 0).
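A quick numerical check of these properties (a NumPy sketch, not part of the assignment's Octave code): g(0) = 0.5 sits exactly on the decision boundary, and large-magnitude inputs saturate toward 0 or 1.

```python
import numpy as np

def sigmoid(z):
    """Elementwise logistic function g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-np.asarray(z, dtype=float)))

print(sigmoid(0))     # 0.5: exactly on the decision boundary
print(sigmoid(10))    # close to 1 -> predict y = 1
print(sigmoid(-10))   # close to 0 -> predict y = 0
```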

2. Cost function

For a single training example, the cost is

    Cost(h_θ(x), y) = −log(h_θ(x))        if y = 1
    Cost(h_θ(x), y) = −log(1 − h_θ(x))    if y = 0

so a confident but wrong prediction (e.g. h ≈ 0 when y = 1) is penalized heavily. The two cases combine into a single expression:

    Cost(h_θ(x), y) = −y·log(h_θ(x)) − (1 − y)·log(1 − h_θ(x))

and averaging over all m training examples gives the cost function

    J(θ) = −(1/m) Σ_{i=1..m} [ y⁽ⁱ⁾·log(h_θ(x⁽ⁱ⁾)) + (1 − y⁽ⁱ⁾)·log(1 − h_θ(x⁽ⁱ⁾)) ]

Gradient descent then repeatedly updates all parameters simultaneously:

    θ_j := θ_j − α·(1/m) Σ_{i=1..m} (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾)·x_j⁽ⁱ⁾

This update has the same form as in linear regression; the difference is that the hypothesis is now h_θ(x) = g(θᵀx).
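The cost and gradient can be checked numerically. The sketch below (NumPy, with a made-up toy data set) mirrors the two formulas term by term; at θ = 0 the hypothesis is 0.5 everywhere, so J(0) = log 2 ≈ 0.693 regardless of the data.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_and_grad(theta, X, y):
    """Cross-entropy cost J(theta) and its gradient for logistic regression.

    X is m x n (first column all ones), y is a 0/1 vector of length m.
    """
    m = len(y)
    h = sigmoid(X @ theta)
    J = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    grad = X.T @ (h - y) / m
    return J, grad

# Tiny illustrative data set (made up for this sketch).
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
J, grad = cost_and_grad(np.zeros(2), X, y)
print(J)  # log(2) ≈ 0.6931 at theta = 0
```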

3. Advanced optimization: fminunc

The gradient-descent procedure above requires supplying a learning rate α by hand; with advanced optimizers such as fminunc, the step size is chosen automatically.


Optimization result (figure omitted from the original post).
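The same hands-off workflow exists outside Octave. As an illustration (a sketch, not the course code), SciPy's minimize accepts a function that returns both cost and gradient; jac=True plays the role of optimset('GradObj', 'on'), and no learning rate is supplied. The toy data set here is made up.

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_and_grad(theta, X, y):
    m = len(y)
    h = sigmoid(X @ theta)
    J = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    return J, X.T @ (h - y) / m

# Made-up overlapping data (classes interleave, so the optimum is finite).
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 3.0],
              [1.0, 1.0], [1.0, 3.0], [1.0, 4.0]])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])

# jac=True tells the optimizer the callable returns (cost, gradient).
res = minimize(cost_and_grad, np.zeros(2), args=(X, y),
               jac=True, method='BFGS', options={'maxiter': 400})
print(res.x)  # fitted theta; no hand-tuned alpha was needed
```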

II. Logistic Regression: Multiclass Classification (One-vs-All)

To classify among K > 2 classes, the one-vs-all (one-vs-rest) strategy trains K separate binary logistic regression classifiers. For each class i, relabel the data so that class i is the positive class (y = 1) and all remaining classes are negative (y = 0), then fit

    h_θ⁽ⁱ⁾(x) = P(y = i | x; θ),    i = 1, …, K

To predict the label of a new input x, evaluate all K classifiers and choose the class with the highest estimated probability:

    y = argmax_i h_θ⁽ⁱ⁾(x)
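The one-vs-all procedure above can be sketched in a few lines. In this NumPy sketch, the gradient-descent trainer, its learning rate and iteration count, and the three-cluster toy data set are all illustrative assumptions, not the course code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic(X, y, alpha=0.5, iters=3000):
    """Fit one binary logistic regression by batch gradient descent."""
    theta = np.zeros(X.shape[1])
    m = len(y)
    for _ in range(iters):
        theta -= alpha * X.T @ (sigmoid(X @ theta) - y) / m
    return theta

def one_vs_all(X, y, num_labels):
    """Train one classifier per class: class i is positive, the rest negative."""
    return np.array([train_logistic(X, (y == i).astype(float))
                     for i in range(num_labels)])

def predict_one_vs_all(all_theta, X):
    """Pick the class whose classifier reports the highest probability."""
    return np.argmax(sigmoid(X @ all_theta.T), axis=1)

# Three well-separated 2-D clusters (made up); first column is the intercept.
X = np.array([[1.0, 0.0, 0.0], [1.0, 0.5, 0.5],   # class 0
              [1.0, 4.0, 0.0], [1.0, 3.5, 0.5],   # class 1
              [1.0, 2.0, 3.0], [1.0, 2.0, 3.5]])  # class 2
y = np.array([0, 0, 1, 1, 2, 2])
all_theta = one_vs_all(X, y, 3)
print(predict_one_vs_all(all_theta, X))  # expected to recover the training labels
```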

III. Programming Assignment

1. sigmoid.m: implement the hypothesis function

function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z.

% You need to return the following variables correctly
g = zeros(size(z));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
%               vector or scalar).

g = 1 ./ (1 + exp(-z));

% =============================================================

end

2. plotData.m: visualize the data

function plotData(X, y)
%PLOTDATA Plots the data points X and y into a new figure
%   PLOTDATA(x,y) plots the data points with + for the positive examples
%   and o for the negative examples. X is assumed to be a Mx2 matrix.

% Create New Figure
figure; hold on;

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the positive and negative examples on a
%               2D plot, using the option 'k+' for the positive
%               examples and 'ko' for the negative examples.
%
% axis([30 100 30 100]);

pos = find(y == 1);
neg = find(y == 0);
plot(X(pos, 1), X(pos, 2), 'k+', 'LineWidth', 2, ...
     'MarkerSize', 7);
plot(X(neg, 1), X(neg, 2), 'ko', 'MarkerFaceColor', 'y', ...
     'MarkerSize', 7);

% =========================================================================

hold off;

end

3. costFunction.m: implement the cost function and gradient


function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%

h = sigmoid(X * theta);
costfun = y .* log(h) + (1 - y) .* log(1 - h);
J = -1/m * sum(costfun);
grad = X' * (h - y) / m;

% =============================================================

end

4. Advanced optimization with fminunc

At the Octave command line:

% Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);

% Run fminunc to obtain the optimal theta
% This function will return theta and the cost
[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

5. predict.m

For each example, predict the class label from the hypothesis function, store the predictions in the vector p, and compare them with the actual labels y to obtain the classification accuracy.

function p = predict(theta, X)
%PREDICT Predict whether the label is 0 or 1 using learned logistic
%regression parameters theta
%   p = PREDICT(theta, X) computes the predictions for X using a
%   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)

m = size(X, 1); % Number of training examples

% You need to return the following variables correctly
p = zeros(m, 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters.
%               You should set p to a vector of 0's and 1's
%

h = sigmoid(X * theta);
h(h >= 0.5) = 1;
h(h < 0.5) = 0;
p = h;

% =========================================================================

end
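A NumPy analogue of the thresholding in predict.m, together with the accuracy comparison described above. The fitted parameters θ = (−2, 1) and the four labeled points are made up so that the decision boundary sits at x = 2.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(theta, X):
    """Threshold the hypothesis at 0.5: h >= 0.5 -> 1, else 0."""
    return (sigmoid(X @ theta) >= 0.5).astype(int)

# Made-up fitted parameters: theta' * x = 0 exactly at x = 2.
theta = np.array([-2.0, 1.0])
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 3.0], [1.0, 4.0]])
y = np.array([0, 0, 1, 1])
p = predict(theta, X)
accuracy = np.mean(p == y) * 100  # fraction of correct predictions, in percent
print(p, accuracy)  # [0 0 1 1] 100.0
```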

Reposted from: https://www.cnblogs.com/sunxiaoshu/p/10557726.html

