
Machine Learning and Pattern Recognition — Lab Report (2)

Name: 白鵬飛 (Bai Pengfei)    Class: 計科1203    Student ID: 0909121328    Instructor: 梁毅雄 (Liang Yixiong)

Introduction

In this exercise, you will implement logistic regression and apply it to two different datasets. Before starting on the programming exercise, we strongly recommend watching the video lectures and completing the review questions for the associated topics.

Files included in this exercise

ex2.m - Octave script that will help step you through the exercise
ex2_reg.m - Octave script for the later parts of the exercise
ex2data1.txt - Training set for the first half of the exercise
ex2data2.txt - Training set for the second half of the exercise
submitWeb.m - Alternative submission script
submit.m - Submission script that sends your solutions to our servers
mapFeature.m - Function to generate polynomial features
plotDecisionBoundary.m - Function to plot a classifier's decision boundary
[*] plotData.m - Function to plot 2D classification data
[*] sigmoid.m - Sigmoid function
[*] costFunction.m - Logistic regression cost function
[*] predict.m - Logistic regression prediction function
[*] costFunctionReg.m - Regularized logistic regression cost function

[*] indicates files you will need to complete.

Throughout the exercise, you will be using the scripts ex2.m and ex2_reg.m. These scripts set up the dataset for the problems and make calls to functions that you will write. You do not need to modify either of them. You are only required to modify functions in other files, by following the instructions in this assignment.

The completed code is as follows:

plotData.m

function plotData(X, y)
%PLOTDATA Plots the data points X and y into a new figure
%   PLOTDATA(X, y) plots the data points with + for the positive examples
%   and o for the negative examples. X is assumed to be an Mx2 matrix.

% Create New Figure
figure; hold on;

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the positive and negative examples on a
%               2D plot, using the option 'k+' for the positive
%               examples and 'ko' for the negative examples.

positive = find(y == 1);
negative = find(y == 0);
plot(X(positive, 1), X(positive, 2), 'k+', 'MarkerSize', 7, 'LineWidth', 2);
plot(X(negative, 1), X(negative, 2), 'ko', 'MarkerFaceColor', 'y', 'MarkerSize', 7);

% ============================================================

hold off;

end

sigmoid.m

function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z.

% You need to return the following variables correctly
g = zeros(size(z));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
%               vector or scalar).

g = 1 ./ (1 + exp(-z));

% ============================================================

end

costFunction.m

function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%               (this unregularized version has no lambda term).

hypothesis = sigmoid(X * theta);
J = 1/m * sum(-y .* log(hypothesis) - (1 - y) .* log(1 - hypothesis));
n = size(X, 2);
for i = 1:n
    grad(i) = 1/m * dot(hypothesis - y, X(:, i));
end

% Note: grad should have the same dimensions as theta
% ============================================================

end

predict.m

function p = predict(theta, X)
%PREDICT Predict whether the label is 0 or 1 using learned logistic
%   regression parameters theta
%   p = PREDICT(theta, X) computes the predictions for X using a
%   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)

m = size(X, 1); % Number of training examples

% You need to return the following variables correctly
p = zeros(m, 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters.
%               You should set p to a vector of 0's and 1's.

p = sigmoid(X * theta) >= 0.5;

% ============================================================

end

costFunctionReg.m

function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta.
%               The intercept term theta(1) is not regularized.

hypothesis = sigmoid(X * theta);
J = 1/m * sum(-y .* log(hypothesis) - (1 - y) .* log(1 - hypothesis)) ...
    + 0.5 * lambda/m * (theta(2:end)' * theta(2:end));
n = size(X, 2);
grad(1) = 1/m * dot(hypothesis - y, X(:, 1));
for i = 2:n
    grad(i) = 1/m * dot(hypothesis - y, X(:, i)) + lambda/m * theta(i);
end

% ============================================================

end

Running the scripts produces the plots of the training data and the fitted decision boundaries (Figure 2, Octave plot window).
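Since indexing and regularization slips in the Octave code above are easy to miss, the regularized cost and gradient can be sanity-checked with a small pure-Python re-implementation. This is only a verification sketch, not part of the assignment; the dataset values below are made-up toy numbers, and the gradient is checked against central finite differences of the cost.

```python
import math

def sigmoid(z):
    # Logistic function g(z) = 1 / (1 + exp(-z))
    return 1.0 / (1.0 + math.exp(-z))

def cost_grad(theta, X, y, lam):
    """Regularized logistic regression cost and gradient.

    Mirrors costFunctionReg.m: theta[0] (the intercept) is not regularized.
    """
    m, n = len(X), len(theta)
    h = [sigmoid(sum(theta[j] * X[i][j] for j in range(n))) for i in range(m)]
    J = sum(-y[i] * math.log(h[i]) - (1 - y[i]) * math.log(1 - h[i])
            for i in range(m)) / m
    J += 0.5 * lam / m * sum(t * t for t in theta[1:])
    grad = [sum((h[i] - y[i]) * X[i][j] for i in range(m)) / m for j in range(n)]
    for j in range(1, n):
        grad[j] += lam / m * theta[j]
    return J, grad

# Toy data: intercept column plus one feature (hypothetical values).
X = [[1.0, 0.5], [1.0, -1.5], [1.0, 2.0], [1.0, -0.5]]
y = [1, 0, 1, 0]
theta = [0.1, 0.2]
lam = 1.0

J, grad = cost_grad(theta, X, y, lam)

# Central finite-difference check of each gradient component.
eps = 1e-6
for j in range(len(theta)):
    tp = theta[:]; tp[j] += eps
    tm = theta[:]; tm[j] -= eps
    num = (cost_grad(tp, X, y, lam)[0] - cost_grad(tm, X, y, lam)[0]) / (2 * eps)
    assert abs(num - grad[j]) < 1e-6, "analytic and numeric gradients disagree"
```

If the `for i = 2:n` loop in costFunctionReg.m accidentally regularized the intercept as well, this finite-difference check is exactly the kind of test that would expose it.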

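The vectorized threshold in predict.m, `p = sigmoid(X*theta) >= 0.5`, can also be expressed without computing the sigmoid at all, because sigmoid(z) >= 0.5 exactly when z >= 0. A hedged pure-Python sketch with hypothetical parameter values:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(theta, X):
    # Equivalent to p = sigmoid(X*theta) >= 0.5 in predict.m.
    # sigmoid is monotonic and sigmoid(0) = 0.5, so thresholding the
    # probability at 0.5 is the same as thresholding X*theta at 0.
    return [1 if sum(t * x for t, x in zip(theta, row)) >= 0 else 0
            for row in X]

# Hypothetical parameters and examples (not taken from ex2data1.txt).
theta = [-1.0, 2.0]
X = [[1.0, 0.2], [1.0, 0.8]]
print(predict(theta, X))  # z = -0.6 -> 0, z = 0.6 -> 1, so [0, 1]
```

The same sign-of-z shortcut applies to the Octave version, though keeping the explicit sigmoid call matches the assignment's instructions more literally.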