CheeseZH: Stanford University: Machine Learning Ex3: Multiclass Logistic Regression and Neural Network Prediction

Date: 2023-03-08 21:34:19

Handwritten digits recognition (0-9)

Multi-class Logistic Regression

1. Vectorizing Logistic Regression

(1) Vectorizing the cost function

(2) Vectorizing the gradient

(3) Vectorizing the regularized cost function

(4) Vectorizing the regularized gradient

All four formulas above can also be found in the previous blog post: click here.
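
For quick reference, here are the four expressions in their standard vectorized form, writing $h = g(X\theta)$ for the vector of predictions, where $g$ is the sigmoid function (the bias parameter $\theta_0$ is not regularized):

Cost: $J(\theta) = \frac{1}{m}\left(-y^{T}\log(h) - (1-y)^{T}\log(1-h)\right)$

Gradient: $\nabla J(\theta) = \frac{1}{m}X^{T}(h - y)$

Regularized cost: $J(\theta) = \frac{1}{m}\left(-y^{T}\log(h) - (1-y)^{T}\log(1-h)\right) + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_{j}^{2}$

Regularized gradient: $\frac{\partial J}{\partial \theta_{j}} = \frac{1}{m}\left(X^{T}(h-y)\right)_{j} + \frac{\lambda}{m}\theta_{j}$ for $j \ge 1$, with no $\frac{\lambda}{m}\theta_{0}$ term for $j = 0$.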

lrCostFunction.m

function [J, grad] = lrCostFunction(theta, X, y, lambda)
%LRCOSTFUNCTION Compute cost and gradient for logistic regression with
%regularization
%   J = LRCOSTFUNCTION(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Hint: The computation of the cost function and gradients can be
%       efficiently vectorized. For example, consider the computation
%
%           sigmoid(X * theta)
%
%       Each row of the resulting matrix will contain the value of the
%       prediction for that example. You can make use of this to vectorize
%       the cost function and gradient computations.
%
% Hint: When computing the gradient of the regularized cost function,
%       there're many possible vectorized solutions, but one solution
%       looks like:
%           grad = (unregularized gradient for logistic regression)
%           temp = theta;
%           temp(1) = 0;   % because we don't add anything for j = 0
%           grad = grad + YOUR_CODE_HERE (using the temp variable)
%

hx = sigmoid(X*theta);                              % predictions for all m examples
reg = lambda/(2*m)*sum(theta(2:size(theta),:).^2);  % regularization term, skipping theta(1)
J = -1/m*(y'*log(hx)+(1-y)'*log(1-hx)) + reg;
theta(1) = 0;                                       % do not regularize the bias parameter
grad = 1/m*X'*(hx-y)+lambda/m*theta;

% =============================================================

grad = grad(:);

end

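A quick way to exercise lrCostFunction before wiring it into the one-vs-all trainer is to call it on a tiny hand-made input (theta_t, X_t, y_t and lambda_t below are arbitrary made-up values, not part of the exercise data):

theta_t = [-2; -1; 1; 2];
X_t = [ones(5,1) reshape(1:15, 5, 3)/10];   % 5 examples: intercept term plus 3 features
y_t = [1; 0; 1; 0; 1];                      % binary labels
lambda_t = 3;
[J_t, grad_t] = lrCostFunction(theta_t, X_t, y_t, lambda_t);
fprintf('Cost: %f\n', J_t);
fprintf('Gradient: '); fprintf('%f ', grad_t); fprintf('\n');
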
2. One-vs-all Classification (Training)

Return all the classifier parameters in a matrix Θ (a K x (N+1) matrix, where K is num_labels and N is num_features), in which each row of Θ holds the learned logistic regression parameters for one class. You can do this with a 'for' loop from 1 to K, training each classifier independently.
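
The relabeling each classifier needs is just a logical comparison in Octave: y == c turns the multi-class label vector into the 0/1 targets for class c. For example:

y = [1; 2; 3; 1; 2];
y == 1        % returns [1; 0; 0; 1; 0], the binary targets for the class-1 classifier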

oneVsAll.m

function [all_theta] = oneVsAll(X, y, num_labels, lambda)
%ONEVSALL trains multiple logistic regression classifiers and returns all
%the classifiers in a matrix all_theta, where the i-th row of all_theta
%corresponds to the classifier for label i
%   [all_theta] = ONEVSALL(X, y, num_labels, lambda) trains num_labels
%   logistic regression classifiers and returns each of these classifiers
%   in a matrix all_theta, where the i-th row of all_theta corresponds
%   to the classifier for label i

% Some useful variables
m = size(X, 1);
n = size(X, 2);

% You need to return the following variables correctly
all_theta = zeros(num_labels, n + 1);

% Add ones to the X data matrix
X = [ones(m, 1) X];

% ====================== YOUR CODE HERE ======================
% Instructions: You should complete the following code to train num_labels
%               logistic regression classifiers with regularization
%               parameter lambda.
%
% Hint: theta(:) will return a column vector.
%
% Hint: You can use y == c to obtain a vector of 1's and 0's that tell you
%       whether the ground truth is true/false for this class.
%
% Note: For this assignment, we recommend using fmincg to optimize the cost
%       function. It is okay to use a for-loop (for c = 1:num_labels) to
%       loop over the different classes.
%
%       fmincg works similarly to fminunc, but is more efficient when we
%       are dealing with large number of parameters.
%
% Example Code for fmincg:
%
%     % Set Initial theta
%     initial_theta = zeros(n + 1, 1);
%
%     % Set options for fminunc
%     options = optimset('GradObj', 'on', 'MaxIter', 50);
%
%     % Run fmincg to obtain the optimal theta
%     % This function will return theta and the cost
%     [theta] = ...
%         fmincg (@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
%                 initial_theta, options);
%

for c = 1:num_labels,
  % Train one regularized classifier per class, treating examples of
  % class c as positive and all other examples as negative.
  initial_theta = all_theta(c,:)';
  options = optimset('GradObj', 'on', 'MaxIter', 50);
  theta = fmincg(@(t)(lrCostFunction(t, X, (y == c), lambda)), initial_theta, options);
  all_theta(c,:) = theta';
end;

% =========================================================================

end

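With the helper in place, training all ten digit classifiers on the exercise data looks like the following (ex3data1.mat provides X as a 5000 x 400 matrix of unrolled 20 x 20 images and y as a 5000 x 1 label vector; λ = 0.1 matches the value used in the exercise script):

load('ex3data1.mat');                              % X (5000 x 400), y (5000 x 1); label 10 represents digit 0
num_labels = 10;
lambda = 0.1;
all_theta = oneVsAll(X, y, num_labels, lambda);    % 10 x 401 parameter matrix
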
3. One-vs-all Classification (Prediction)

predictOneVsAll.m

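A possible vectorized implementation (a sketch, not necessarily the original post's code) evaluates every classifier on every example and picks the class with the largest output:

function p = predictOneVsAll(all_theta, X)
%PREDICTONEVSALL Predict the label for a trained one-vs-all classifier

m = size(X, 1);
p = zeros(m, 1);

% Add ones to the X data matrix
X = [ones(m, 1) X];

% Each row of sigmoid(X * all_theta') holds one example's score under every
% classifier; the column index of the maximum is the predicted label.
[tmp, p] = max(sigmoid(X * all_theta'), [], 2);

end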

Neural Network Prediction

Feedforward Propagation and Prediction

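The Ex3 network has 400 input units (the 20 x 20 pixel images), 25 hidden units, and 10 output units, with provided weight matrices Theta1 (25 x 401) and Theta2 (10 x 26). Prediction is a single forward pass; in the course's notation, with $g$ the sigmoid function and a bias unit of 1 prepended to each layer's activations:

$a^{(1)} = x$, $\quad z^{(2)} = \Theta^{(1)} a^{(1)}$, $\quad a^{(2)} = g(z^{(2)})$, $\quad z^{(3)} = \Theta^{(2)} a^{(2)}$, $\quad h_\Theta(x) = a^{(3)} = g(z^{(3)})$

The predicted label is the index of the largest entry of $a^{(3)}$.
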
predict.m

function p = predict(Theta1, Theta2, X)
%PREDICT Predict the label of an input given a trained neural network
%   p = PREDICT(Theta1, Theta2, X) outputs the predicted label of X given the
%   trained weights of a neural network (Theta1, Theta2)

% Useful values
m = size(X, 1);
num_labels = size(Theta2, 1);

% You need to return the following variables correctly
p = zeros(size(X, 1), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned neural network. You should set p to a
%               vector containing labels between 1 to num_labels.
%
% Hint: The max function might come in useful. In particular, the max
%       function can also return the index of the max element, for more
%       information see 'help max'. If your examples are in rows, then, you
%       can use max(A, [], 2) to obtain the max for each row.
%

a1 = X;                              % input layer,        5000 x 400
a1 = [ones(size(a1,1), 1), a1];      % add the bias unit,  5000 x 401
a2 = sigmoid(a1*Theta1');            % hidden layer,       5000 x 25
a2 = [ones(size(a2,1), 1), a2];      % add the bias unit,  5000 x 26
a3 = sigmoid(a2*Theta2');            % output layer,       5000 x 10
[tmp, p] = max(a3, [], 2);           % predicted label = index of the largest output

% =========================================================================

end

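Assuming the provided files ex3data1.mat (the training images and labels) and ex3weights.mat (the pre-trained Theta1 and Theta2) are on the Octave path, the network's accuracy on the training set can be checked like this:

load('ex3data1.mat');            % X (5000 x 400), y (5000 x 1)
load('ex3weights.mat');          % Theta1 (25 x 401), Theta2 (10 x 26)
pred = predict(Theta1, Theta2, X);
fprintf('Training Set Accuracy: %f\n', mean(double(pred == y)) * 100);
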
Other files and the dataset can be downloaded from Coursera.