Commit 314f475
tweak logistic regression
sth4nth committed Jan 30, 2019
1 parent 0635e51 commit 314f475
Showing 2 changed files with 8 additions and 9 deletions.
15 changes: 7 additions & 8 deletions chapter04/logitBin.m
@@ -1,16 +1,16 @@
-function [model, llh] = logitBin(X, y, lambda, eta)
+function [model, llh] = logitBin(X, y, lambda)
 % Logistic regression for binary classification optimized by Newton-Raphson method.
 % Input:
 %   X: d x n data matrix
-%   z: 1 x n label (0/1)
+%   y: 1 x n label (0/1)
 %   lambda: regularization parameter
-%   eta: step size
+%   alpha: step size
 % Output:
 %   model: trained model structure
 %   llh: loglikelihood
 % Written by Mo Chen ([email protected]).
 if nargin < 4
-    eta = 1e-1;
+    alpha = 1e-1;
 end
 if nargin < 3
     lambda = 1e-4;
@@ -20,18 +20,17 @@
 tol = 1e-4;
 epoch = 200;
 llh = -inf(1,epoch);
-h = 2*y-1;
 w = rand(d,1);
 for t = 2:epoch
     a = w'*X;
-    llh(t) = -(sum(log1pexp(-h.*a))+0.5*lambda*dot(w,w))/n; % 4.89
-    if llh(t)-llh(t-1) < tol; break; end
+    llh(t) = (dot(a,y)-sum(log1pexp(a))-0.5*lambda*dot(w,w))/n; % 4.90
+    if abs(llh(t)-llh(t-1)) < tol; break; end
     z = sigmoid(a); % 4.87
     g = X*(z-y)'+lambda*w; % 4.96
     r = z.*(1-z); % 4.98
     Xw = bsxfun(@times, X, sqrt(r));
     H = Xw*Xw'+lambda*eye(d); % 4.97
-    w = w-eta*(H\g);
+    w = w-alpha*(H\g); % 4.92
 end
 llh = llh(2:t);
 model.w = w;
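The updated routine can be sketched in NumPy for readers who don't use MATLAB. This is a translation of the post-commit logic above, not code from the repository; the function and helper names (`logit_bin`, `log1pexp`) are chosen here for illustration, and the `% 4.xx` tags refer to the PRML equation numbers carried in the MATLAB comments:

```python
import numpy as np

def log1pexp(a):
    # numerically stable log(1 + exp(a)), mirroring the PRMLT helper
    return np.where(a > 30, a, np.log1p(np.exp(np.minimum(a, 30))))

def logit_bin(X, y, lam=1e-4, alpha=1e-1, epoch=200, tol=1e-4):
    """Binary logistic regression by damped Newton-Raphson (IRLS).
    X: d x n data matrix, y: length-n labels in {0,1}."""
    d, n = X.shape
    w = np.random.rand(d)
    llh = np.full(epoch, -np.inf)
    t = 1
    for t in range(1, epoch):
        a = w @ X
        # penalized log-likelihood per sample (eq. 4.90 form)
        llh[t] = (a @ y - log1pexp(a).sum() - 0.5 * lam * (w @ w)) / n
        if abs(llh[t] - llh[t - 1]) < tol:
            break
        z = 1.0 / (1.0 + np.exp(-a))           # sigmoid, eq. 4.87
        g = X @ (z - y) + lam * w              # gradient, eq. 4.96
        r = z * (1.0 - z)                      # IRLS weights, eq. 4.98
        Xw = X * np.sqrt(r)                    # scale columns by sqrt(r)
        H = Xw @ Xw.T + lam * np.eye(d)        # Hessian, eq. 4.97
        w = w - alpha * np.linalg.solve(H, g)  # damped Newton step, eq. 4.92
    return w, llh[1:t + 1]
```

Note the step size `alpha` is a keyword argument here rather than dropped entirely; the commit removes `eta` from the MATLAB signature while keeping a hard-coded `alpha = 1e-1` default.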
2 changes: 1 addition & 1 deletion chapter09/kmeansRnd.m
@@ -10,7 +10,7 @@
 % mu: d x k centers of clusters
 % Written by Mo Chen ([email protected]).
 alpha = 1;
-beta = nthroot(k,d); % in volume x^d there is k points: x^d=k
+beta = nthroot(k,d); % k points in volume x^d : x^d=k

 X = randn(d,n);
 w = dirichletRnd(alpha,ones(1,k)/k);
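The comment fix concerns the scale `beta = nthroot(k,d)`: centers are spread so that a box of side `beta` has volume `beta^d = k`, i.e. room for roughly `k` unit-volume clusters. A hedged NumPy sketch of the sampler, since only the top of `kmeansRnd.m` is shown; the assignment step and the Dirichlet helper (`dirichletRnd` in the repo) are assumed from context:

```python
import numpy as np

def kmeans_rnd(d, k, n, alpha=1.0, seed=0):
    """Sample n points in d dimensions from k random Gaussian clusters."""
    rng = np.random.default_rng(seed)
    beta = k ** (1.0 / d)                    # nthroot(k, d): beta^d = k
    X = rng.standard_normal((d, n))          # unit-variance noise per point
    w = rng.dirichlet(alpha * np.ones(k))    # mixing weights (stand-in for dirichletRnd)
    z = rng.choice(k, size=n, p=w)           # cluster label for each point
    mu = beta * rng.standard_normal((d, k))  # cluster centers, scaled by beta
    X = X + mu[:, z]                         # shift each point to its center
    return X, z, mu
```

With `beta` growing as `k^(1/d)`, the centers stay roughly one unit-volume cell apart regardless of how many clusters are requested.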
