Restored sigmoid activation function and plain input to network (without log2), as they are faster and presumably there is no difference.
tambetm committed Jan 26, 2015
1 parent f787014 commit afc5e2c
Showing 1 changed file with 4 additions and 4 deletions.
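As a rough illustration of the input change (using a hypothetical 4x4 board, not taken from this commit), the removed preprocessing mapped tile values through log2, while the restored code feeds the raw board values straight to the network:

    % hypothetical 2048 board state; 0 denotes an empty cell (assumption)
    a = [2 4  0 0;
         0 8  0 0;
         0 0 16 2;
         0 0  0 0];

    % previous encoding: log2 of each tile, clamped at 0 so empty cells stay 0
    % (2 -> 1, 4 -> 2, ..., 2048 -> 11); log2(0) = -Inf, hence the max(..., 0)
    x_log = max(log2(a(:)'), 0);

    % restored encoding: the flattened raw board, as in this commit
    x_plain = a(:)';

The raw encoding skips the transform at the cost of inputs that range up to 2048 instead of roughly 0 to 11; per the commit message, that difference is presumed not to matter.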
8 changes: 4 additions & 4 deletions NNAgent.m
@@ -43,7 +43,7 @@
this.nnet = nnsetup([this.rows*this.cols layers this.actions]);
this.nnet.output = 'linear';
this.nnet.momentum = momentum;
-this.nnet.activation_function = 'tanh_opt';
+this.nnet.activation_function = 'sigm';
this.nnet.learningRate = learning_rate;

% initialize game
@@ -117,7 +117,7 @@
% a - game state
% Returns predicted Q-values.
% flatten the matrix and turn into one-element minibatch
-x = max(log2(a(:)'), 0);
+x = a(:)';
% copied from nnpredict()
this.nnet.testing = 1;
this.nnet = nnff(this.nnet, x, zeros(size(x,1), this.nnet.size(end)));
@@ -132,8 +132,8 @@ function train(this, b)
% Returns trained neural network.

% flatten states for input to neural network
-x = max(log2(b.prestates(:,:)), 0);
-xx = max(log2(b.poststates(:,:)), 0);
+x = b.prestates(:,:);
+xx = b.poststates(:,:);

% predict Q-values of prestates
this.nnet.testing = 1;
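For reference, a minimal sketch of the two activation functions the first hunk swaps, assuming the definitions from DeepLearnToolbox (which the nnsetup/nnff calls suggest this code uses); the tanh_opt formula below is taken from the toolbox source, not from this diff:

    % 'sigm': the logistic sigmoid, output in (0, 1)
    sigm = @(x) 1 ./ (1 + exp(-x));

    % 'tanh_opt': LeCun's scaled tanh, output roughly in (-1.7159, 1.7159)
    tanh_opt = @(x) 1.7159 * tanh((2/3) * x);

    % quick comparison on a few sample inputs
    x = [-2 -1 0 1 2];
    disp(sigm(x))       % approx 0.1192 0.2689 0.5000 0.7311 0.8808
    disp(tanh_opt(x))   % approx -1.493 -1.000 0 1.000 1.493

Both are bounded and cheap to evaluate; the commit goes back to the sigmoid because it is faster and, per the message, presumably makes no difference to the learned Q-values.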
