Updated repo README
mp4096 committed Dec 31, 2015
1 parent c5368b5 commit a8a473c
Showing 1 changed file with 18 additions and 2 deletions.
# `adawhatever`
A collection of various stochastic gradient descent (SGD) solvers implemented in MATLAB:
* Vanilla SGD
* AdaGrad (vanilla / with decay)
* Adadelta
* Adam
* Adamax

## Introduction
Stochastic gradient descent is a state-of-the-art optimisation method in machine learning. It is very well suited to learning from many data points and outperforms many _theoretically_ superior second-order methods.
Furthermore, 'classic' optimisation terms are used instead of machine learning lingo, e.g.

* hyperparameters ~ solver parameters

## How to use it
All solvers require the stochastic gradient of the objective `sg`, the initial value of the decision variables `x0`, the number of iterations `nIter`, and the indices of the stochastic gradients to be used at each iteration `idxSG`. In addition, each solver takes its own specific solver parameters. See the MATLAB documentation and the references for further details.

The solver function returns a matrix whose `(i+1)`-th column contains the `i`-th iterate of the decision variables (the first column contains `x0`).

A typical solver calling script would look like this:
```matlab
sg = @<stochastic gradient function>;   % handle to the stochastic gradient
x0 = [0; 0; 0; 0];                      % initial guess for the decision variables
nIter = 500;                            % number of iterations
idxSG = randi(<number of stochastic gradients>, 1, nIter);   % gradient index for each iteration
xMat = <solver>(sg, x0, nIter, idxSG, <solver parameters>);  % all iterates, stored column-wise
```
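
For concreteness, here is a minimal end-to-end sketch on a toy least-squares problem. Two assumptions are made here: the gradient handle is called as `sg(idx, x)`, and the solver is named `Adam` with a made-up parameter list; substitute the actual call signature and solver parameters documented in this repository.
```matlab
% Toy problem: minimise f(x) = (1/N) * sum_i (A(i,:)*x - b(i))^2
rng(0);                          % reproducible random data
N = 200;                         % number of summands = number of stochastic gradients
A = randn(N, 4);
b = A * [1; -2; 3; -4] + 0.1*randn(N, 1);

% Stochastic gradient of the idx-th summand w.r.t. x;
% the call signature sg(idx, x) is an assumption, not taken from this repo
sg = @(idx, x) 2 * A(idx, :)' * (A(idx, :)*x - b(idx));

x0 = zeros(4, 1);
nIter = 500;
idxSG = randi(N, 1, nIter);

% 'Adam' and its parameter list (step size, decay rates, epsilon) are
% hypothetical placeholders; use one of the solvers from this repo instead
xMat = Adam(sg, x0, nIter, idxSG, 0.01, 0.9, 0.999, 1e-8);

xFinal = xMat(:, end);           % the last column holds the final iterate
```
Because every iterate is stored, the convergence behaviour can be inspected after the run, e.g. by plotting the distance of each column of `xMat` to `xFinal`.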
