A simple neural network library written from scratch using Python and NumPy
Supports (illustrative NumPy sketches of each feature follow the list):
- Sigmoid, ReLU, leaky ReLU, tanh, linear, and softmax activation functions
- Xavier and Kaiming weight initialization
- Dropout
- Adam optimizer
- Batch training with a live console progress display
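
For reference, a minimal NumPy sketch of the listed activations, using their standard textbook definitions (these are not necessarily the library's own function names or signatures):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Small slope `alpha` for negative inputs instead of a hard zero.
    return np.where(x > 0, x, alpha * x)

def tanh(x):
    return np.tanh(x)

def linear(x):
    return x

def softmax(x):
    # Subtract the row-wise max before exponentiating for numerical stability.
    shifted = x - np.max(x, axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=-1, keepdims=True)
```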
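
Xavier (Glorot) and Kaiming (He) initialization scale the weight variance by layer fan-in/fan-out. A sketch of the common variants is below; the library may use the uniform or normal form of either, and the function names here are placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier_init(fan_in, fan_out):
    # Glorot/Xavier uniform: variance scaled by both fan-in and fan-out,
    # typically paired with sigmoid/tanh layers.
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def kaiming_init(fan_in, fan_out):
    # He/Kaiming normal: variance scaled by fan-in only,
    # typically paired with ReLU layers.
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))
```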
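
Dropout randomly zeroes activations during training. A sketch of the widely used inverted-dropout variant is shown here; whether this library rescales at train time or at inference is an assumption, and `dropout_forward` is a hypothetical name:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(a, drop_rate=0.5, training=True):
    # Inverted dropout: drop units at random during training and rescale
    # the survivors so the expected activation is unchanged, which lets
    # inference run without any extra scaling.
    if not training or drop_rate == 0.0:
        return a, None
    mask = (rng.random(a.shape) >= drop_rate) / (1.0 - drop_rate)
    return a * mask, mask  # the mask is reused in the backward pass
```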
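
The Adam optimizer keeps exponential moving averages of the gradient and of its square, with bias correction for early steps. A single-parameter update sketch (default hyperparameters shown; `adam_update` is a placeholder, not the library's API):

```python
import numpy as np

def adam_update(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # t is the 1-based step counter used for bias correction.
    m = beta1 * m + (1 - beta1) * grad          # first moment estimate
    v = beta2 * v + (1 - beta2) * grad**2       # second moment estimate
    m_hat = m / (1 - beta1**t)                  # bias-corrected first moment
    v_hat = v / (1 - beta2**t)                  # bias-corrected second moment
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps) # parameter step
    return w, m, v
```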
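
A rough sketch of what batch training with a live console progress display can look like: shuffle the data each epoch, iterate over mini-batches, and rewrite a single console line with `\r`. The function names and the placeholder loss are illustrative assumptions, not the library's training loop:

```python
import numpy as np

def iterate_minibatches(X, y, batch_size, rng):
    # Shuffle once per epoch, then yield contiguous batches.
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]

def train(X, y, epochs=5, batch_size=32, seed=0):
    rng = np.random.default_rng(seed)
    n_batches = int(np.ceil(len(X) / batch_size))
    for epoch in range(1, epochs + 1):
        for i, (xb, yb) in enumerate(iterate_minibatches(X, y, batch_size, rng), 1):
            loss = float(np.mean(xb))  # placeholder for forward/backward + update
            # '\r' rewrites the same console line, giving a live progress display.
            print(f"\repoch {epoch}/{epochs}  batch {i}/{n_batches}  loss {loss:.4f}",
                  end="", flush=True)
        print()
```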