A neural network library written in Python as partial fulfilment of my final undergraduate project. I based my library on Michael Nielsen's network2.py neural network library. The differences between my library and Nielsen's are outlined in network_library.py.
This is the central file of my project. This file represents the "neural network library", i.e., it contains all the code necessary to create and train a neural network.
Each test is designed to evaluate a key neural network feature. For example, L2_regularization_test.py compares the performance of a network using L2 regularization to that of one without it.
Compares two network architectures: 784-100-100-10 and 784-100-10.
Compares a network using the cross-entropy cost function to one using the quadratic cost function.
Compares a network using small weight initialization to one not using small weight initialization.
Compares a network using L2 regularization to one not using L2 regularization.
Compares a network using Dropout to one not using Dropout.
Compares a network using DropConnect to one not using DropConnect.
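To make the L2 regularization comparison concrete, here is a minimal sketch (not the library's actual code) of what the regularizer changes: the cost gains a penalty term (lambda / 2n) * sum(w^2), which turns each gradient-descent step into a "weight decay" step. The function name `sgd_step` and all parameter names are illustrative assumptions.

```python
import numpy as np

def sgd_step(w, grad, eta, lmbda=0.0, n=1):
    """One gradient-descent step on the weights w.

    With L2 regularization, the cost gains the penalty (lmbda / (2n)) * sum(w**2),
    so each update first shrinks every weight by the factor (1 - eta * lmbda / n)
    ("weight decay") before applying the usual gradient step.
    """
    return (1 - eta * lmbda / n) * w - eta * grad

w = np.array([1.0, -2.0, 0.5])
grad = np.zeros(3)  # a zero gradient isolates the decay effect

no_reg = sgd_step(w, grad, eta=0.5, lmbda=0.0, n=10)
with_reg = sgd_step(w, grad, eta=0.5, lmbda=1.0, n=10)

print(no_reg)    # weights unchanged without regularization
print(with_reg)  # every weight shrunk by the factor 1 - 0.5 * 1 / 10 = 0.95
```

The decay factor pushes weights toward zero unless the data gradient justifies keeping them large, which is the mechanism the L2 test is measuring.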
Trains the network using the settings I found to achieve the highest accuracy. This file loads the trained network, 'network.txt', which achieves an accuracy of 98.06% on the test data.
A set of miscellaneous files that support the function of the neural network library.
Loads the mnist.pkl.gz file so that we can interact with the MNIST dataset.
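For reference, loading a gzipped, pickled MNIST file generally looks like the sketch below. This is not the repository's actual loader; the function name `load_mnist` is an assumption, and the `latin-1` encoding is the usual workaround for unpickling the Python 2 era file under Python 3.

```python
import gzip
import pickle

def load_mnist(path="mnist.pkl.gz"):
    """Load the pickled, gzipped MNIST dataset.

    Returns (training_data, validation_data, test_data), following the
    three-way split used by Nielsen-style loaders.  The path defaults to
    the file name shipped with this repository.
    """
    with gzip.open(path, "rb") as f:
        # encoding="latin-1" lets Python 3 unpickle the Python 2 era file.
        training_data, validation_data, test_data = pickle.load(
            f, encoding="latin-1"
        )
    return training_data, validation_data, test_data
```

The actual mnist_loader in this repository may reshape the images and one-hot encode the labels on top of this raw load.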
The MNIST dataset.
A trained network saved as a text file. To use this network's parameters, you must load the text file as a neural network using the load() function in the network_library.py file. This network achieves an accuracy of 98.06% on the MNIST test data.
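The save/load round trip can be sketched as follows. This is a standalone illustration of serializing network parameters to a JSON text file (the approach used by Nielsen's network2.py), not the repository's actual load() implementation; the names `save_network` and `load_network` are assumptions.

```python
import json
import numpy as np

def save_network(filename, sizes, weights, biases):
    """Serialize the layer sizes and parameter arrays to a JSON text file."""
    data = {
        "sizes": sizes,
        "weights": [w.tolist() for w in weights],
        "biases": [b.tolist() for b in biases],
    }
    with open(filename, "w") as f:
        json.dump(data, f)

def load_network(filename):
    """Rebuild the parameter arrays from a saved text file."""
    with open(filename) as f:
        data = json.load(f)
    weights = [np.array(w) for w in data["weights"]]
    biases = [np.array(b) for b in data["biases"]]
    return data["sizes"], weights, biases

# Round-trip a tiny 2-3-1 network to show the parameters survive saving.
sizes = [2, 3, 1]
rng = np.random.default_rng(0)
weights = [rng.standard_normal((y, x)) for x, y in zip(sizes[:-1], sizes[1:])]
biases = [rng.standard_normal((y, 1)) for y in sizes[1:]]

save_network("tiny_net.txt", sizes, weights, biases)
loaded_sizes, loaded_w, loaded_b = load_network("tiny_net.txt")
```

Because JSON stores plain nested lists, the saved file is human-readable, at the cost of being larger than a binary format.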