Commit 911c1c0: Update repo structure
giangtranml committed Mar 25, 2020
Showing 88 changed files with 83,381 additions and 0 deletions.
28 changes: 28 additions & 0 deletions .gitignore
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
venv/

# C extensions
*.so

# Distribution / packaging
bin/
build/
develop-eggs/
dist/
eggs/
lib/
lib64/
parts/
sdist/
var/
*.egg-info/
.installed.cfg
*.egg

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

.idea/
35 changes: 35 additions & 0 deletions README.md
# Machine Learning from scratch

## About
This repository implements Machine Learning algorithms from scratch in NumPy, working out the math under the hood by hand rather than relying on auto-differentiation frameworks such as TensorFlow or PyTorch. Some advanced Computer Vision and NLP models do use TensorFlow, so the ideas from their papers can be prototyped quickly.

## Repository structure
As a software engineer, I structure the repository around OOP principles. The `NeuralNetwork` class composes `FCLayer`, `BatchNormLayer`, and `ActivationLayer` objects, while the `CNN` class composes `ConvLayer`, `PoolingLayer`, `FCLayer`, `ActivationLayer`, and so on. This keeps the code readable and lets every piece be reused across models.
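
The composition idea can be sketched as follows. The class names mirror the repo, but the signatures here are illustrative, not the repo's exact API:

```python
import numpy as np

class FCLayer:
    """Minimal fully-connected layer (illustrative; the repo's real
    constructor and backward pass are richer than this)."""
    def __init__(self, in_dim, out_dim):
        self.W = np.random.randn(in_dim, out_dim) * 0.01
        self.b = np.zeros(out_dim)

    def forward(self, x):
        return x @ self.W + self.b

class ActivationLayer:
    """ReLU chosen as an example activation."""
    def forward(self, x):
        return np.maximum(0, x)

class NeuralNetwork:
    """Composes layer objects; training loop omitted."""
    def __init__(self, layers):
        self.layers = layers

    def forward(self, x):
        # Feed the output of each layer into the next one.
        for layer in self.layers:
            x = layer.forward(x)
        return x

net = NeuralNetwork([FCLayer(4, 8), ActivationLayer(), FCLayer(8, 2)])
out = net.forward(np.zeros((3, 4)))
print(out.shape)  # (3, 2)
```

Because each layer exposes the same `forward` interface, a `CNN` can reuse `FCLayer` and `ActivationLayer` unchanged and only add its own convolution and pooling layers.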

## Table of contents
- ML from scratch models:
* [Linear Regression](https://github.com/giangtranml/ml-from-scratch/blob/master/linear_regression/linear_regression.py)
* [Logistic Regression](https://github.com/giangtranml/ml-from-scratch/blob/master/logistic_regression/logistic_regression.py)
* [Softmax Regression](https://github.com/giangtranml/ml-from-scratch/blob/master/softmax_regression/softmax_regression.py)
* [Neural Network](https://github.com/giangtranml/ml-from-scratch/blob/master/neural_network/neural_network.py)
* [Convolutional Neural Network](https://github.com/giangtranml/ml-from-scratch/blob/master/convolutional_neural_network/convolutional_neural_network.py)
* [Support Vector Machine](https://github.com/giangtranml/ml-from-scratch/blob/master/svm/svm.py)
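
To give a flavor of the "from scratch" approach the models above share, here is a minimal linear regression trained by batch gradient descent on the MSE loss. This is a sketch, not the repo's code:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 3.0 * X[:, 0] + 1.0                    # ground truth: w = 3, b = 1

w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    pred = w * X[:, 0] + b
    err = pred - y
    w -= lr * 2 * np.mean(err * X[:, 0])   # dL/dw of the MSE loss
    b -= lr * 2 * np.mean(err)             # dL/db

print(round(w, 2), round(b, 2))            # converges to roughly 3.0, 1.0
```

The same pattern, a forward prediction followed by a hand-derived gradient step, scales up to the logistic and softmax regressions linked above.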

- Deep Learning layers:
* [Fully-Connected Layer](https://github.com/giangtranml/ml-from-scratch/blob/master/nn_components/layers.py#L43)
* [Convolutional Layer](https://github.com/giangtranml/ml-from-scratch/blob/master/nn_components/layers.py#L107)
* [Pooling Layer](https://github.com/giangtranml/ml-from-scratch/blob/master/nn_components/layers.py#L245)
* [Activation Layer](https://github.com/giangtranml/ml-from-scratch/blob/master/nn_components/layers.py#L372)
* [BatchNorm Layer](https://github.com/giangtranml/ml-from-scratch/blob/master/nn_components/layers.py#L436)
* [Dropout Layer](https://github.com/giangtranml/ml-from-scratch/blob/master/nn_components/layers.py#L407)
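
Each of these layers implements its own forward and backward pass. A sketch of what that looks like for a fully-connected layer, with the gradients derived by hand (illustrative signatures, not the repo's exact API):

```python
import numpy as np

class Dense:
    def __init__(self, in_dim, out_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(in_dim, out_dim))
        self.b = np.zeros(out_dim)

    def forward(self, x):
        self.x = x                      # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        self.dW = self.x.T @ grad_out   # dL/dW
        self.db = grad_out.sum(axis=0)  # dL/db
        return grad_out @ self.W.T      # dL/dx, passed to the previous layer

layer = Dense(3, 2)
out = layer.forward(np.ones((4, 3)))
grad_in = layer.backward(np.ones((4, 2)))
print(out.shape, grad_in.shape)  # (4, 2) (4, 3)
```

Caching the input during `forward` is what lets `backward` compute the weight gradient without an autodiff framework.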

- Optimization algorithms:
* [SGD](https://github.com/giangtranml/ml-from-scratch/blob/master/optimizations_algorithms/optimizers.py#L16)
* [SGD with Momentum](https://github.com/giangtranml/ml-from-scratch/blob/master/optimizations_algorithms/optimizers.py#L24)
* [RMSProp](https://github.com/giangtranml/ml-from-scratch/blob/master/optimizations_algorithms/optimizers.py#L37)
* [Adam](https://github.com/giangtranml/ml-from-scratch/blob/master/optimizations_algorithms/optimizers.py#L51)
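
As an example of the update rules behind these optimizers, here is SGD with momentum minimizing a toy quadratic. The hyperparameter names are illustrative:

```python
import numpy as np

def sgd_momentum_step(w, grad, velocity, lr=0.1, beta=0.9):
    # Velocity is an exponentially decaying moving average of past gradients.
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity

# Minimize f(w) = w^2 (gradient 2w) starting from w = 5.
w, v = 5.0, 0.0
for _ in range(300):
    w, v = sgd_momentum_step(w, 2.0 * w, v)
print(round(w, 6))
```

Plain SGD drops the `velocity` term; RMSProp and Adam additionally rescale the step by a running estimate of the squared gradient.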

- Advanced models:
* [Bahdanau Attention Mechanism](https://github.com/giangtranml/ml-from-scratch/blob/master/attention_mechanism/Bahdanau%20Attention%20Mechanism.ipynb)
* [Luong Attention Mechanism](https://github.com/giangtranml/ml-from-scratch/blob/master/attention_mechanism/Luong%20Attention%20Mechanism.ipynb)
* [Transformer](https://github.com/giangtranml/ml-from-scratch/blob/master/transformer/Transformer.ipynb)
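
The notebooks above build on attention between queries, keys, and values. A NumPy-only sketch of scaled dot-product attention, the core operation of the Transformer (illustrative, not lifted from the notebooks):

```python
import numpy as np

def attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V, weights                     # weighted sum of values

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
out, attn = attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Bahdanau and Luong attention differ mainly in how the scores are computed (an additive MLP versus a dot product), but both end in the same softmax-weighted sum.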
Empty file added __init__.py
Empty file.