This repository contains lab homework for an AI and Machine Learning course, in which we implement many classical machine learning models with NumPy.
Homework 2 is linear regression. Using MSE loss, we train the model with SGD, BGD, and MBGD, combined with min-max normalization and mean normalization, and test it on a self-defined dataset. We then analyze how each of these choices affects the results.
The code is in hw2.py, and the report is in hw2.pdf.
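As a rough illustration of the hw2 setup (not the exact hw2.py code), the sketch below fits a linear model by batch gradient descent on MSE loss after min-max normalization; the toy dataset and hyperparameters are made up for the example.

```python
# Minimal sketch, assuming a synthetic 1-D dataset; not the exact hw2.py code.
import numpy as np

rng = np.random.default_rng(0)

# Self-defined toy dataset: y = 3x + 2 plus noise (illustrative only).
X = rng.uniform(0, 10, size=(100, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.5, size=100)

# Min-max normalization of the feature.
X_norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# Add a bias column so the model is y = Xw.
Xb = np.hstack([X_norm, np.ones((X_norm.shape[0], 1))])

w = np.zeros(Xb.shape[1])
lr, epochs = 0.1, 2000
for _ in range(epochs):
    pred = Xb @ w
    grad = 2 * Xb.T @ (pred - y) / len(y)   # gradient of the MSE loss
    w -= lr * grad                          # batch gradient descent step

print("weights:", w, "final MSE:", np.mean((Xb @ w - y) ** 2))
```

SGD and MBGD differ only in how many samples feed each gradient step (one sample, or a small batch, per update).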
Homework 3 is the perceptron. It uses the Wine dataset (each sample has a class label and 13 features, which are chemical indicators such as alcohol content and malic acid concentration). We randomly choose two classes from the dataset and train the perceptron with BGD and SGD. After training, we evaluate the model with accuracy, precision, recall, and F1 score.
The code is in hw3.py, and the report is in hw3.pdf.
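The sketch below shows the idea (not the exact hw3.py code): a perceptron trained with SGD on two Wine classes, then scored with the four metrics; scikit-learn is used here only to load the data, and the learning rate and epoch count are arbitrary.

```python
# Minimal sketch, assuming scikit-learn is available for data loading.
import numpy as np
from sklearn.datasets import load_wine

data = load_wine()
mask = data.target < 2                       # keep two of the three classes
X = data.data[mask]
y = np.where(data.target[mask] == 0, -1, 1)  # perceptron labels in {-1, +1}

# Standardize the 13 chemical features so they are on comparable scales.
X = (X - X.mean(axis=0)) / X.std(axis=0)
X = np.hstack([X, np.ones((len(X), 1))])     # bias column

w = np.zeros(X.shape[1])
lr, epochs = 0.01, 50
rng = np.random.default_rng(0)
for _ in range(epochs):
    for i in rng.permutation(len(X)):        # SGD: one sample at a time
        if y[i] * (X[i] @ w) <= 0:           # misclassified -> update
            w += lr * y[i] * X[i]

pred = np.sign(X @ w)
tp = np.sum((pred == 1) & (y == 1))
fp = np.sum((pred == 1) & (y == -1))
fn = np.sum((pred == -1) & (y == 1))
accuracy = np.mean(pred == y)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
print(accuracy, precision, recall, f1)
```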
Homework 4 is logistic regression. Again using the Wine dataset, we train the model with MBGD and SGD, and evaluate it with accuracy, precision, recall, and F1 score.
The code is in hw4.py, and the report is in hw4.pdf.
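For reference, here is a minimal sketch of the MBGD training loop for logistic regression (not the exact hw4.py code); the synthetic data, batch size, and learning rate are stand-ins for illustration.

```python
# Minimal sketch, assuming synthetic binary data; not the exact hw4.py code.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 13))
true_w = rng.normal(size=13)
y = (X @ true_w + rng.normal(scale=0.5, size=200) > 0).astype(float)

X = np.hstack([X, np.ones((len(X), 1))])     # bias column
w = np.zeros(X.shape[1])
sigmoid = lambda z: 1 / (1 + np.exp(-z))

lr, epochs, batch = 0.1, 200, 32
for _ in range(epochs):
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch):    # MBGD: one mini-batch per step
        b = idx[start:start + batch]
        # Gradient of the cross-entropy loss for logistic regression.
        grad = X[b].T @ (sigmoid(X[b] @ w) - y[b]) / len(b)
        w -= lr * grad

pred = (sigmoid(X @ w) >= 0.5).astype(float)
print("accuracy:", np.mean(pred == y))
```

Setting `batch = 1` turns the same loop into SGD.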
Homework 5 is the multi-layer perceptron. The MLP is implemented in NumPy with SGD and MBGD update rules. Cross-validation and early stopping are used to automatically grow the hidden-layer size. The model is tested on a classification problem and a nonlinear regression problem, and it achieves perfect performance on both.
The code is in hw5.py, and the report is in hw5.pdf.
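The core forward/backward pass looks roughly like the sketch below (not the exact hw5.py code): a one-hidden-layer MLP with a tanh hidden layer, trained with MBGD on a nonlinear regression target; the architecture, activation, and fitting sin(x) are assumptions for the example, and the cross-validation/early-stopping logic is omitted.

```python
# Minimal sketch of a one-hidden-layer MLP, assuming a sin(x) regression task.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(256, 1))
y = np.sin(X)

hidden = 16
W1 = rng.normal(scale=0.5, size=(1, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.5, size=(hidden, 1)); b2 = np.zeros(1)

lr, epochs, batch = 0.05, 3000, 32
for _ in range(epochs):
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch):
        b = idx[start:start + batch]
        # Forward pass: tanh hidden layer, linear output.
        h = np.tanh(X[b] @ W1 + b1)
        out = h @ W2 + b2
        # Backward pass for MSE loss.
        d_out = 2 * (out - y[b]) / len(b)
        d_W2 = h.T @ d_out
        d_b2 = d_out.sum(axis=0)
        d_h = d_out @ W2.T * (1 - h ** 2)    # tanh derivative
        d_W1 = X[b].T @ d_h
        d_b1 = d_h.sum(axis=0)
        # Mini-batch gradient descent updates.
        W1 -= lr * d_W1; b1 -= lr * d_b1
        W2 -= lr * d_W2; b2 -= lr * d_b2

print("final MSE:", np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```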
Homework 6 is k-nearest neighbours. Implemented with NumPy, the model achieves perfect performance on the Breast Cancer Wisconsin dataset.
The code is in hw6.py, and the report is in hw6.pdf.
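The basic prediction step can be sketched as follows (not the exact hw6.py code): Euclidean distances plus a majority vote, shown here on synthetic data; the function name and k value are illustrative.

```python
# Minimal sketch of k-nearest neighbours, assuming Euclidean distance and majority vote.
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    # Pairwise Euclidean distances between test and training samples.
    dists = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    # Indices of the k closest training samples for each test sample.
    nearest = np.argsort(dists, axis=1)[:, :k]
    # Majority vote among the k neighbours' labels.
    return np.array([np.bincount(y_train[row]).argmax() for row in nearest])

# Tiny usage example with two well-separated synthetic clusters.
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y_train = np.array([0] * 50 + [1] * 50)
X_test = np.array([[0.0, 0.0], [4.0, 4.0]])
print(knn_predict(X_train, y_train, X_test, k=5))   # expected: [0 1]
```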
Homework 7 is the decision tree. It uses the Contact Lenses dataset to test the model. The implementation covers only the most basic version.
The code is in hw7.py, and the report is in hw7.pdf.
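The heart of such a basic tree is the entropy and information-gain computation used to pick the split attribute; the sketch below shows that piece only (not the exact hw7.py code, and the tree-building recursion is omitted).

```python
# Minimal sketch of ID3-style entropy and information gain; not the exact hw7.py code.
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature_column, labels):
    # Gain = H(labels) - weighted entropy of each subset split by feature value.
    gain = entropy(labels)
    for v in np.unique(feature_column):
        subset = labels[feature_column == v]
        gain -= len(subset) / len(labels) * entropy(subset)
    return gain

# Toy example: a feature that perfectly separates the labels has maximal gain.
feature = np.array([0, 0, 1, 1])
labels = np.array([0, 0, 1, 1])
print(information_gain(feature, labels))   # 1.0 bit
```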
Homework 8 is a multi-class MLP. It uses the Optical Digits dataset to test the model, with extensive analysis of the results.
The code is in hw8.py, and the report is in hw8.pdf.
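The main change from the hw5 MLP is a softmax output layer with cross-entropy loss; the sketch below shows that piece in isolation (not the exact hw8.py code), with made-up logits for illustration.

```python
# Minimal sketch of the softmax output layer and cross-entropy loss.
import numpy as np

def softmax(logits):
    z = logits - logits.max(axis=1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, y_onehot):
    return -np.mean(np.sum(y_onehot * np.log(probs + 1e-12), axis=1))

# For softmax + cross-entropy, the gradient w.r.t. the logits is simply
# (probs - y_onehot) / batch_size, which then backpropagates as in hw5.
logits = np.array([[2.0, 0.5, -1.0], [0.1, 0.2, 3.0]])
y_onehot = np.array([[1, 0, 0], [0, 0, 1]], dtype=float)
probs = softmax(logits)
print("loss:", cross_entropy(probs, y_onehot))
print("d_logits:", (probs - y_onehot) / len(logits))
```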
Homework 9 is dimensionality reduction and autoencoders. It implements PCA, a linear autoencoder, and a nonlinear autoencoder. From my analysis, such dimensionality reduction really works.
The code is in hw9.py, and the report is in hw9.pdf.
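As a reference point for the PCA part, here is a minimal sketch (not the exact hw9.py code) that computes the principal components via SVD of the mean-centred data and measures the reconstruction error; the toy 3-D dataset is made up for the example.

```python
# Minimal sketch of PCA via SVD; not the exact hw9.py code.
import numpy as np

def pca(X, k):
    X_centred = X - X.mean(axis=0)
    # Rows of Vt are the principal directions, ordered by singular value.
    U, S, Vt = np.linalg.svd(X_centred, full_matrices=False)
    components = Vt[:k]
    Z = X_centred @ components.T             # k-dimensional representation
    X_rec = Z @ components + X.mean(axis=0)  # reconstruction from k components
    return Z, X_rec

# Toy example: 3-D data lying near a 2-D plane reconstructs almost exactly.
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 2))
X = latent @ rng.normal(size=(2, 3)) + 0.01 * rng.normal(size=(100, 3))
Z, X_rec = pca(X, k=2)
print("reconstruction error:", np.mean((X - X_rec) ** 2))
```

A linear autoencoder trained with MSE learns essentially the same subspace; the nonlinear autoencoder adds nonlinear activations to the encoder and decoder.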