Artificial Intelligence and Machine Learning

This repository contains the lab homework for the AI and Machine Learning course, in which we use NumPy to implement many classical machine learning models.

hw2

Homework 2 is linear regression. Using MSE loss with three update schemes (SGD, BGD, and MBGD), together with min-max normalization and mean normalization, we train and test the linear regression model on a self-defined dataset and analyze the differences between these choices.

The code is in hw2.py, and the report is in hw2.pdf.
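
For illustration, here is a minimal NumPy sketch of linear regression trained with mini-batch gradient descent on MSE loss with min-max normalization; the synthetic dataset and hyperparameters are assumptions, not the setup of hw2.py.

```python
# Minimal sketch: linear regression with MBGD on MSE loss (illustrative data).
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))            # hypothetical 1-D inputs
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 1, 200)  # noisy linear target

# min-max normalization of the inputs
X_norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
X_aug = np.hstack([X_norm, np.ones((len(X_norm), 1))])  # append bias column

w = np.zeros(2)
lr, batch_size = 0.1, 32
for epoch in range(500):
    idx = rng.permutation(len(X_aug))
    for start in range(0, len(idx), batch_size):
        b = idx[start:start + batch_size]
        err = X_aug[b] @ w - y[b]             # prediction error on the mini-batch
        grad = 2 * X_aug[b].T @ err / len(b)  # gradient of the MSE loss
        w -= lr * grad

print("learned weights (on normalized inputs):", w)
```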

hw3

Homework 3 is the perceptron. It uses the wine dataset (each wine sample has a class label and 13 features, which are chemical indicators such as alcohol content and malic acid concentration). We randomly choose two classes of the wine dataset and use BGD and SGD to train the perceptron. After training, we use accuracy, precision, recall, and F1 score to evaluate the model.

The code is in hw3.py, and the report is in hw3.pdf.
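
The sketch below shows a binary perceptron trained with SGD on two wine classes; loading the data via sklearn, fixing the two classes, and the hyperparameters are assumptions made for illustration (hw3.py may differ).

```python
# Minimal sketch: binary perceptron with SGD on two wine classes (illustrative setup).
import numpy as np
from sklearn.datasets import load_wine

data = load_wine()
mask = data.target < 2                       # keep two of the three classes
X, y = data.data[mask], data.target[mask]
y = np.where(y == 0, -1, 1)                  # perceptron labels in {-1, +1}

X = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize the 13 features
X = np.hstack([X, np.ones((len(X), 1))])     # bias column

rng = np.random.default_rng(0)
w = np.zeros(X.shape[1])
lr = 0.01
for epoch in range(50):
    for i in rng.permutation(len(X)):        # SGD: one sample at a time
        if y[i] * (X[i] @ w) <= 0:           # misclassified -> update
            w += lr * y[i] * X[i]

pred = np.sign(X @ w)
tp = np.sum((pred == 1) & (y == 1))
fp = np.sum((pred == 1) & (y == -1))
fn = np.sum((pred == -1) & (y == 1))
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
print("accuracy:", (pred == y).mean(), "precision:", precision, "recall:", recall, "f1:", f1)
```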

hw4

Homework 4 is logistic regression. Also using the wine dataset, we train the model with MBGD and SGD and evaluate it with accuracy, precision, recall, and F1 score.

The code is in hw4.py, and the report is in hw4.pdf.
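
A minimal sketch of binary logistic regression trained with mini-batch gradient descent follows; the sklearn loader, chosen classes, and hyperparameters are illustrative assumptions, not details of hw4.py.

```python
# Minimal sketch: logistic regression with MBGD on two wine classes (illustrative setup).
import numpy as np
from sklearn.datasets import load_wine

data = load_wine()
mask = data.target < 2
X, y = data.data[mask], data.target[mask].astype(float)
X = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize features
X = np.hstack([X, np.ones((len(X), 1))])     # bias column

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
w = np.zeros(X.shape[1])
lr, batch_size = 0.1, 16
for epoch in range(200):
    idx = rng.permutation(len(X))
    for start in range(0, len(idx), batch_size):
        b = idx[start:start + batch_size]
        p = sigmoid(X[b] @ w)
        grad = X[b].T @ (p - y[b]) / len(b)  # gradient of the cross-entropy loss
        w -= lr * grad

pred = (sigmoid(X @ w) >= 0.5).astype(float)
print("training accuracy:", (pred == y).mean())
```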

hw5

Homework 5 is the multi-layer perceptron. It uses NumPy to implement an MLP with SGD and MBGD update methods. It also includes cross validation and early stopping, which automatically grow the size of the hidden layer. The model is tested on a classification problem and a nonlinear regression problem, and performs very well on both.

The code is in hw5.py, and the report is in hw5.pdf.
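
Below is a minimal sketch of a one-hidden-layer MLP for nonlinear regression trained with mini-batch gradient descent; the sine-fitting task, tanh activation, and layer size are assumptions for illustration and omit the cross validation and early stopping described above.

```python
# Minimal sketch: one-hidden-layer MLP for nonlinear regression (illustrative task).
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X) + 0.1 * rng.normal(size=X.shape)   # noisy nonlinear target

hidden = 32
W1 = rng.normal(0, 0.5, (1, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)

lr, batch_size = 0.05, 32
for epoch in range(1000):
    idx = rng.permutation(len(X))
    for start in range(0, len(idx), batch_size):
        b = idx[start:start + batch_size]
        # forward pass with a tanh hidden layer
        h = np.tanh(X[b] @ W1 + b1)
        out = h @ W2 + b2
        # backward pass for MSE loss
        d_out = 2 * (out - y[b]) / len(b)
        dW2 = h.T @ d_out; db2 = d_out.sum(axis=0)
        d_h = (d_out @ W2.T) * (1 - h ** 2)      # tanh derivative
        dW1 = X[b].T @ d_h; db1 = d_h.sum(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

mse = np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)
print(f"final training MSE: {mse:.4f}")
```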

hw6

Homework 6 is k-nearest neighbour. Implemented with NumPy, the model performs very well on the Breast Cancer Wisconsin dataset.

The code is in hw6.py, and the report is in hw6.pdf.
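
A minimal sketch of k-nearest-neighbour classification on the Breast Cancer Wisconsin dataset is shown below; the sklearn loader, the train/test split, and k = 5 are assumptions for illustration.

```python
# Minimal sketch: kNN classification on Breast Cancer Wisconsin (illustrative setup).
import numpy as np
from sklearn.datasets import load_breast_cancer

data = load_breast_cancer()
X, y = data.data, data.target
X = (X - X.mean(axis=0)) / X.std(axis=0)       # standardize features

rng = np.random.default_rng(0)
idx = rng.permutation(len(X))
split = int(0.8 * len(X))
X_train, y_train = X[idx[:split]], y[idx[:split]]
X_test, y_test = X[idx[split:]], y[idx[split:]]

def knn_predict(x, k=5):
    # Euclidean distances from x to every training sample
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]                 # indices of the k closest points
    return np.bincount(y_train[nearest]).argmax()   # majority vote

pred = np.array([knn_predict(x) for x in X_test])
print("test accuracy:", (pred == y_test).mean())
```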

hw7

Homework 7 is the decision tree. It uses the Contact Lenses Dataset to test the model; the implementation is only the most basic version.

The code is in hw7.py, and the report is in hw7.pdf.
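
The sketch below builds an ID3-style decision tree on categorical features in the spirit of the contact-lenses task; the toy rows are hypothetical examples, not the actual Contact Lenses Dataset used by hw7.py.

```python
# Minimal sketch: ID3-style decision tree on categorical features (toy data).
import numpy as np

# columns: age, prescription, astigmatic, tear rate -> lens type (hypothetical rows)
data = np.array([
    ["young",      "myope",        "no",  "reduced", "none"],
    ["young",      "myope",        "no",  "normal",  "soft"],
    ["young",      "myope",        "yes", "normal",  "hard"],
    ["presbyopic", "hypermetrope", "no",  "reduced", "none"],
    ["presbyopic", "hypermetrope", "no",  "normal",  "soft"],
    ["presbyopic", "myope",        "yes", "normal",  "hard"],
])
X, y = data[:, :-1], data[:, -1]
feature_names = ["age", "prescription", "astigmatic", "tear_rate"]

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def build(X, y, features):
    # stop when the node is pure or no features remain; return the majority label
    if len(np.unique(y)) == 1 or not features:
        values, counts = np.unique(y, return_counts=True)
        return values[counts.argmax()]
    # choose the feature with the largest information gain
    gains = []
    for f in features:
        col = X[:, feature_names.index(f)]
        cond = sum(np.mean(col == v) * entropy(y[col == v]) for v in np.unique(col))
        gains.append(entropy(y) - cond)
    best = features[int(np.argmax(gains))]
    col = X[:, feature_names.index(best)]
    rest = [f for f in features if f != best]
    return {best: {v: build(X[col == v], y[col == v], rest) for v in np.unique(col)}}

tree = build(X, y, feature_names)
print(tree)   # nested dict representation of the tree
```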

hw8

Homework 8 is a multi-class MLP. It uses the optical digits dataset to test the model, with detailed analysis of the results.

The code is in hw8.py, and the report is in hw8.pdf.
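
Here is a minimal sketch of a multi-class MLP with a softmax output trained on the optical digits data; the sklearn loader, layer sizes, and training schedule are assumptions for illustration, not the setup of hw8.py.

```python
# Minimal sketch: multi-class MLP with softmax output on optical digits (illustrative setup).
import numpy as np
from sklearn.datasets import load_digits

digits = load_digits()
X = digits.data / 16.0                        # scale 8x8 pixel values to [0, 1]
y = np.eye(10)[digits.target]                 # one-hot labels

rng = np.random.default_rng(0)
hidden = 64
W1 = rng.normal(0, 0.1, (64, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(0, 0.1, (hidden, 10)); b2 = np.zeros(10)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)      # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

lr, batch_size = 0.1, 64
for epoch in range(100):
    idx = rng.permutation(len(X))
    for start in range(0, len(idx), batch_size):
        b = idx[start:start + batch_size]
        h = np.maximum(X[b] @ W1 + b1, 0)     # ReLU hidden layer
        p = softmax(h @ W2 + b2)
        d_out = (p - y[b]) / len(b)           # gradient of softmax cross-entropy
        dW2 = h.T @ d_out; db2 = d_out.sum(axis=0)
        d_h = (d_out @ W2.T) * (h > 0)        # ReLU derivative
        dW1 = X[b].T @ d_h; db1 = d_h.sum(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

pred = np.argmax(softmax(np.maximum(X @ W1 + b1, 0) @ W2 + b2), axis=1)
print("training accuracy:", (pred == digits.target).mean())
```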

hw9

Homework 9 is dimensionality reduction and autoencoders. It implements PCA, a linear autoencoder, and a nonlinear autoencoder. The analysis shows that this kind of dimensionality reduction works well in practice.

The code is in hw9.py, and the report is in hw9.pdf.
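
As one piece of the above, here is a minimal sketch of PCA via the eigen-decomposition of the covariance matrix; the random data and number of components are illustrative, and the linear and nonlinear autoencoders of hw9.py are not shown.

```python
# Minimal sketch: PCA by eigen-decomposition of the covariance matrix (illustrative data).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2)) @ rng.normal(size=(2, 10))  # 10-D data lying on a 2-D subspace

X_centered = X - X.mean(axis=0)
cov = X_centered.T @ X_centered / (len(X) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]               # sort components by explained variance
components = eigvecs[:, order[:2]]              # keep the top 2 principal directions

Z = X_centered @ components                     # project to 2 dimensions
X_rec = Z @ components.T + X.mean(axis=0)       # reconstruct back to 10-D
print("reconstruction error:", np.mean((X - X_rec) ** 2))
```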
