Based on the paper:
Wang-Cheng Kang and Julian McAuley (2018). Self-Attentive Sequential Recommendation. In Proceedings of the IEEE International Conference on Data Mining (ICDM '18).
https://cseweb.ucsd.edu/~jmcauley/pdfs/icdm18.pdf
Modified from the original repo (https://github.com/kang205/SASRec) and its TF 2.x port (https://github.com/nnkkmto/SASRec-tf2).
All the Amazon product datasets can be downloaded and preprocessed with the script download_and_process_amazon.py.
- SASRec: original Transformer-based sequential recommender
- SSEPT: Transformer with user embeddings
- SASRec++: Transformer with item embeddings learned from a GCN
- TiSASRec: time-interval-aware SASRec
- RNN: RNN-based sequence prediction
- HSASRec: hierarchical SASRec, using embeddings of the user's previous history
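All of the Transformer variants above share the same core operation: causal self-attention over the sequence of item embeddings, so that the representation at step t attends only to items at positions <= t. A minimal single-head NumPy sketch of that block (illustration only; the actual models add learned projection matrices, multiple heads, dropout, layer normalization, and a feed-forward layer, and the function name here is ours, not the repo's):

```python
import numpy as np

def causal_self_attention(seq_emb):
    """Single-head scaled dot-product self-attention with a causal mask.

    seq_emb: (maxlen, hidden_units) array of item embeddings.
    Returns an array of the same shape, where row t is a weighted
    average of rows 0..t of seq_emb.
    """
    maxlen, d = seq_emb.shape
    # Attention scores between every pair of positions, scaled by sqrt(d).
    scores = seq_emb @ seq_emb.T / np.sqrt(d)          # (maxlen, maxlen)
    # Mask out future positions: position t may not attend to t+1, t+2, ...
    future = np.triu(np.ones((maxlen, maxlen), dtype=bool), k=1)
    scores[future] = -np.inf
    # Row-wise softmax over the remaining (past and present) positions.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ seq_emb                           # (maxlen, hidden_units)

rng = np.random.default_rng(0)
out = causal_self_attention(rng.normal(size=(50, 100)))  # maxlen=50, hidden_units=100
print(out.shape)  # (50, 100)
```

Because of the mask, the output at position 0 equals its input embedding exactly (it can attend only to itself), which mirrors why these models can score the next item at every timestep without leaking future interactions.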
- python main-tf2.py --dataset=ae --train_dir=default --maxlen=50 --dropout_rate=0.5 --lr=0.001 --hidden_units=100 --num_epochs=50 --text_features=1 (for SASRec)
- python main-tf2.py --dataset=ae_v2 --train_dir=default --maxlen=200 --dropout_rate=0.5 --lr=0.001 --hidden_units=100 --num_epochs=50 --add_embeddings=1 (for SASRec++)
- python main-tf2.py --dataset=ae_v2 --train_dir=default --maxlen=200 --dropout_rate=0.5 --lr=0.001 --hidden_units=100 --num_epochs=50 --model_name=ssept (for SSEPT)
- python main-tf2.py --dataset=ae_graph --train_dir=default --maxlen=50 --dropout_rate=0.5 --lr=0.001 --hidden_units=100 --num_epochs=50 --add_history=1 --model_name=hsasrec (for HSASRec)