Classic books (Baidu Cloud, extraction code: b5qq)

- Deep Learning. Essential reading for deep learning. Book link
- Speech and Language Processing (3rd ed.), Stanford. Essential reading for NLP. Book link
- Neural Networks and Deep Learning. Essential introductory reading. Book link
- Neural Networks and Deep Learning, by Prof. Xipeng Qiu, Fudan University. Book link
- CS224d: Deep Learning for Natural Language Processing. Course slides

Must-read papers

- EDA: Easy Data Augmentation Techniques for Boosting Performance on Text Classification Tasks. Link (a minimal EDA sketch follows this list)
- A Neural Probabilistic Language Model. Link
- Transformer. Link (a scaled dot-product attention sketch follows this list)
- Transformer-XL. Link
- Convolutional Neural Networks for Sentence Classification. Link (a TextCNN sketch follows this list)
- Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification. Link
- A Question-Focused Multi-Factor Attention Network for Question Answering. Link
- AutoCross: Automatic Feature Crossing for Tabular Data in Real-World Applications. Link
- GloVe: Global Vectors for Word Representation. Official site (a loading sketch follows this list)
- A Deep Ensemble Model with Slot Alignment for Sequence-to-Sequence Natural Language Generation. Link
- The Design and Implementation of XiaoIce, an Empathetic Social Chatbot. Link
- A Knowledge-Grounded Neural Conversation Model. Link
- Neural Generative Question Answering. Link
- A Sensitivity Analysis of (and Practitioners’ Guide to) Convolutional Neural Networks for Sentence Classification. Link
- ImageNet Classification with Deep Convolutional Neural Networks. Link
- Network In Network. Link, Chinese translation
- Long Short-Term Memory. Link
- Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation. Link
- Get To The Point: Summarization with Pointer-Generator Networks. Link
- Generative Adversarial Text to Image Synthesis. Link
- Image-to-Image Translation with Conditional Adversarial Networks. Link
- Photo-Realistic Single Image Super-Resolution Using a Generative Adversarial Network. Link
- Unsupervised Learning of Visual Structure using Predictive Generative Networks. Link
- Learning to Rank Short Text Pairs with Convolutional Deep Neural Networks. Link
- Event Extraction via Dynamic Multi-Pooling Convolutional Neural Networks. Link
- Low-Memory Neural Network Training: A Technical Report. Link
- Language Models are Unsupervised Multitask Learners. Link
- Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context. Link
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Link
- SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient. Link
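
For the EDA entry above, a minimal sketch of two of its four operations (random swap and random deletion; synonym replacement and random insertion are omitted). The defaults `p=0.1` and `n=1` are illustrative, not tuned values from the paper:

```python
import random

def random_deletion(words, p=0.1):
    """Drop each word with probability p (EDA's random deletion)."""
    if len(words) == 1:
        return words
    kept = [w for w in words if random.random() > p]
    return kept if kept else [random.choice(words)]  # never return an empty sentence

def random_swap(words, n=1):
    """Swap two randomly chosen positions, n times (EDA's random swap)."""
    words = list(words)
    for _ in range(n):
        i, j = random.sample(range(len(words)), 2)
        words[i], words[j] = words[j], words[i]
    return words

sentence = "the quick brown fox jumps over the lazy dog".split()
print(random_deletion(sentence))
print(random_swap(sentence, n=2))
```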
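For the Transformer entries, the core computation is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal NumPy sketch; the shapes in the example are illustrative:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, the building block of the Transformer."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n_q, n_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted sum of values

Q = np.random.randn(2, 8)   # 2 queries, d_k = 8
K = np.random.randn(5, 8)   # 5 keys
V = np.random.randn(5, 16)  # 5 values, d_v = 16
print(scaled_dot_product_attention(Q, K, V).shape)  # (2, 16)
```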
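For "Convolutional Neural Networks for Sentence Classification", a minimal Keras sketch of the TextCNN architecture it proposes: parallel 1-D convolutions with several window sizes, max-over-time pooling, dropout, then a softmax classifier. All hyperparameters below are illustrative, not the paper's tuned values:

```python
from tensorflow.keras import layers, models

def build_textcnn(vocab_size=20000, max_len=100, embed_dim=128, num_classes=2):
    """TextCNN: parallel convolutions (windows 3/4/5) + max-over-time pooling."""
    inputs = layers.Input(shape=(max_len,))
    x = layers.Embedding(vocab_size, embed_dim)(inputs)
    pooled = []
    for size in (3, 4, 5):                          # window sizes over word positions
        conv = layers.Conv1D(100, size, activation="relu")(x)
        pooled.append(layers.GlobalMaxPooling1D()(conv))
    x = layers.Concatenate()(pooled)
    x = layers.Dropout(0.5)(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    return models.Model(inputs, outputs)

model = build_textcnn()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```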
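For GloVe, pretrained vectors from the official site ship as plain text (one word per line followed by its components) and can be loaded without any special library. The file name below is one of the standard downloads and is an assumption, not a requirement:

```python
import numpy as np

# Assumed local path to a standard GloVe download (e.g. glove.6B.100d.txt).
embeddings = {}
with open("glove.6B.100d.txt", encoding="utf-8") as f:
    for line in f:
        parts = line.rstrip().split(" ")
        embeddings[parts[0]] = np.asarray(parts[1:], dtype="float32")

def cosine(a, b):
    """Cosine similarity between two word vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(embeddings["king"], embeddings["queen"]))
```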

Blog posts and tutorials

- The Illustrated Transformer. Blog post
- Attention-based model. Link
- KL divergence. Link (a small numeric example follows this list)
- Building Autoencoders in Keras. Link
- Modern Deep Learning Techniques Applied to Natural Language Processing. Link
- Node2vec embeddings for graph data. Link
- BERT explained (in Chinese). Link 1, Link 2
- Incredibly clear walkthrough of LSTMs and GRUs (animations + video, in Chinese). Link
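
For the KL divergence entry, a small numeric example of D_KL(P || Q) = Σ p_i log(p_i / q_i), which is non-negative and asymmetric. The `eps` term is a pragmatic guard against log(0), not part of the definition:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D_KL(P || Q) for discrete distributions given as probability vectors."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # small positive value
print(kl_divergence(q, p))  # different value: KL is not symmetric
```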

Implemented algorithms

- fasttext (skipgram + cbow)
- gensim (word2vec) (a training sketch follows this list)
- eda
- svm
- fasttext
- textcnn
- bilstm+attention
- rcnn
- han
- bilstm+crf
- siamese
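
A minimal training sketch for the gensim (word2vec) item, using the gensim 4.x API (`vector_size`/`epochs`; older versions used `size`/`iter`). The toy corpus exists only to make the call runnable; real training needs far more data:

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences.
sentences = [
    ["natural", "language", "processing"],
    ["deep", "learning", "for", "language"],
    ["language", "models", "learn", "word", "vectors"],
]

# sg=1 selects skip-gram, sg=0 selects CBOW; both modes appear in the list above.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)
print(model.wv.most_similar("language", topn=2))
```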

Related GitHub projects

- keras-gpt-2. Link
- textClassifier. Link
- attention-is-all-you-need-keras. Link
- BERT_with_keras. Link
- SeqGAN. Link

Conferences

- Association for Computational Linguistics. ACL
- Empirical Methods in Natural Language Processing. EMNLP
- International Conference on Computational Linguistics. COLING
- Neural Information Processing Systems. NIPS (now NeurIPS)
- AAAI Conference on Artificial Intelligence. AAAI
- International Joint Conference on Artificial Intelligence. IJCAI
- International Conference on Machine Learning. ICML