- a good introduction to and overview of CNNs, in three steps:
- a good introduction to BERT and how to use it:
https://arxiv.org/pdf/1810.04805.pdf
https://huggingface.co/transformers/model_doc/bert.html
https://jalammar.github.io/illustrated-bert/
- an example of how to fine-tune it:
https://medium.com/swlh/painless-fine-tuning-of-bert-in-pytorch-b91c14912caa
- Stanford Deep NLP 2019:
sessions 1-2: a good starting point and a general view of the fundamentals of deep natural language processing.
session 3: an introduction to neural networks
session 4: goes a little deeper into the basics of backpropagation
session 5: dependency parsing (I do not suggest this unless parsing is the task you are working on.)
sessions 6-8: these are about RNNs. They discuss the different variants and, following one main thread, shed light on the methods that solved their challenges. The material is presented in a well-defined structure with an appropriate amount of mathematical fundamentals. These are all too delicious to skip :)
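The core recurrence those RNN sessions build on fits in a few lines of NumPy. A minimal sketch of a vanilla RNN forward pass, h_t = tanh(x_t·W_xh + h_{t-1}·W_hh + b); the dimensions, seed, and weight scale here are arbitrary illustrations, not values from the course:

```python
import numpy as np

# Vanilla RNN forward pass: h_t = tanh(x_t @ W_xh + h_{t-1} @ W_hh + b)
rng = np.random.default_rng(0)

input_size, hidden_size, seq_len = 4, 3, 5
W_xh = rng.standard_normal((input_size, hidden_size)) * 0.1
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1
b = np.zeros(hidden_size)

xs = rng.standard_normal((seq_len, input_size))  # a toy input sequence
h = np.zeros(hidden_size)                        # initial hidden state

states = []
for x_t in xs:
    h = np.tanh(x_t @ W_xh + h @ W_hh + b)  # same weights reused at every step
    states.append(h)

print(len(states), states[-1].shape)  # 5 (3,)
```

The weight reuse across time steps is exactly what makes training tricky (vanishing/exploding gradients), which is the challenge the later sessions address with LSTMs and GRUs.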
- Andrew Ng ML:
week 10: a set of introductions to large-scale machine learning, including a method for applying ML algorithms in a map-reduce-based framework.
week 11: an introduction to OCR and image processing. It really helped me get much more interested in ML than before; actually, since then I have been more of a general ML fan than just a text-mining one, though text mining was my first love in computer science.
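The map-reduce idea from week 10 can be sketched in plain Python: because the batch gradient of squared error is a sum over examples, each mapper can compute a partial gradient on its own shard and a reducer just sums them. The toy data and function names below are my own illustration of that idea:

```python
# Map-reduce style batch gradient for linear regression y_hat = w * x:
# the squared-error gradient is a sum over examples, so shards can be
# mapped independently and their partial sums reduced.

def partial_gradient(shard, w):
    # "map" step: partial sum of 2*(w*x - y)*x over one data shard
    return sum(2 * (w * x - y) * x for x, y in shard)

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]  # toy data, y = 2x
shards = [data[:2], data[2:]]  # pretend these live on two machines
w = 0.5

full = partial_gradient(data, w)                           # single-machine gradient
distributed = sum(partial_gradient(s, w) for s in shards)  # "reduce" step

print(full == distributed)  # True: the two computations agree exactly
```

The equality holds because addition is associative, which is the whole reason map-reduce applies to batch gradient descent.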
- Deep Sentiment Analysis: https://towardsdatascience.com/fine-grained-sentiment-analysis-in-python-part-1-2697bb111ed4