Home Page and Blog of the Multilingual NLP course @ Sapienza University of Rome
Introduction to attention in deep learning: motivation, attention scores, and main approaches. The Transformer architecture. Introduction to BERT.
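Since the lecture covers attention scores and the Transformer, here is a minimal sketch of scaled dot-product attention, the core operation of the Transformer. This is an illustrative NumPy implementation, not course material; function and variable names are my own.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention (illustrative sketch).

    Q, K: arrays of shape (seq_len, d_k); V: array of shape (seq_len, d_v).
    Returns an array of shape (seq_len, d_v).
    """
    d_k = Q.shape[-1]
    # Attention scores: query-key similarity, scaled by sqrt(d_k) for stability
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the key axis turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output position is a weighted sum of the value vectors
    return weights @ V

# Toy usage with random queries, keys, and values
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

Each row of the softmaxed weight matrix sums to 1, so every output vector is a convex combination of the value vectors.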