Home Page and Blog of the Multilingual NLP course @ Sapienza University of Rome
Pre-training and fine-tuning. Encoder-only, decoder-only, and encoder-decoder pre-trained models: GPT and BERT. The masked language modeling and next-sentence prediction pre-training objectives. Practical session on the Transformer architecture with BERT.
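As a companion to the masked language modeling objective mentioned above, here is a minimal sketch of BERT-style input corruption: each token is selected for prediction with probability 15%, and a selected token is replaced by `[MASK]` 80% of the time, by a random token 10% of the time, and kept unchanged 10% of the time. The toy vocabulary and function name are illustrative, not part of any library.

```python
import random

MASK = "[MASK]"
VOCAB = ["the", "cat", "dog", "sat", "on", "mat"]  # toy vocabulary for illustration

def mask_tokens(tokens, mask_prob=0.15, rng=None):
    """BERT-style masking: pick each token with probability `mask_prob`;
    replace it with [MASK] (80%), a random vocab token (10%), or keep it (10%).
    Returns the corrupted sequence and per-position prediction targets
    (None where no loss is computed)."""
    rng = rng or random.Random(0)
    corrupted, targets = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            targets.append(tok)  # the model must recover the original token here
            r = rng.random()
            if r < 0.8:
                corrupted.append(MASK)
            elif r < 0.9:
                corrupted.append(rng.choice(VOCAB))
            else:
                corrupted.append(tok)
        else:
            targets.append(None)  # position excluded from the MLM loss
            corrupted.append(tok)
    return corrupted, targets

tokens = ["the", "cat", "sat", "on", "the", "mat"]
corrupted, targets = mask_tokens(tokens, rng=random.Random(42))
```

In practice (as in the practical session), tokenization and masking are handled by a library such as Hugging Face Transformers; this sketch only illustrates the corruption rule that the MLM objective trains against.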