Friday, May 13, 2022

Lecture 19b (12/5/2022, 3 hours): pre-trained Transformer models and Transformer notebook

Pre-training and fine-tuning. Encoder-only, decoder-only, and encoder-decoder pre-trained models. GPT (decoder-only) and BERT (encoder-only). The masked language modeling and next-sentence prediction pre-training tasks. Practical session on the Transformer with BERT.
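As a concrete illustration of the masked language modeling objective mentioned above, here is a minimal sketch (not the code from the lecture notebook) of BERT-style input masking: roughly 15% of positions are selected as prediction targets, and each selected token is replaced by a [MASK] token 80% of the time, by a random vocabulary token 10% of the time, and left unchanged 10% of the time. The token list and vocabulary are made up for the example.

```python
import random

MASK = "[MASK]"
VOCAB = ["the", "cat", "sat", "on", "mat"]  # toy vocabulary for illustration

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """BERT-style masking. Returns the corrupted token sequence and a
    parallel list of labels: the original token at masked positions,
    None elsewhere. The model is trained to predict the labels."""
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:  # select ~15% of positions
            labels.append(tok)        # prediction target
            r = rng.random()
            if r < 0.8:
                masked.append(MASK)              # 80%: replace with [MASK]
            elif r < 0.9:
                masked.append(rng.choice(VOCAB)) # 10%: random token
            else:
                masked.append(tok)               # 10%: keep original
        else:
            labels.append(None)  # not a target
            masked.append(tok)
    return masked, labels

masked, labels = mask_tokens(["the", "cat", "sat", "on", "the", "mat"])
```

The 80/10/10 split keeps the model from relying on the [MASK] token being present at test time, since [MASK] never appears in downstream fine-tuning data.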
