Wednesday, April 26, 2023

Lecture 14 (21/04/2023, 4.5 hours): more on the Transformer, pre-trained language models; introduction to lexical semantics

More on the Transformer architecture. Pre-trained language models: BERT, GPT, RoBERTa, XLM. Introduction to lexical semantics: meaning representations, WordNet, BabelNet. Neurosymbolic NLP.

Tuesday, April 18, 2023

Lecture 13 (17/04/2023, 2 hours, E): notebook on LSTMs

Q&A on Homework 1, brief introduction to Part-of-Speech tagging, recap of LSTMs, notebook on Part-of-Speech tagging with LSTMs, best practices for data preprocessing and the training procedure.
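As an illustration of the preprocessing steps such a notebook typically covers, here is a minimal sketch of vocabulary building and batch padding for token sequences. All names and the toy sentences are illustrative, not taken from the course notebook.

```python
from collections import Counter

PAD, UNK = "<pad>", "<unk>"

def build_vocab(sentences, min_freq=1):
    # Count token frequencies over the training sentences and assign ids;
    # reserved ids 0/1 go to the padding and unknown-word symbols.
    counts = Counter(tok for sent in sentences for tok in sent)
    vocab = {PAD: 0, UNK: 1}
    for tok, freq in counts.items():
        if freq >= min_freq:
            vocab[tok] = len(vocab)
    return vocab

def encode_and_pad(batch, vocab):
    # Map tokens to ids (unknown words -> UNK) and pad every sentence
    # in the batch to the length of the longest one.
    max_len = max(len(sent) for sent in batch)
    ids = []
    for sent in batch:
        row = [vocab.get(tok, vocab[UNK]) for tok in sent]
        row += [vocab[PAD]] * (max_len - len(row))
        ids.append(row)
    return ids

train = [["the", "cat", "sleeps"], ["dogs", "bark"]]
vocab = build_vocab(train)
batch = encode_and_pad(train, vocab)
```

The padded id matrix can then be fed to an embedding layer and an LSTM, with the PAD positions masked out of the loss.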

Lecture 12 (14/04/2023, 4.5 hours): neural language modeling, the attention mechanism, the Transformer

Neural language modeling. Context2vec. Neural language models with BiLSTMs. Contextualized word representations. Introduction to the attention mechanism. Introduction to the Transformer architecture.
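As a reminder of the mechanism the lecture introduces, here is a minimal NumPy sketch of scaled dot-product attention, the building block of the Transformer; the function name and toy shapes are illustrative.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # numerically stable row-wise softmax over the keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# toy example: 3 query vectors attending over 4 key/value pairs
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each output row is a convex combination of the value vectors, weighted by how strongly the corresponding query matches each key.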

Friday, April 14, 2023

Lecture 11 (03/04/2023, 2 hours): computational lexical semantics

Introduction to lexical semantics. The lexicon: lemmas and word forms. Introduction to the notion of concepts, the triangle of meaning, concepts vs. named entities. Word senses: monosemy vs. polysemy. Key tasks for Natural Language Understanding: Word Sense Disambiguation (within lexical semantics), Semantic Role Labeling and Semantic Parsing (sentence-level).