Thursday, May 28, 2020

Lecture 25 (28/05/2020, Google meet, 3 hours): machine translation and homework 3

Introduction to machine translation (MT) and the history of MT. Overview of statistical MT (SMT). The EM algorithm for word alignment in SMT. Beam search for decoding. Introduction to neural machine translation (NMT): the encoder-decoder architecture, its advantages and results; back-translation; byte-pair encoding. The BLEU evaluation score. Performance and recent improvements. Attention in NMT. Unsupervised machine translation. MASS.
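To make one of these topics concrete, here is a minimal byte-pair encoding (BPE) sketch in Python, in the spirit of Sennrich et al. (2016); the toy corpus is the classic low/lower/newest/widest example, and the code is an illustration, not material shown in the lecture.

from collections import Counter

def get_pair_counts(vocab):
    # Count adjacent symbol pairs over a {word-as-symbol-tuple: frequency} vocabulary.
    pairs = Counter()
    for symbols, freq in vocab.items():
        for pair in zip(symbols, symbols[1:]):
            pairs[pair] += freq
    return pairs

def merge_pair(pair, vocab):
    # Replace every occurrence of the given pair with the concatenated symbol.
    merged = {}
    for symbols, freq in vocab.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

# Words split into characters, with an end-of-word marker.
vocab = {('l', 'o', 'w', '</w>'): 5,
         ('l', 'o', 'w', 'e', 'r', '</w>'): 2,
         ('n', 'e', 'w', 'e', 's', 't', '</w>'): 6,
         ('w', 'i', 'd', 'e', 's', 't', '</w>'): 3}
for step in range(10):  # the number of merges is a hyperparameter
    pairs = get_pair_counts(vocab)
    if not pairs:
        break
    best = max(pairs, key=pairs.get)
    vocab = merge_pair(best, vocab)
    print(step, best)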


End of the course!


Tuesday, May 26, 2020

Lecture 24 (26/05/2020, Google meet, 2 hours): semantic parsing

Semantic parsing: definition, comparison to Semantic Role Labeling, overview of approaches, and a recent approach in detail. The Abstract Meaning Representation (AMR) and Universal Conceptual Cognitive Annotation (UCCA) formalisms.
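For reference, the textbook AMR for "The boy wants to go" in PENMAN notation, parsed here with the third-party penman Python package (an assumption: install it with pip install penman):

import penman

amr = '''
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-01
            :ARG0 b))
'''
graph = penman.decode(amr)
for source, role, target in graph.triples:
    print(source, role, target)
# e.g. ('w', ':instance', 'want-01'), ('w', ':ARG0', 'b'), ...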


Thursday, May 21, 2020

Lecture 23 (21/05/2020, Google meet, 3 hours): WSD and Semantic Role Labeling

Issues with WSD: the knowledge acquisition bottleneck and silver data generation. From word to sentence representations. Semantic roles. Resources: PropBank, VerbNet, FrameNet. Semantic Role Labeling (SRL): traditional features. State-of-the-art neural approaches.
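As a pointer to what SRL produces, a hand-written example of PropBank-style output (the predicate sense and role labels follow PropBank conventions; the analysis is illustrative, not the output of any particular system):

sentence = "The commission approved the plan yesterday"
srl = {
    "predicate": "approved",   # PropBank sense approve.01
    "ARG0": "The commission",  # agent: the approver
    "ARG1": "the plan",        # theme: the thing approved
    "ARGM-TMP": "yesterday",   # temporal modifier
}
for role, span in srl.items():
    print(f"{role:>9}: {span}")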

Thursday, May 14, 2020

Lecture 21 (14/05/2020, Google meet, 3 hours): Word Sense Disambiguation

Introduction to Word Sense Disambiguation. Elements necessary for performing WSD. Supervised vs. unsupervised vs. knowledge-based WSD. Supervised WSD techniques. Neural WSD: LSTM and BERT-based approaches. Integration of knowledge and supervision.
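A quick knowledge-based WSD example using the simplified Lesk algorithm shipped with NLTK (assumes the wordnet and punkt data have been fetched via nltk.download):

from nltk.tokenize import word_tokenize
from nltk.wsd import lesk

context = word_tokenize("I went to the bank to deposit my money")
sense = lesk(context, "bank", pos="n")  # returns a WordNet Synset or None
print(sense, "-", sense.definition() if sense else "no sense found")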

Tuesday, May 12, 2020

Lecture 20 (12/05/2020, Google meet, 2 hours): XLM, XLNet, RoBERTa; Natural Language Understanding: Semantic Role Labeling; homework

XLNet, RoBERTa, XLM, XLM-R. The GLUE and SuperGLUE benchmarks. Introduction to Natural Language Understanding (NLU): Word Sense Disambiguation, Semantic Role Labeling, Semantic Parsing.
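A sketch of running one GLUE-style sentence-pair task (MRPC paraphrase detection) with Hugging Face Transformers; the checkpoint name below is an assumption, and any MRPC-fine-tuned model would do:

import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

name = "textattack/roberta-base-MRPC"  # hypothetical choice of checkpoint
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)
inputs = tok("The company fired its CEO.",
             "The CEO was dismissed by the company.",
             return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(-1))  # class probabilities; label order depends on the checkpoint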


Thursday, May 7, 2020

Lecture 19 (07/05/2020, Google meet, 3 hours): Transformer (2/2) and BERT

The Transformer's encoder and decoder. Positional embeddings. BERT. Notebooks on BERT. Sense embeddings with WordNet and SemCor.
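The sinusoidal positional embeddings from "Attention Is All You Need", sketched in NumPy: PE(pos, 2i) = sin(pos / 10000^(2i/d)) and PE(pos, 2i+1) = cos(pos / 10000^(2i/d)).

import numpy as np

def positional_encoding(max_len, d_model):
    # Assumes an even d_model; returns a (max_len, d_model) matrix.
    pos = np.arange(max_len)[:, None]                # (max_len, 1)
    i = np.arange(d_model // 2)[None, :]             # (1, d_model/2)
    angles = pos / np.power(10000, 2 * i / d_model)  # (max_len, d_model/2)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # even dimensions
    pe[:, 1::2] = np.cos(angles)  # odd dimensions
    return pe

print(positional_encoding(50, 512).shape)  # (50, 512)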

Tuesday, May 5, 2020

Lecture 18 (05/05/2020, Google meet, 2 hours): bilingual embeddings, contextualized word embeddings, ELMo, the Transformer

More on semantic vector representations. Bilingual and multilingual embeddings. Contextualized word embeddings. ELMo. The Transformer architecture.
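The core operation of the Transformer, scaled dot-product attention, as a minimal NumPy sketch: Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V.

import numpy as np

def attention(Q, K, V):
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (n_queries, n_keys) similarities
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)  # row-wise softmax
    return weights @ V  # each output is a weighted average of the values

Q, K, V = np.random.randn(3, 64), np.random.randn(5, 64), np.random.randn(5, 64)
print(attention(Q, K, V).shape)  # (3, 64)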

Thursday, April 30, 2020

Lecture 17 (30/04/2020, Google meet, 3 hours): BabelNet and sense embeddings

More on BabelNet. Introduction to semantic vector representations: motivation and examples. Semantic vector representations: the importance of multilinguality; linkage to BabelNet; latent vs. explicit representations; monolingual vs. multilingual representations. The NASARI lexical, unified and embedded representations.
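Embedded sense representations such as NASARI's are typically compared by cosine similarity; a minimal sketch with random stand-in vectors (real NASARI vectors would be loaded from the released files):

import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

bank_financial = np.random.randn(300)  # stand-in for a real sense embedding
bank_river = np.random.randn(300)      # stand-in for another sense's embedding
print(cosine(bank_financial, bank_river))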