Monday, June 4, 2018

Lecture 26 (01/06/2018): neural machine translation + end of the course!

The EM algorithm for word alignment in SMT. Beam search for decoding. Introduction to neural machine translation: the encoder-decoder neural architecture; back translation; byte pair encoding. The BLEU evaluation score. Performance and recent improvements. End of the course!
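The EM algorithm for word alignment can be sketched in a few lines for the simplest case (IBM Model 1). The following toy example, with an invented three-sentence parallel corpus, shows the E-step (expected alignment counts) and M-step (re-estimation of the translation table t(f|e)); it is an illustrative sketch, not a full SMT pipeline.

```python
from collections import defaultdict

# Invented toy parallel corpus (English -> Spanish), for illustration only.
corpus = [
    ("the house".split(), "la casa".split()),
    ("the book".split(), "el libro".split()),
    ("a book".split(), "un libro".split()),
]

# Uniform initialization of translation probabilities t(f|e).
e_vocab = {e for es, _ in corpus for e in es}
f_vocab = {f for _, fs in corpus for f in fs}
t = {(f, e): 1.0 / len(f_vocab) for f in f_vocab for e in e_vocab}

for _ in range(10):                  # EM iterations
    count = defaultdict(float)       # expected counts c(f, e)
    total = defaultdict(float)       # expected counts c(e)
    # E-step: collect expected alignment counts under the current t.
    for es, fs in corpus:
        for f in fs:
            z = sum(t[(f, e)] for e in es)   # normalize over alignments
            for e in es:
                delta = t[(f, e)] / z
                count[(f, e)] += delta
                total[e] += delta
    # M-step: re-estimate t(f|e) from the expected counts.
    for (f, e) in t:
        t[(f, e)] = count[(f, e)] / total[e] if total[e] else 0.0
```

After a few iterations, words that consistently co-occur (e.g. "libro" with "book") accumulate most of the probability mass, while spurious pairings fade.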

Lecture 25 (31/05/2018): bilingual/multilingual embeddings; semantic parsing

Bilingual and multilingual embeddings. Offline vs. online embeddings. Semantic parsing: definition, comparison to Semantic Role Labeling, approaches, a recent approach in detail. The Abstract Meaning Representation formalism. Introduction to machine translation (MT) and history of MT. Overview of statistical MT.
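A common offline approach to bilingual embeddings learns a linear map from the source embedding space to the target one using a seed dictionary; with an orthogonality constraint this is the orthogonal Procrustes problem, solved in closed form via SVD. A minimal sketch with invented toy vectors (the seed pairs and dimensions are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (invented data): source embeddings X and target embeddings Y
# for 5 seed translation pairs in dimension 4, where Y is a hidden
# rotation of X.
X = rng.normal(size=(5, 4))
R_hidden = np.linalg.qr(rng.normal(size=(4, 4)))[0]  # ground-truth rotation
Y = X @ R_hidden

# Orthogonal Procrustes: the orthogonal W minimizing ||XW - Y||_F is
# W = U V^T, where X^T Y = U S V^T is a singular value decomposition.
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

# W recovers the hidden rotation, mapping source vectors onto target ones.
assert np.allclose(X @ W, Y, atol=1e-6)
```

In practice X and Y would be pre-trained monolingual embeddings restricted to a seed lexicon, and W would then be applied to the whole source vocabulary.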

Friday, May 25, 2018

Lecture 24 (25/05/2018): semantic roles and semantic role labeling

From word to sentence representations. Semantic roles. Resources: PropBank, VerbNet, FrameNet. Semantic Role Labeling (SRL): traditional features. State-of-the-art neural approaches.

Wednesday, May 23, 2018

Lecture 23 (24/05/2018): issues in WSD; the knowledge acquisition bottleneck; sense distribution learning

Issues in Word Sense Disambiguation. Addressing the knowledge acquisition bottleneck for improving lexical semantic tasks. Learning sense distributions.

Sunday, May 20, 2018

Lecture 22 (18/05/2018): unsupervised and knowledge-based WSD

Unsupervised Word Sense Disambiguation: Word Sense Induction. Context-based clustering. Co-occurrence graphs: curvature clustering, HyperLex. Knowledge-based Word Sense Disambiguation. The Lesk and Extended Lesk algorithms. Structural approaches: similarity measures and graph algorithms. Conceptual density. Structural Semantic Interconnections. Evaluation: precision, recall, F1, accuracy. Baselines. Entity Linking.
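The core idea of the (simplified) Lesk algorithm can be sketched in a few lines: choose the sense whose dictionary gloss has the largest word overlap with the context. The sense inventory and glosses below are invented for illustration; a real system would use WordNet glosses.

```python
# Simplified Lesk: pick the sense whose gloss overlaps most with the context.
# Toy sense inventory (invented glosses) for the ambiguous word "bank".
SENSES = {
    "bank#finance": "a financial institution that accepts deposits and lends money",
    "bank#river":   "the sloping land beside a body of water such as a river",
}

STOPWORDS = {"a", "the", "of", "and", "that", "such", "as", "on", "to", "in"}

def lesk(context: str) -> str:
    ctx = {w for w in context.lower().split() if w not in STOPWORDS}
    def overlap(sense: str) -> int:
        gloss = {w for w in SENSES[sense].split() if w not in STOPWORDS}
        return len(ctx & gloss)
    return max(SENSES, key=overlap)

print(lesk("he sat on the bank of the river watching the water"))
# -> bank#river
```

Extended Lesk enlarges the glosses with those of related senses (hypernyms, hyponyms, etc.) to reduce the sparsity of exact-overlap matching.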

Friday, May 18, 2018

Lecture 21 (17/05/2018): supervised Word Sense Disambiguation

Two important dimensions: supervision and knowledge. Supervised Word Sense Disambiguation: pros and cons. Vector representation of context. Main supervised disambiguation paradigms: decision trees, neural networks, instance-based learning, Support Vector Machines, IMS with embeddings, neural approaches to WSD.
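Instance-based learning over vector representations of context can be sketched as nearest-neighbour classification: represent each labelled occurrence as a bag-of-words vector and assign a new occurrence the sense of its most similar training instance. The training sentences and sense labels below are invented toy data.

```python
from collections import Counter
import math

# Toy sense-annotated examples (invented) for the target word "bass".
TRAIN = [
    ("he caught a huge bass in the lake", "fish"),
    ("the bass swam near the river bottom", "fish"),
    ("turn up the bass on the speakers", "music"),
    ("she plays bass in a jazz band", "music"),
]

def vec(text: str) -> Counter:
    # Bag-of-words vector representation of a context.
    return Counter(text.lower().split())

def cosine(u: Counter, v: Counter) -> float:
    dot = sum(u[w] * v[w] for w in u)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv)

def classify(context: str) -> str:
    # 1-nearest-neighbour: label of the most cosine-similar instance.
    v = vec(context)
    return max(TRAIN, key=lambda ex: cosine(vec(ex[0]), v))[1]

print(classify("fishing for bass in the river"))   # -> fish
```

Replacing the raw bag-of-words vectors with word embeddings (as in IMS with embeddings) gives denser context representations and better generalization.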

Thursday, May 10, 2018

Lecture 19 (10/05/2018): semantic vector representations (2)

Semantic vector representations: importance of their multilinguality; linkage to BabelNet; latent vs. explicit representations; monolingual vs. multilingual representations. The NASARI lexical, unified and embedded representations.

Lecture 18 (04/05/2018): BabelNet; semantic vector representations (1)

Introduction to BabelNet (multilingual synsets, integrated resources, accuracy, applications). Semantic vector representations: SensEmbed.

Friday, May 4, 2018

Lecture 17 (03/05/2018): computational lexicons; WordNet; introduction to WSD; homework 2

Encoding word senses: paper dictionaries, thesauri, machine-readable dictionaries, computational lexicons. WordNet. Introduction to Word Sense Disambiguation (WSD). Homework 2: supervised and knowledge-based Word Sense Disambiguation.