From word to sentence representations. Semantic roles. Resources: PropBank, VerbNet, FrameNet. Semantic Role Labeling (SRL): traditional features. State-of-the-art neural approaches.
Semantic parsing: definition, comparison to Semantic Role Labeling, approaches, a recent approach in detail. The Abstract Meaning Representation formalism. Introduction to machine translation (MT) and history of MT. Overview of statistical MT. The EM algorithm for word alignment in SMT. Beam search for decoding. Introduction to neural machine translation: the encoder-decoder neural architecture; back-translation; byte pair encoding. The BLEU evaluation score. Performance and recent improvements. End of the course!
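Byte pair encoding, mentioned above, is easy to illustrate with a toy sketch (the function name and tiny corpus are my own; real implementations such as subword-nmt also handle tie-breaking and vocabulary files more carefully):

```python
from collections import Counter

def bpe_merges(corpus, num_merges):
    """Learn BPE merge operations from a list of words.

    Each word starts as a sequence of characters plus an end-of-word
    marker; at every step the most frequent adjacent symbol pair is
    merged into a single new symbol."""
    vocab = Counter(tuple(word) + ("</w>",) for word in corpus)
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        new_vocab = Counter()
        for symbols, freq in vocab.items():
            merged, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    merged.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    merged.append(symbols[i])
                    i += 1
            new_vocab[tuple(merged)] = freq
        vocab = new_vocab
    return merges

# "we" is the most frequent pair here (it occurs in "lower" and 3x in "newest")
merges = bpe_merges(["low", "low", "lower", "newest", "newest", "newest"], 3)
```

Applying the learned merges in order to a new word is what segments unseen vocabulary into known subword units.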
Home Page and Blog of the Multilingual NLP course @ Sapienza University of Rome
Friday, May 31, 2019
Lecture 22 (30/05/2019): presentation of research in Rome
Presentation of research carried out by the multilingual NLP group at Sapienza: multilingual Word Sense Disambiguation, semantic role labeling, knowledge acquisition.
Lecture 21 (28/05/2019): prof. Pustejovsky's lecture on "Visualizing Meaning: Semantic Simulation of Actions and Events"
Visualizing Meaning: Semantic Simulation of Actions and Events
Friday, May 24, 2019
Lecture 20 (24/05/2019): neural WSD, unsupervised WSD, knowledge-based WSD
Neural Word Sense Disambiguation. Unsupervised Word Sense Disambiguation: Word Sense Induction. Context-based clustering. Co-occurrence graphs: curvature clustering, HyperLex. Knowledge-based Word Sense Disambiguation. The Lesk and Extended Lesk algorithms. Structural approaches: similarity measures and graph algorithms. Conceptual density. Structural Semantic Interconnections. Evaluation: precision, recall, F1, accuracy. Baselines. Entity Linking.
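The simplified Lesk algorithm from this lecture fits in a few lines: pick the sense whose dictionary gloss shares the most words with the target word's context. A minimal sketch (the toy glosses below are illustrative, not taken from a real dictionary; real systems would also remove stopwords):

```python
def simplified_lesk(word, context, sense_glosses):
    """Return the sense whose gloss has the largest word overlap
    with the context sentence (simplified Lesk)."""
    context_words = set(context.lower().split())
    best_sense, best_overlap = None, -1
    for sense, gloss in sense_glosses.items():
        overlap = len(context_words & set(gloss.lower().split()))
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

glosses = {
    "bank#finance": "a financial institution that accepts deposits and lends money",
    "bank#river": "the sloping land alongside a body of water such as a river",
}
sense = simplified_lesk("bank", "he sat on the bank of the river watching the water", glosses)
```

Extended Lesk enlarges the glosses with those of related senses (hypernyms, meronyms, etc.), which mitigates the sparsity of single-gloss overlap.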
Lecture 17 (16/05/2019): more on semantic vector representations
Semantic vector representations: importance of their multilinguality; linkage to BabelNet; latent vs. explicit representations; monolingual vs. multilingual representations. The NASARI lexical, unified and embedded representations.
Thursday, May 9, 2019
Lecture 15 (03/05/2019): lexical knowledge resources (WordNet, BabelNet)
Encoding word senses: paper dictionaries, thesauri, machine-readable dictionaries, computational lexicons. WordNet. Brief introduction to BabelNet.
Lecture 14 (02/05/2019): introduction to computational semantics
Introduction to computational semantics. Syntax-driven semantic analysis. Semantic attachments. First-Order Logic. Lambda notation and lambda calculus for semantic representation. Lexicon, lemmas and word forms. Word senses: monosemy vs. polysemy. Special kinds of polysemy. Computational sense representations: enumeration vs. generation. Graded word sense assignment.
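The idea of lambda-calculus semantic attachments can be mimicked directly with Python lambdas (names and the string logical form are my own toy choices): a transitive verb denotes a curried two-place function, and composing it with its arguments yields the logical form.

```python
# A transitive verb as a curried function: it combines first with its
# object, then with its subject, producing a first-order-logic-style term.
loves = lambda obj: lambda subj: f"loves({subj},{obj})"

john, mary = "john", "mary"

# "John loves Mary": [[loves]](mary)(john)
logical_form = loves(mary)(john)
```

This mirrors syntax-driven semantic analysis: each syntactic rule application corresponds to one function application on the semantic side.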
Friday, April 12, 2019
Lecture 13 (12/04/2019): Dependency Parsing Hands-on
Dependency parsing hands-on. A basic graph-based approach in Keras. Introduction to the attention mechanism.
Lecture 12 (11/04/2019): Q&A on homework 1
Padding and implementation details. Label design. Performance issues and tips to speed up training.
Friday, April 5, 2019
Thursday, April 4, 2019
Lecture 10 (04/04/2019): syntactic parsing (1/2)
Introduction to syntax. Context-free grammars and languages. Treebanks. Normal forms. Dependency grammars. Syntactic parsing: top-down and bottom-up. Structural ambiguity. Backtracking vs. dynamic programming for parsing. The CKY algorithm.
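The CKY algorithm from this lecture can be sketched as a recognizer for a grammar in Chomsky Normal Form (the toy grammar and function name below are my own; a full parser would also store backpointers to recover trees):

```python
def cky_recognize(words, lexical, binary, start="S"):
    """CKY recognition for a CNF grammar.
    `lexical` maps a word to the nonterminals that derive it;
    `binary` maps a pair (B, C) to the nonterminals A with A -> B C."""
    n = len(words)
    # table[i][j] holds the nonterminals spanning words[i:j]
    table = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        table[i][i + 1] = set(lexical.get(w, ()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):          # split point (dynamic programming)
                for B in table[i][k]:
                    for C in table[k][j]:
                        table[i][j] |= binary.get((B, C), set())
    return start in table[0][n]

# Toy CNF grammar: S -> NP VP, NP -> Det N, plus lexical entries.
lexical = {"the": {"Det"}, "dog": {"N"}, "barks": {"VP"}}
binary = {("Det", "N"): {"NP"}, ("NP", "VP"): {"S"}}

recognized = cky_recognize(["the", "dog", "barks"], lexical, binary)
```

The dynamic-programming table is what avoids the exponential re-derivation that plain backtracking parsers suffer from on ambiguous input.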
Friday, March 29, 2019
Lecture 9 (29/03/2019) (P): LSTM hands-on (with sentiment analysis) + homework 1 assignment
LSTM hands-on with Keras + Tensorflow. Homework 1 assignment: neural word segmentation.
Friday, March 22, 2019
Lecture 6 (21/03/2019): language modeling
We introduced N-gram models (unigrams, bigrams, trigrams), together with their probability modeling and issues. We discussed perplexity and its close relationship with entropy, and we introduced smoothing techniques.
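The perplexity computation for a smoothed bigram model can be sketched as follows (a minimal illustration with add-k smoothing and my own function name; `<s>` and `</s>` are sentence boundary markers):

```python
import math
from collections import Counter

def bigram_perplexity(train_sents, test_sent, k=1.0):
    """Perplexity of an add-k smoothed bigram model on one test sentence."""
    unigrams, bigrams = Counter(), Counter()
    vocab = set()
    for sent in train_sents:
        tokens = ["<s>"] + sent + ["</s>"]
        vocab.update(tokens)
        unigrams.update(tokens[:-1])            # contexts
        bigrams.update(zip(tokens, tokens[1:]))
    V = len(vocab)
    tokens = ["<s>"] + test_sent + ["</s>"]
    log_prob = 0.0
    for prev, cur in zip(tokens, tokens[1:]):
        p = (bigrams[(prev, cur)] + k) / (unigrams[prev] + k * V)
        log_prob += math.log2(p)
    # perplexity = 2 ** (-average log2 probability per predicted token)
    return 2 ** (-log_prob / (len(tokens) - 1))

train = [["the", "dog", "barks"]]
pp_seen = bigram_perplexity(train, ["the", "dog", "barks"])
pp_unseen = bigram_perplexity(train, ["dog", "the", "barks"])
```

A sentence the model has seen gets lower perplexity than a reordering of the same words, which is exactly the "2 to the average negative log probability" relationship with cross-entropy.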
Tuesday, March 19, 2019
Lecture 5 (15/03/2019) (D): lab on FF networks and word2vec
A practical lesson on TensorFlow. A language classifier implemented both in TensorFlow and Keras. Word2vec model implementation and more on the analogical task. Differences between CBOW and skip-gram.
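The CBOW/skip-gram difference is easiest to see in how the training pairs are built: CBOW predicts the center word from its context, while skip-gram predicts each context word from the center word. A minimal sketch (function name is my own; real word2vec works on word indices and adds subsampling and negative sampling):

```python
def training_pairs(tokens, window, mode):
    """Build (input, target) training pairs from a token sequence."""
    pairs = []
    for i, center in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window),
                                  min(len(tokens), i + window + 1))
                   if j != i]
        if mode == "cbow":
            # CBOW: the whole context window predicts the center word
            pairs.append((tuple(context), center))
        else:
            # skip-gram: the center word predicts each context word
            pairs.extend((center, c) for c in context)
    return pairs

sent = ["i", "like", "nlp"]
cbow = training_pairs(sent, 1, "cbow")
sg = training_pairs(sent, 1, "skipgram")
```

Skip-gram thus produces more (and simpler) training examples per sentence, one per center-context pair, which is one reason it tends to do better on rare words.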
Thursday, March 14, 2019
Monday, March 11, 2019
Saturday, March 2, 2019
Lecture 1 (28/02/2019): introduction to Natural Language Processing
We gave an introduction to the course and the field it is focused on, i.e., Natural Language Processing. We talked about the Turing Test as a tool to understand whether "machines can think". We also discussed the pitfalls of the test, including Searle's Chinese Room argument.
Thursday, January 17, 2019
Ready, steady, go!
Welcome to the Sapienza NLP course blog! This year there will be important changes:
IMPORTANT: The 2019 class schedule will be Thursdays 16:30-19:00 and Fridays 14:00-16:30, Aula 2 - Aule L ingegneria.
Please sign up to the NLP class!
- You will write a paper
- The course will be even more deep learning oriented
- For attending students, there will be only three homeworks (and no additional work), one of which is due by the end of September and will replace the project. Non-attending students, instead, will have to complete a full-fledged project.