Thursday, May 9, 2019

Lecture 16 (09/05/2019): more on BabelNet, intro to semantic vector representations

More on BabelNet. Introduction to semantic vector representations: motivation, examples, unsupervised approaches.
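
A minimal, self-contained sketch of the unsupervised idea (the toy corpus and window size are my own choices, not the lecture's): build word vectors from co-occurrence counts and compare them with cosine similarity.

    # Illustrative sketch: unsupervised word vectors from co-occurrence counts.
    import numpy as np

    corpus = [["the", "cat", "drinks", "milk"],
              ["the", "dog", "drinks", "water"]]
    vocab = sorted({w for sent in corpus for w in sent})
    idx = {w: i for i, w in enumerate(vocab)}

    # Count co-occurrences within a +/-2 word window.
    counts = np.zeros((len(vocab), len(vocab)))
    for sent in corpus:
        for i, w in enumerate(sent):
            for j in range(max(0, i - 2), min(len(sent), i + 3)):
                if i != j:
                    counts[idx[w], idx[sent[j]]] += 1

    def cosine(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)

    print(cosine(counts[idx["cat"]], counts[idx["dog"]]))    # distributionally similar
    print(cosine(counts[idx["cat"]], counts[idx["milk"]]))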


Lecture 15 (03/05/2019): lexical knowledge resources (WordNet, BabelNet)

Encoding word senses: paper dictionaries, thesauri, machine-readable dictionaries, computational lexicons. WordNet. Brief introduction to BabelNet.
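
A small illustrative lookup, assuming NLTK's WordNet interface (run nltk.download('wordnet') once); sense numbering follows WordNet 3.0.

    # Illustrative WordNet lookup with NLTK.
    from nltk.corpus import wordnet as wn

    for synset in wn.synsets("bank"):              # one synset per sense of "bank"
        print(synset.name(), "-", synset.definition())

    # Hypernyms of the "financial institution" sense (bank.n.02 in WordNet 3.0).
    print(wn.synset("bank.n.02").hypernyms())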


Lecture 14 (02/05/2019): introduction to computational semantics

Introduction to computational semantics. Syntax-driven semantic analysis. Semantic attachments. First-Order Logic. Lambda notation and lambda calculus for semantic representation. Lexicon: lemmas and word forms. Word senses: monosemy vs. polysemy. Special kinds of polysemy. Computational sense representations: enumeration vs. generation. Graded word sense assignment.
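
A tiny illustrative sketch of lambda-style semantic attachments, written as Python lambdas; the predicate and entity names are made up for the example.

    # Semantic attachments as functions: composition mirrors beta-reduction.
    loves = lambda obj: lambda subj: ("loves", subj, obj)   # \y.\x.loves(x, y)
    john, mary = "John", "Mary"

    vp = loves(mary)     # VP = V(NP_obj)
    print(vp(john))      # S = VP(NP_subj) -> ('loves', 'John', 'Mary')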

Friday, April 12, 2019

Lecture 13 (12/04/2019): Dependency Parsing Hands-on

Dependency parsing hands-on. A basic graph-based approach in Keras. Introduction to the attention mechanism.
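
An illustrative NumPy sketch of (scaled) dot-product attention over a sequence of encoder states; the dimensions and random states are placeholders, not the notebook's.

    # Illustrative scaled dot-product attention.
    import numpy as np

    def attention(query, keys, values):
        """query: (d,), keys/values: (T, d) -> context (d,) and weights (T,)."""
        scores = keys @ query / np.sqrt(query.shape[0])   # scaled dot-product scores
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                          # softmax over time steps
        return weights @ values, weights

    T, d = 5, 8
    states = np.random.randn(T, d)
    context, w = attention(states[-1], states, states)
    print(w.round(2), context.shape)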

Lecture 12 (11/04/2019): Q&A on homework 1

Padding and implementation details. Label design. Performance issues and tips to speed up training.
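
A short sketch of padding with tf.keras (the token ids are arbitrary); masking the padding id keeps padded positions from influencing the loss.

    # Illustrative padding of variable-length token-id sequences.
    import tensorflow as tf

    batch = [[3, 7, 2], [5, 1], [9, 4, 6, 8]]
    padded = tf.keras.preprocessing.sequence.pad_sequences(batch, padding="post", value=0)
    print(padded)
    # [[3 7 2 0]
    #  [5 1 0 0]
    #  [9 4 6 8]]
    # With an Embedding(..., mask_zero=True) layer, positions holding the padding
    # id 0 are masked downstream and do not contribute to the loss.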

Friday, April 5, 2019

Lecture 11 (05/04/2019): syntactic parsing (2/2)

The Earley algorithm. Probabilistic CFGs. Probabilistic parsing. Neural dependency parsing with LSTMs: graph-based vs. transition-based. Arc-factored dependency parsing and arc-hybrid transition-based dependency parsing.
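
A toy probabilistic-CFG example, assuming NLTK's PCFG and ViterbiParser classes (the grammar itself is invented, not the one used in class).

    # Illustrative PCFG and most-probable-parse (Viterbi) parsing with NLTK.
    import nltk

    grammar = nltk.PCFG.fromstring("""
        S  -> NP VP   [1.0]
        NP -> 'I'     [0.5]
        NP -> 'sushi' [0.5]
        VP -> V NP    [1.0]
        V  -> 'eat'   [1.0]
    """)
    parser = nltk.ViterbiParser(grammar)
    for tree in parser.parse("I eat sushi".split()):
        print(tree, tree.prob())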



Thursday, April 4, 2019

Lecture 10 (04/04/2019): syntactic parsing (1/2)

Introduction to syntax. Context-free grammars and languages. Treebanks. Normal forms. Dependency grammars. Syntactic parsing: top-down and bottom-up. Structural ambiguity. Backtracking vs. dynamic programming for parsing. The CKY algorithm.
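
An illustrative CKY recognizer for a toy grammar in Chomsky Normal Form (grammar and sentence are my own example, not the lecture's).

    # CKY recognition: binary rules (A -> B C) and lexical rules (A -> word).
    binary = {("NP", "VP"): {"S"}, ("V", "NP"): {"VP"}}
    lexical = {"I": {"NP"}, "eat": {"V"}, "sushi": {"NP"}}

    def cky_recognize(words):
        n = len(words)
        chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]  # chart[i][j]: words[i:j]
        for i, w in enumerate(words):
            chart[i][i + 1] = set(lexical.get(w, set()))
        for span in range(2, n + 1):                 # widen spans bottom-up
            for i in range(0, n - span + 1):
                j = i + span
                for k in range(i + 1, j):            # split point
                    for B in chart[i][k]:
                        for C in chart[k][j]:
                            chart[i][j] |= binary.get((B, C), set())
        return "S" in chart[0][n]

    print(cky_recognize("I eat sushi".split()))   # True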

Friday, March 29, 2019

Lecture 9 (29/03/2019) (P): LSTM hands-on (with sentiment analysis) + homework 1 assignment

LSTM hands-on with Keras + Tensorflow. Homework 1 assignment: neural word segmentation.
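
A minimal tf.keras sketch of the kind of sentiment model used in the hands-on session; the vocabulary size and layer widths are placeholders, not the notebook's settings.

    # Illustrative binary sentiment classifier over padded token-id sequences.
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(input_dim=10000, output_dim=64, mask_zero=True),
        tf.keras.layers.LSTM(64),
        tf.keras.layers.Dense(1, activation="sigmoid"),   # positive vs. negative
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    # model.fit(x_train, y_train, validation_split=0.1, epochs=3)  # x_train: padded id sequences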

Lecture 8 (28/03/2019): recurrent neural networks and LSTMs

Introduction to Recurrent Neural Networks (RNNs): definitions and configurations. Simple RNNs, CBOW as an RNN, gated architectures, Long Short-Term Memory networks (LSTMs).
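
An illustrative NumPy sketch of a single simple (Elman) RNN step, h_t = tanh(W x_t + U h_{t-1} + b), with made-up dimensions.

    # One recurrent step applied along a toy input sequence.
    import numpy as np

    d_in, d_hid = 4, 3
    rng = np.random.default_rng(0)
    W = rng.standard_normal((d_hid, d_in))
    U = rng.standard_normal((d_hid, d_hid))
    b = np.zeros(d_hid)

    def rnn_step(x_t, h_prev):
        return np.tanh(W @ x_t + U @ h_prev + b)

    h = np.zeros(d_hid)
    for x_t in rng.standard_normal((5, d_in)):   # a sequence of 5 input vectors
        h = rnn_step(x_t, h)
    print(h)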

Lecture 7 (22/03/2019): part-of-speech tagging

Stochastic part-of-speech tagging. Hidden Markov Models. Deleted interpolation. Linear and logistic regression: Maximum Entropy models. Transformation-based POS tagging. Handling out-of-vocabulary words. The Stanford POS tagger. Neural POS tagging with character-based LSTMs.
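
An illustrative Viterbi decoder for a toy HMM POS tagger; the tag set and probabilities are invented for the example.

    # Viterbi decoding for a toy HMM tagger.
    import numpy as np

    tags = ["DET", "NOUN", "VERB"]
    start = np.array([0.6, 0.3, 0.1])              # P(tag at position 0)
    trans = np.array([[0.1, 0.8, 0.1],             # trans[i, j] = P(tag_j | tag_i)
                      [0.1, 0.2, 0.7],
                      [0.4, 0.5, 0.1]])
    emit = [{"the": 0.9}, {"dog": 0.5, "barks": 0.1}, {"barks": 0.6}]   # P(word | tag)

    def viterbi(words):
        n, k = len(words), len(tags)
        V = np.zeros((n, k)); back = np.zeros((n, k), dtype=int)
        V[0] = start * [e.get(words[0], 1e-6) for e in emit]
        for t in range(1, n):
            for j in range(k):
                scores = V[t - 1] * trans[:, j] * emit[j].get(words[t], 1e-6)
                back[t, j] = scores.argmax(); V[t, j] = scores.max()
        path = [int(V[-1].argmax())]
        for t in range(n - 1, 0, -1):               # follow backpointers
            path.append(int(back[t, path[-1]]))
        return [tags[i] for i in reversed(path)]

    print(viterbi(["the", "dog", "barks"]))   # ['DET', 'NOUN', 'VERB']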