Tuesday, March 31, 2020

Lecture 10 (31/03/2020, Google meet, 2 hrs): Part-of-speech tagging (2/3)

Hidden Markov models. Deleted interpolation. Linear and logistic regression: Maximum Entropy models; the logit and logistic functions; their relationship to the sigmoid and softmax. Transformation-based POS tagging. Handling out-of-vocabulary words.
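As a quick illustration of the sigmoid/softmax relationship mentioned above, here is a small pure-Python sketch (the score `z = 1.7` is an arbitrary example value): with two classes scored `z` and `0`, the softmax probability of the first class reduces to the logistic (sigmoid) function of `z`.

```python
import math

def logistic(z):
    # logistic (sigmoid) function: maps a real score to (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def softmax(scores):
    # standard softmax over a list of scores
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# With two classes scored z and 0, softmax reduces to the logistic of z:
z = 1.7
p_softmax = softmax([z, 0.0])[0]
p_logistic = logistic(z)
print(round(p_softmax, 6) == round(p_logistic, 6))  # → True
```

This is why binary logistic regression is the two-class special case of a Maximum Entropy (softmax) classifier.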

Thursday, March 26, 2020

Lecture 9 (26/03/2020, Google meet, 3 hrs): Part-of-speech tagging (1/3)

Data preparation. Introduction to part-of-speech tagging. Universal POS tags. Stochastic part-of-speech tagging. Introduction to Hidden Markov models. More on word2vec: hierarchical softmax and negative sampling.

Tuesday, March 24, 2020

Lecture 8 (24/03/2020, Google meet, 2 hrs): perplexity, smoothing, interpolation

Chain rule and n-gram estimation. Perplexity and its close relationship with entropy. Smoothing and interpolation.
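The perplexity–entropy relationship can be sketched directly (the toy unigram probabilities below are made up): perplexity is 2 raised to the per-word cross-entropy, which equals the inverse probability of the test sequence normalized by its length.

```python
import math

# Toy unigram model probabilities (hypothetical values summing to 1)
probs = {"the": 0.4, "cat": 0.3, "sat": 0.3}
test_sentence = ["the", "cat", "sat"]

# Per-word cross-entropy: H = -(1/N) * sum(log2 p(w))
N = len(test_sentence)
H = -sum(math.log2(probs[w]) for w in test_sentence) / N

# Perplexity is 2 to the cross-entropy...
PP = 2 ** H

# ...which is the same as the length-normalized inverse probability:
PP_direct = (probs["the"] * probs["cat"] * probs["sat"]) ** (-1 / N)
print(abs(PP - PP_direct) < 1e-9)  # → True
```

Lower perplexity means the model assigns higher probability to the test data.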

Thursday, March 19, 2020

Lecture 7 (19/03/2020, Google meet, 3 hrs): word2vec in PyTorch + language modeling

Word2vec in PyTorch. Introduction to n-gram models (unigrams, bigrams, trigrams), together with their probability modeling and associated issues.
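A minimal sketch of bigram probability estimation by maximum likelihood, using a hypothetical two-sentence corpus with sentence-boundary markers:

```python
from collections import Counter

# Toy corpus (hypothetical), with <s> / </s> marking sentence boundaries
corpus = "<s> the cat sat </s> <s> the dog sat </s>".split()

unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def p_bigram(w2, w1):
    # Maximum-likelihood estimate: P(w2 | w1) = count(w1 w2) / count(w1)
    return bigrams[(w1, w2)] / unigrams[w1]

print(p_bigram("cat", "the"))  # → 0.5: "the" is followed by "cat" once out of twice
```

Counts like these are exactly where the sparsity issues arise: any unseen bigram gets probability zero, which motivates the smoothing and interpolation covered in the next lecture.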


Tuesday, March 17, 2020

Lecture 6 (17/03/2020, Google meet, 2 hrs): word2vec and its implementation

One-hot encodings and word embeddings. Introduction to word2vec. Differences between CBOW and skip-gram. The loss function in word2vec. Implementation with PyTorch.
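One way to see the CBOW/skip-gram difference is in how training pairs are built from a window around each word; a small sketch (the `skipgram_pairs` helper and the example sentence are illustrative, not taken from the lecture notebook):

```python
# Generate (center, context) training pairs for skip-gram with a symmetric
# window. In CBOW the roles are reversed: the context words jointly predict
# the center word.
def skipgram_pairs(tokens, window=2):
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the quick brown fox".split()
print(skipgram_pairs(sentence, window=1))
# → [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'),
#    ('brown', 'quick'), ('brown', 'fox'), ('fox', 'brown')]
```

Each pair then feeds the model: skip-gram maximizes the probability of the context word given the center word's embedding.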



Tuesday, March 10, 2020

Lecture 4 (10/03/2020, 2.5 hrs): the Perceptron

Introduction to the Perceptron. Activation functions. Loss functions: mean squared error (MSE) and categorical cross-entropy (CCE). Colab notebook. Language classification with the perceptron.
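A minimal perceptron sketch in plain Python (toy AND data, step activation, classic perceptron update rule; the learning rate and epoch count are arbitrary):

```python
# Learn the AND function with a single perceptron.
def step(z):
    # step activation: fires iff the weighted sum is non-negative
    return 1 if z >= 0 else 0

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b, lr = [0.0, 0.0], 0.0, 0.1

for _ in range(20):  # a few epochs suffice for this linearly separable problem
    for (x1, x2), y in data:
        y_hat = step(w[0] * x1 + w[1] * x2 + b)
        err = y - y_hat
        # perceptron rule: nudge weights toward correcting the error
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

preds = [step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in data]
print(preds)  # → [0, 0, 0, 1]
```

The lecture's version uses PyTorch and differentiable losses (MSE/CCE) instead of this step-activation update, but the underlying linear model is the same.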



Tuesday, March 3, 2020

Lecture 3 (03/03/2020): PyTorch

Introduction to PyTorch. Introduction to deep learning for NLP. The perceptron. Colab notebook with PyTorch basics.