Wednesday, March 31, 2021

Lecture 10 (29/03/2021, 3.5 hours): more on word2vec, GloVe, RNNs, LSTMs and PyTorch Lightning

More on Word2Vec and word embeddings: hierarchical softmax; negative sampling. GloVe. Recurrent Neural Networks. Gated architectures, Long Short-Term Memory networks (LSTMs). Bidirectional LSTMs and stacked LSTMs. Character embeddings. Introduction to PyTorch Lightning.
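The bidirectional and stacked LSTM variants mentioned above can be sketched with PyTorch's `nn.LSTM` (a minimal sketch; all sizes are illustrative, not from the lecture):

```python
import torch
import torch.nn as nn

# A stacked bidirectional LSTM over a batch of embedded sequences.
# All dimensions here are illustrative.
emb_dim, hidden_dim = 50, 64
lstm = nn.LSTM(
    input_size=emb_dim,
    hidden_size=hidden_dim,
    num_layers=2,        # stacked: two LSTM layers
    bidirectional=True,  # one pass left-to-right, one right-to-left
    batch_first=True,
)

batch, seq_len = 3, 7
x = torch.randn(batch, seq_len, emb_dim)  # embedded input sequences
out, (h_n, c_n) = lstm(x)

# Each timestep's output concatenates the forward and backward hidden states,
# so the last dimension is 2 * hidden_dim.
print(out.shape)  # torch.Size([3, 7, 128])
```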

Lecture 9 (25/03/2021, 2 hours): part-of-speech tagging

Part-of-speech tagging. Hidden Markov Models. Deleted interpolation. Linear and logistic regression: Maximum Entropy models; the logit and logistic functions; relationship to sigmoid and softmax. Transformation-based POS tagging. Handling out-of-vocabulary words.
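HMM-based POS tagging is typically decoded with the Viterbi algorithm; here is a toy sketch (the tag set and all transition/emission probabilities below are invented for illustration):

```python
# Toy Viterbi decoding for an HMM POS tagger.
# Tags, transition and emission probabilities are invented for illustration.
tags = ["DET", "NOUN", "VERB"]
start = {"DET": 0.6, "NOUN": 0.3, "VERB": 0.1}
trans = {
    "DET":  {"DET": 0.05, "NOUN": 0.9, "VERB": 0.05},
    "NOUN": {"DET": 0.1,  "NOUN": 0.3, "VERB": 0.6},
    "VERB": {"DET": 0.5,  "NOUN": 0.3, "VERB": 0.2},
}
emit = {
    "DET":  {"the": 0.9, "dog": 0.0, "barks": 0.0},
    "NOUN": {"the": 0.0, "dog": 0.8, "barks": 0.2},
    "VERB": {"the": 0.0, "dog": 0.1, "barks": 0.9},
}

def viterbi(words):
    # delta[t][tag]: best probability of a tag sequence ending in `tag` at step t
    delta = [{t: start[t] * emit[t][words[0]] for t in tags}]
    back = []
    for w in words[1:]:
        row, ptr = {}, {}
        for t in tags:
            # pick the best previous tag for each current tag
            prev = max(tags, key=lambda p: delta[-1][p] * trans[p][t])
            row[t] = delta[-1][prev] * trans[prev][t] * emit[t][w]
            ptr[t] = prev
        delta.append(row)
        back.append(ptr)
    # follow backpointers from the best final tag
    best = max(tags, key=lambda t: delta[-1][t])
    path = [best]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

print(viterbi(["the", "dog", "barks"]))  # ['DET', 'NOUN', 'VERB']
```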

Wednesday, March 24, 2021

Tuesday, March 16, 2021

Lecture 7 (15/03/2021, 3 hours): probabilistic language modeling

Word2vec implementation in PyTorch. Introduction to N-gram models (unigrams, bigrams, trigrams), the probabilities they model, and their issues.

Chain rule and n-gram estimation. Perplexity and its close relationship with entropy. Smoothing and interpolation.
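The perplexity–entropy connection can be made concrete with a short computation (the bigram probabilities below are invented for illustration):

```python
import math

# Perplexity of a sentence under a bigram model: PP = exp(-(1/N) * sum(log p)),
# i.e. the exponential of the average negative log-probability (the cross-entropy).
# The bigram probabilities below are invented for illustration.
bigram_probs = [0.2, 0.5, 0.1, 0.4]  # p(w_i | w_{i-1}) at each position

def perplexity(probs):
    # Lower perplexity means the model is less "surprised" by the sentence.
    n = len(probs)
    return math.exp(-sum(math.log(p) for p in probs) / n)

print(round(perplexity(bigram_probs), 3))  # → 3.976
```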

Thursday, March 11, 2021

Lecture 6 (11/03/2021, 2 hours): word embeddings and Word2vec

One-hot encodings and word embeddings. Introduction to word2vec. Differences between CBOW and skip-gram. The loss function in word2vec. Implementation with PyTorch.
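A minimal sketch of the skip-gram flavour of word2vec in PyTorch, assuming a plain full-softmax output layer (the class name and all sizes are illustrative, not from the lecture's notebook):

```python
import torch
import torch.nn as nn

# Minimal skip-gram model: predict a context word from a center word.
# Vocabulary size, dimensions and word ids are illustrative.
class SkipGram(nn.Module):
    def __init__(self, vocab_size, emb_dim):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, emb_dim)  # input word vectors
        self.output = nn.Linear(emb_dim, vocab_size)         # scores over the vocab

    def forward(self, center_ids):
        return self.output(self.embeddings(center_ids))      # logits per word

vocab_size, emb_dim = 100, 16
model = SkipGram(vocab_size, emb_dim)
loss_fn = nn.CrossEntropyLoss()  # softmax over the whole vocabulary

center = torch.tensor([5, 12])   # center word ids (toy batch)
context = torch.tensor([7, 3])   # gold context word ids
loss = loss_fn(model(center), context)
loss.backward()                  # gradients flow into the embedding table
```

CBOW inverts the direction: the averaged context embeddings predict the center word.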



Tuesday, March 9, 2021

Lecture 5 (08/03/2021, 3 hours): classifying Amazon reviews with a feedforward network; loss functions

Regression vs. classification. Hands-on: classifying Amazon reviews with a feedforward network. Vector representations of text. Loss functions: Mean Squared Error (MSE), Binary Cross Entropy (BCE), Categorical Cross Entropy (CCE). Sigmoid and softmax.
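The three losses can be compared on toy tensors with PyTorch's built-in modules (all values below are illustrative):

```python
import torch
import torch.nn as nn

# Comparing the three losses from the lecture on toy tensors.
pred = torch.tensor([0.8, 0.2])    # predictions already squashed into [0, 1]
target = torch.tensor([1.0, 0.0])

mse = nn.MSELoss()(pred, target)   # regression: mean squared error
bce = nn.BCELoss()(pred, target)   # binary classification (expects sigmoid output)

logits = torch.tensor([[2.0, 0.5, -1.0]])   # raw scores for 3 classes
label = torch.tensor([0])                   # gold class index
cce = nn.CrossEntropyLoss()(logits, label)  # applies log-softmax internally

print(mse.item(), bce.item(), cce.item())
```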

Saturday, March 6, 2021

Lecture 4 (04/03/2021, 2 hours): deep learning basics hands-on in PyTorch

Introduction to the Perceptron. Activation functions. Loss functions: MSE and CCE. Colab notebook. Language classification with the perceptron.
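The classic perceptron learning rule can be sketched in a few lines of plain Python (the toy dataset and learning rate are invented for illustration):

```python
# Classic perceptron learning rule on a toy, linearly separable dataset.
# Data points, labels and learning rate are invented for illustration.
data = [([1.0, 1.0], 1), ([2.0, 1.5], 1), ([-1.0, -0.5], -1), ([-1.5, -1.0], -1)]
w, b, lr = [0.0, 0.0], 0.0, 0.1

def predict(x):
    # step activation: sign of the weighted sum plus bias
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if s >= 0 else -1

for _ in range(10):                       # a few epochs suffice here
    for x, y in data:
        if predict(x) != y:               # update weights only on mistakes
            w = [wi + lr * y * xi for wi, xi in zip(w, x)]
            b += lr * y

print([predict(x) for x, _ in data])  # [1, 1, -1, -1]: all points correct
```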



Lecture 3 (01/03/2021, 3 hours): Machine Learning basics and PyTorch

Introduction to Machine Learning for Natural Language Processing: supervised vs. unsupervised vs. reinforcement learning. Features, feature vector representations.
Introduction to PyTorch. Introduction to deep learning for NLP. The perceptron. Colab notebook with PyTorch basics.
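A few operations of the kind such a basics notebook typically covers, sketched here with illustrative values:

```python
import torch

# PyTorch basics: tensors, elementwise ops, and autograd (toy values).
x = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
y = x @ x.T                    # matrix multiplication
z = torch.relu(x - 2.5)        # elementwise activation

x.requires_grad_(True)         # track operations on x for autograd
loss = (x ** 2).sum()
loss.backward()                # d(loss)/dx = 2x
print(x.grad)
```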


Lecture 2 (25/02/2021, 2 hours): more on NLP

More on Natural Language Processing and its applications.

Lecture 1 (22/02/2021, 3 hours): introduction to NLP

We gave an introduction to the course and to its field of focus, Natural Language Processing, and its challenges.