Friday, March 29, 2019

Lecture 9 (29/03/2019) (P): LSTM hands-on (with sentiment analysis) + homework 1 assignment

LSTM hands-on with Keras + TensorFlow. Homework 1 assignment: neural word segmentation.
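
A minimal sketch of the kind of model built in the session, assuming an IMDB-style binary sentiment task; the dataset choice and hyperparameters here are illustrative, not the exact lab code:

    # Keras LSTM for binary sentiment classification (illustrative hyperparameters).
    from keras.datasets import imdb
    from keras.preprocessing.sequence import pad_sequences
    from keras.models import Sequential
    from keras.layers import Embedding, LSTM, Dense

    vocab_size, max_len = 10000, 200
    (x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=vocab_size)
    x_train = pad_sequences(x_train, maxlen=max_len)
    x_test = pad_sequences(x_test, maxlen=max_len)

    model = Sequential([
        Embedding(vocab_size, 64, input_length=max_len),  # word ids -> dense vectors
        LSTM(64),                                         # final state summarizes the review
        Dense(1, activation='sigmoid')                    # probability of positive sentiment
    ])
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    model.fit(x_train, y_train, batch_size=64, epochs=2,
              validation_data=(x_test, y_test))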

Lecture 8 (28/03/2019): recurrent neural networks and LSTMs

Introduction to Recurrent Neural Networks (RNNs): definitions and configurations. Simple RNN, CBOW as RNN, gated architectures, Long Short-Term Memory networks (LSTMs).
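
A NumPy sketch of the simple (Elman) RNN recurrence, h_t = tanh(W_x x_t + W_h h_{t-1} + b); the dimensions and random weights are illustrative:

    import numpy as np

    d_in, d_hid = 4, 3
    rng = np.random.RandomState(0)
    W_x = rng.randn(d_hid, d_in) * 0.1   # input-to-hidden weights
    W_h = rng.randn(d_hid, d_hid) * 0.1  # hidden-to-hidden (recurrent) weights
    b = np.zeros(d_hid)

    def rnn_forward(inputs):
        h = np.zeros(d_hid)              # initial hidden state
        for x_t in inputs:               # one update per token
            h = np.tanh(W_x @ x_t + W_h @ h + b)
        return h                         # final state encodes the whole sequence

    sequence = [rng.randn(d_in) for _ in range(5)]
    print(rnn_forward(sequence))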

Lecture 7 (22/03/2019): part-of-speech tagging

Stochastic part-of-speech tagging. Hidden Markov Models. Deleted interpolation. Linear and logistic regression: Maximum Entropy models. Transformation-based POS tagging. Handling out-of-vocabulary words. The Stanford POS tagger. Neural POS tagging with character-based LSTMs.
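
To make the HMM part concrete, a minimal Viterbi decoder for a bigram HMM tagger, which finds the tag sequence maximizing the product of transition probabilities P(t_i | t_{i-1}) and emission probabilities P(w_i | t_i). The toy probability tables are made up; a real tagger would add smoothing (e.g., deleted interpolation) and out-of-vocabulary handling:

    import math

    tags = ['DET', 'NOUN', 'VERB']
    # Toy transition and emission tables; unseen events get a small floor (1e-6).
    trans = {('<s>', 'DET'): 0.7, ('<s>', 'NOUN'): 0.2,
             ('DET', 'NOUN'): 0.8, ('NOUN', 'VERB'): 0.6, ('NOUN', 'NOUN'): 0.2}
    emit = {('the', 'DET'): 0.9, ('dog', 'NOUN'): 0.4, ('barks', 'VERB'): 0.3}

    def viterbi(words):
        # best[t] = (log-prob of best path ending in tag t, that path)
        best = {t: (math.log(trans.get(('<s>', t), 1e-6))
                    + math.log(emit.get((words[0], t), 1e-6)), [t]) for t in tags}
        for w in words[1:]:
            new = {}
            for t in tags:
                score, path = max(
                    (best[p][0] + math.log(trans.get((p, t), 1e-6))
                     + math.log(emit.get((w, t), 1e-6)), best[p][1])
                    for p in tags)
                new[t] = (score, path + [t])
            best = new
        return max(best.values())[1]

    print(viterbi(['the', 'dog', 'barks']))  # -> ['DET', 'NOUN', 'VERB']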


Friday, March 22, 2019

Lecture 6 (21/03/2019): language modeling

We introduced N-gram models (unigrams, bigrams, trigrams), together with their probability modeling and issues. We discussed perplexity and its close relationship with entropy, and we introduced smoothing techniques.
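
A toy sketch tying these together: a bigram model with add-one (Laplace) smoothing, P(w_i | w_{i-1}) = (count(w_{i-1} w_i) + 1) / (count(w_{i-1}) + V), and perplexity computed as the exponentiated average negative log-probability. The corpus is illustrative:

    import math
    from collections import Counter

    corpus = ['<s>', 'the', 'cat', 'sat', '</s>', '<s>', 'the', 'dog', 'sat', '</s>']
    unigrams = Counter(corpus)
    bigrams = Counter(zip(corpus, corpus[1:]))
    V = len(unigrams)                    # vocabulary size, for add-one smoothing

    def prob(prev, word):
        return (bigrams[(prev, word)] + 1) / (unigrams[prev] + V)

    def perplexity(words):
        log_p = sum(math.log(prob(p, w)) for p, w in zip(words, words[1:]))
        return math.exp(-log_p / (len(words) - 1))

    print(perplexity(['<s>', 'the', 'cat', 'sat', '</s>']))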


Tuesday, March 19, 2019

Lecture 5 (15/03/2019) (D): lab on FF networks and word2vec

A practical lesson on TensorFlow. Language classifier implemented both in TensorFlow and Keras. Word2vec model implementation and more on the analogy task. Differences between CBOW and skip-gram.
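
As a reminder of how the analogy task works: it reduces to nearest-neighbour search under cosine similarity in the embedding space. A sketch with made-up 3-d vectors (real embeddings would come from the trained model):

    import numpy as np

    emb = {'king':  np.array([0.8, 0.9, 0.1]),
           'man':   np.array([0.7, 0.1, 0.1]),
           'woman': np.array([0.7, 0.1, 0.9]),
           'queen': np.array([0.8, 0.9, 0.9])}

    def analogy(a, b, c):
        # Solve a : b = c : ?  via the offset  vec(b) - vec(a) + vec(c)
        target = emb[b] - emb[a] + emb[c]
        cos = lambda u, v: u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
        return max((w for w in emb if w not in (a, b, c)),
                   key=lambda w: cos(emb[w], target))

    print(analogy('man', 'king', 'woman'))  # -> 'queen' with these toy vectors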

Thursday, March 14, 2019

Lecture 4 (14/03/2019): deep learning and word embeddings

Introduction to neural networks. The perceptron. Neural units. Activation functions. MaxEnt and softmax. Word embeddings: rationale and word2vec. CBOW and skip-gram.
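
A quick NumPy sketch of two common activation functions and the softmax; subtracting the maximum score before exponentiating is the usual trick for numerical stability:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def relu(x):
        return np.maximum(0.0, x)

    def softmax(z):
        z = z - np.max(z)            # for numerical stability
        e = np.exp(z)
        return e / e.sum()           # a probability distribution over classes

    print(softmax(np.array([2.0, 1.0, 0.1])))  # -> [0.659, 0.242, 0.099] approx.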


Monday, March 11, 2019

Lecture 3 (07/03/2019) (D): more on TensorFlow and TensorBoard

More on TensorFlow: variables, placeholders, sessions, training. Linear and polynomial regression. TensorBoard.
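
A minimal sketch in the TensorFlow 1.x style used in class (placeholders, a session, and a TensorBoard summary); the data, learning rate, and log directory are illustrative:

    import numpy as np
    import tensorflow as tf

    x_data = np.linspace(0, 1, 100).astype(np.float32)
    y_data = 2.0 * x_data + 1.0 + 0.1 * np.random.randn(100).astype(np.float32)

    x = tf.placeholder(tf.float32, shape=[None])     # fed at run time
    y = tf.placeholder(tf.float32, shape=[None])
    w = tf.Variable(0.0)                             # trainable parameters
    b = tf.Variable(0.0)

    loss = tf.reduce_mean(tf.square(w * x + b - y))  # mean squared error
    train_op = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

    tf.summary.scalar('loss', loss)                  # logged for TensorBoard
    summaries = tf.summary.merge_all()

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        writer = tf.summary.FileWriter('./logs', sess.graph)
        for step in range(200):
            _, s = sess.run([train_op, summaries],
                            feed_dict={x: x_data, y: y_data})
            writer.add_summary(s, step)
        print(sess.run([w, b]))                      # should approach [2.0, 1.0]

Running tensorboard --logdir ./logs then shows the loss curve and the graph in the browser.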

Saturday, March 2, 2019

Lecture 2 (01/03/2019): more on NLP + introduction to machine learning and deep learning (1)

Introduction to Machine Learning for Natural Language Processing: supervised vs. unsupervised vs. reinforcement learning. Features, feature vector representations. TensorFlow.
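
For instance, a sentence can be represented as a bag-of-words feature vector, with one count per vocabulary word (toy vocabulary and sentence):

    vocab = ['cat', 'dog', 'sat', 'the']

    def featurize(sentence):
        words = sentence.lower().split()
        return [words.count(w) for w in vocab]   # one feature per vocabulary word

    print(featurize('the cat sat'))              # -> [1, 0, 1, 1]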

Lecture 1 (28/02/2019): introduction to Natural Language Processing

We gave an introduction to the course and to its field of focus, Natural Language Processing. We talked about the Turing Test as a tool to understand whether "machines can think". We also discussed the pitfalls of the test, including Searle's Chinese Room argument.