Friday, March 23, 2018

Lecture 8 (23/03/2018): more on word2vec; smoothing for language modeling; introduction to part-of-speech tagging

Use of word embeddings produced with word2vec. Smoothing techniques for probabilistic language modeling. Introduction to part-of-speech tagging: word classes; universal tag set.
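To make the smoothing idea concrete, here is a minimal sketch of add-one (Laplace) smoothing for bigram probabilities; the function name and toy corpus are illustrative, not the lecture's code:

```python
from collections import Counter

def laplace_bigram_prob(bigram, bigram_counts, unigram_counts, vocab_size):
    """Add-one (Laplace) smoothed bigram probability P(w2 | w1):
    (count(w1, w2) + 1) / (count(w1) + V)."""
    w1, w2 = bigram
    return (bigram_counts[(w1, w2)] + 1) / (unigram_counts[w1] + vocab_size)

tokens = "the cat sat on the mat".split()
unigrams = Counter(tokens)
bigrams = Counter(zip(tokens, tokens[1:]))
V = len(unigrams)  # vocabulary size of the toy corpus

# A seen bigram keeps most of its probability; an unseen one still gets mass.
p_seen = laplace_bigram_prob(("the", "cat"), bigrams, unigrams, V)
p_unseen = laplace_bigram_prob(("the", "sat"), bigrams, unigrams, V)
```

Without smoothing, `p_unseen` would be zero and any sentence containing that bigram would get probability zero; add-one reserves a little mass for every unseen event.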


Lecture 7 (22/03/2018): word2vec and its implementation

Word2vec: CBOW and skipgram; explanation, derivation of the loss function and implementation in TensorFlow.
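As a reference for the derivation, the naive-softmax skip-gram loss for one (center, context) pair can be sketched in plain numpy (a toy illustration, not the TensorFlow implementation shown in class; matrix names `W_in`/`W_out` are our own):

```python
import numpy as np

def skipgram_loss(center_id, context_id, W_in, W_out):
    """Naive-softmax skip-gram loss for one training pair:
    L = -log softmax(W_out @ v_center)[context_id]."""
    v = W_in[center_id]               # embedding of the center word
    scores = W_out @ v                # one score per vocabulary word
    scores = scores - scores.max()    # shift for numerical stability
    log_probs = scores - np.log(np.exp(scores).sum())
    return -log_probs[context_id]

rng = np.random.default_rng(0)
V, d = 10, 4                          # toy vocabulary size and embedding dim
W_in = rng.normal(size=(V, d)) * 0.1  # "input" (center) embeddings
W_out = rng.normal(size=(V, d)) * 0.1 # "output" (context) embeddings
loss = skipgram_loss(3, 7, W_in, W_out)
```

With near-zero weights the distribution is close to uniform, so the loss starts near log V; training lowers it by pushing the context word's score up relative to the rest of the vocabulary.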


Friday, March 16, 2018

Lecture 6 (16/03/2018): deep learning and word embeddings (1)

Introduction to neural networks. The perceptron. Neural units. Activation functions. MaxEnt and softmax. Word embeddings: rationale and word2vec. CBOW and skipgram. Homework 1 assignment!
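The building blocks above can be sketched in a few lines: a single neural unit (weighted sum plus bias, passed through an activation) and the softmax used for MaxEnt-style classification. The numbers are arbitrary toy values:

```python
import numpy as np

def sigmoid(z):
    """Logistic activation, squashing a score into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    """Turn a score vector into a probability distribution."""
    z = z - np.max(z)          # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

# One neural unit: activation(w . x + b).
x = np.array([1.0, 2.0])
w = np.array([0.5, -0.3])
b = 0.1
a = sigmoid(w @ x + b)

# Softmax over three class scores: higher score -> higher probability.
probs = softmax(np.array([2.0, 1.0, 0.1]))
```

The perceptron uses the same weighted sum but with a hard threshold instead of a smooth activation; swapping in differentiable activations is what makes gradient-based training possible.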


Lecture 5 (15/03/2018): language modeling

We introduced N-gram models (unigrams, bigrams, trigrams), together with their probability modeling and issues. We discussed perplexity and its close relationship with entropy, and we introduced smoothing.
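The perplexity computation can be sketched for the simplest case, an unsmoothed unigram model (a toy example with our own helper name; real evaluation would use smoothed probabilities so unseen words don't give zero probability):

```python
import math
from collections import Counter

def unigram_perplexity(test_tokens, train_tokens):
    """Perplexity of an unsmoothed unigram model:
    exp(-(1/N) * sum of log P(w)) over the test tokens."""
    counts = Counter(train_tokens)
    n_train = len(train_tokens)
    log_prob = sum(math.log(counts[w] / n_train) for w in test_tokens)
    return math.exp(-log_prob / len(test_tokens))

# Training text uniform over {a, b}: every token has probability 1/2,
# so the perplexity on any test text over the same vocabulary is 2.
train = "a b a b a b a b".split()
ppl = unigram_perplexity("a b b a".split(), train)
```

The result matches the intuition that perplexity is the model's effective branching factor: a uniform model over k word types has perplexity k, and perplexity is just the exponential of the cross-entropy per word.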


Friday, March 9, 2018

Lecture 4 (09/03/2018): TensorFlow for linear and polynomial regression; TensorBoard

More on TensorFlow: variables, placeholders, sessions, training. Linear and polynomial regression. TensorBoard.
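The training loop behind the TensorFlow example can be illustrated without TensorFlow: a numpy sketch of linear regression by gradient descent on mean squared error (the synthetic data and hyperparameters are our own, not those used in class):

```python
import numpy as np

# Synthetic data drawn around the line y = 2x + 1.
rng = np.random.default_rng(42)
x = rng.uniform(-1, 1, size=100)
y = 2.0 * x + 1.0 + rng.normal(scale=0.01, size=100)

w, b = 0.0, 0.0     # parameters to learn (TF would make these Variables)
lr = 0.1            # learning rate
for _ in range(500):
    y_hat = w * x + b
    err = y_hat - y
    # Gradients of the mean squared error with respect to w and b.
    w -= lr * 2 * np.mean(err * x)
    b -= lr * 2 * np.mean(err)
```

In TensorFlow the same loop is expressed by declaring the parameters as variables, feeding the data in via placeholders, and letting an optimizer compute these gradients automatically inside a session; polynomial regression only changes the model from `w*x + b` to a higher-degree polynomial in `x`.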


Lecture 3 (08/03/2018): introduction to Machine Learning for NLP / TensorFlow

Introduction to Machine Learning for Natural Language Processing: supervised vs. unsupervised vs. reinforcement learning. Features, feature vector representations. TensorFlow.
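A concrete instance of a feature vector representation is the bag-of-words encoding of a sentence; this is a hypothetical minimal featurizer (function and variable names are ours), not code from the lecture:

```python
def bag_of_words(sentence, vocabulary):
    """Map a sentence to a count vector: position i in the vector
    holds how many times vocabulary[i] occurs in the sentence.
    Out-of-vocabulary tokens are simply ignored here."""
    index = {w: i for i, w in enumerate(vocabulary)}
    vec = [0] * len(vocabulary)
    for token in sentence.split():
        if token in index:
            vec[index[token]] += 1
    return vec

vocab = ["the", "cat", "dog", "sat"]
v = bag_of_words("the cat sat on the mat", vocab)
```

Fixed-length vectors like this are what supervised learners consume: each dimension is one feature, and a labeled dataset becomes a matrix of such vectors paired with target labels.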

Friday, March 2, 2018

Lecture 2 (02/03/2018): Introduction to NLP (2)

We continued our introduction to NLP, with a focus on the Turing Test as a tool to understand whether "machines can think". We also discussed the pitfalls of the test, including Searle's Chinese Room argument.

Lecture 1 (01/03/2018): Introduction to NLP (1)

We gave an introduction to the course and to the field it focuses on, Natural Language Processing.