Monday, June 5, 2023

Lecture 22 (29/05/2023, 2.5 hours): text summarization, open issues in NLP, topics for thesis and more, closing

Introduction to text summarization and evaluation metrics (BLEU, ROUGE, BERTScore, alternatives). Open issues in NLP: superhuman performance in current benchmarks, stochastic parrots, evaluation of text quality. Thesis topics and more. Closing.
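As a toy illustration of the kind of n-gram overlap that ROUGE measures, here is a minimal ROUGE-1 recall computation in plain Python. The whitespace tokenization and the exact variant are simplifying assumptions, not the official implementation:

```python
from collections import Counter

def rouge1_recall(candidate: str, reference: str) -> float:
    """ROUGE-1 recall: fraction of reference unigrams also found in the
    candidate (clipped counts, whitespace tokenization -- a simplification)."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum(min(cand[w], ref[w]) for w in ref)
    return overlap / max(sum(ref.values()), 1)

# 5 of the 6 reference unigram occurrences appear in the candidate
print(rouge1_recall("the cat sat on the mat", "the cat was on the mat"))
```

Real evaluations use the official scorers (and BLEU adds n-gram precision with a brevity penalty), but the core idea is this overlap counting.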

 

Friday, May 26, 2023

Lecture 21 (26/05/2023, 4.5 hours): seq2seq, Machine Translation

Foundations of sequence-to-sequence models and their use within Hugging Face.

Introduction to machine translation (MT) and the history of MT. Overview of statistical MT. Beam search for decoding. Introduction to neural machine translation (NMT): the encoder-decoder architecture and attention in NMT. The BLEU evaluation score. Performance and recent improvements.
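The beam-search decoding mentioned above can be sketched in a few lines. This is a toy version over a hypothetical next-token distribution (the `table` "model" below is made up for illustration); real NMT decoders score with the network and add length normalization:

```python
import math

def beam_search(step_fn, beam_size=2, max_len=4, eos="</s>"):
    """Toy beam search: step_fn(prefix) -> {token: prob}. At each step,
    expand every live hypothesis and keep the beam_size prefixes with
    the highest cumulative log-probability."""
    beams = [([], 0.0)]  # (tokens, cumulative log-prob)
    for _ in range(max_len):
        candidates = []
        for tokens, score in beams:
            if tokens and tokens[-1] == eos:     # finished hypotheses survive as-is
                candidates.append((tokens, score))
                continue
            for tok, p in step_fn(tokens).items():
                candidates.append((tokens + [tok], score + math.log(p)))
        beams = sorted(candidates, key=lambda b: b[1], reverse=True)[:beam_size]
    return beams[0][0]

# Hypothetical toy "model" that prefers the sequence "a b </s>"
table = {(): {"a": 0.7, "b": 0.3},
         ("a",): {"b": 0.7, "a": 0.3},
         ("a", "b"): {"</s>": 0.9, "a": 0.1}}
step = lambda toks: table.get(tuple(toks), {"</s>": 1.0})
print(beam_search(step))  # ['a', 'b', '</s>']
```

With beam size 1 this degenerates to greedy decoding; larger beams trade computation for a better search of the output space.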

Monday, May 22, 2023

Lecture 20 (22/05/2023, 2.5 hours): More on semantic role labeling; Semantic Parsing

More on Semantic Role Labeling. Semantic Parsing: task, motivation and applications; Abstract Meaning Representation (AMR) and BabelNet Meaning Representation (BMR); Natural Language Generation from semantic parses.


Lecture 19 (19/05/2023, 4 hours): Semantic Role Labeling

Semantic roles. Frame resources: PropBank, FrameNet, VerbAtlas. Semantic Role Labeling (SRL). Multilingual SRL. Cross-inventory approaches to SRL. Topics for thesis or excellence path.

Monday, May 15, 2023

Lecture 18 (15/05/2023, 2.5 hours): Overview of NLP libraries and tools; HW 3 assignment

Overview of NLP libraries: Hugging Face Transformers, Datasets and Evaluate; fairseq, Lightning Transformers, Sentence Transformers, Classy; PyTorch Lightning. Assignment of homework 3: Relation Extraction.

Friday, May 12, 2023

Lecture 17 (12/05/2023, 4.5 hours): More on sense embeddings; Word Sense Disambiguation

Word Sense Disambiguation (WSD): introduction to the task. Purely data-driven and neuro-symbolic approaches. WSD cast as sense comprehension. Issues. Semantic Role Labeling: introduction to the task. Inventories. Neural approaches. Issues.
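To make the WSD task concrete, here is the classic simplified-Lesk baseline: pick the sense whose gloss overlaps most with the context. The tiny two-sense inventory for "bank" below is made up for illustration (real systems use WordNet/BabelNet inventories and neural scoring):

```python
def simplified_lesk(word, context, inventory):
    """Pick the sense of `word` whose gloss shares the most word types
    with the context sentence (simplified Lesk baseline)."""
    ctx = set(context.lower().split())
    def overlap(gloss):
        return len(ctx & set(gloss.lower().split()))
    return max(inventory[word], key=lambda sense: overlap(inventory[word][sense]))

# Hypothetical two-sense inventory for "bank"
inventory = {"bank": {
    "bank.n.01": "sloping land beside a body of water such as a river",
    "bank.n.02": "a financial institution that accepts deposits and lends money",
}}
print(simplified_lesk("bank", "I deposited money at the bank", inventory))
```

Gloss overlap is a weak signal on its own, which is exactly why the lecture contrasts it with data-driven and neuro-symbolic approaches.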

Lecture 16 (05/05/2023, 4.5 hours): Homework 2 assignment on Word Sense Disambiguation; sense embeddings

Assignment of homework 2: Word Sense Disambiguation. Introduction to Word Sense Disambiguation. First introduction to explicit and latent sense embeddings. SensEmbed.

 

Wednesday, April 26, 2023

Lecture 14 (21/04/2023, 4.5 hours): more on the Transformer, pre-trained language models; introduction to lexical semantics

More on the Transformer architecture. Pre-trained language models: BERT, GPT, RoBERTa, XLM. Introduction to lexical semantics: meaning representations, WordNet, BabelNet. Neurosymbolic NLP.

Tuesday, April 18, 2023

Lecture 13 (17/04/2023, 2 hours, E): notebook on LSTMs

Q&A on Homework 1, brief introduction to Part-of-Speech tagging, recap of LSTMs, notebook on Part-of-Speech tagging with LSTMs, best practices for data preprocessing and the training procedure.

Lecture 12 (14/04/2023, 4.5 hours): neural language modeling, the attention mechanism, the Transformer

Neural language modeling. Context2vec. Neural language models with BiLSTMs. Contextualized word representations. Introduction to the attention mechanism. Introduction to the Transformer architecture.
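The attention mechanism at the core of the Transformer can be written down in a few lines. This is scaled dot-product attention for a single query vector, in plain Python for readability (real models batch this over matrices with multiple heads):

```python
import math

def attention(query, keys, values):
    """Scaled dot-product attention for one query: softmax over
    q . k_i / sqrt(d) gives weights for averaging the value vectors."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    m = max(scores)                           # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# The query matches the first key more closely, so the output
# is pulled toward the first value vector
out = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[10.0, 0.0], [0.0, 10.0]])
print(out)
```

The output is a convex combination of the values: the weights sum to 1, so here the two components sum to 10.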

Friday, April 14, 2023

Lecture 11 (03/04/2023, 2 hours): computational lexical semantics

Introduction to lexical semantics. Lexicon: lemmas and word forms. Introduction to the notion of concepts, the triangle of meaning, concepts vs. named entities. Word senses: monosemy vs. polysemy. Key tasks for Natural Language Understanding: Word Sense Disambiguation (within lexical semantics), Semantic Role Labeling and Semantic Parsing (sentence-level).

Monday, March 27, 2023

Lecture 9 (27/03/2023, 1 hour): probabilistic language modeling

What is a language model? N-gram models (unigrams, bigrams, trigrams), together with their probability modeling and issues. Chain rule and n-gram estimation.
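The n-gram estimation described above amounts to counting and dividing. A minimal maximum-likelihood bigram model, with sentence-boundary markers (no smoothing, which the lecture's "issues" point addresses):

```python
from collections import Counter

def bigram_mle(corpus):
    """Maximum-likelihood bigram estimates P(w2 | w1) = c(w1, w2) / c(w1),
    with <s>/</s> boundary markers and whitespace tokenization."""
    unigrams, bigrams = Counter(), Counter()
    for sentence in corpus:
        tokens = ["<s>"] + sentence.split() + ["</s>"]
        unigrams.update(tokens[:-1])          # every token that starts a bigram
        bigrams.update(zip(tokens, tokens[1:]))
    return {pair: c / unigrams[pair[0]] for pair, c in bigrams.items()}

probs = bigram_mle(["I am Sam", "Sam I am"])
print(probs[("<s>", "I")])   # c(<s>, I) / c(<s>) = 1/2
print(probs[("I", "am")])    # c(I, am) / c(I) = 2/2
```

Unseen bigrams get probability zero under MLE, which is precisely why smoothing and backoff become necessary.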


Friday, March 24, 2023

Lecture 8 (24/03/2023, 4 hours): definition of LSTM; handbook of a real-world classification problem; homework 1

More on LSTMs. Notebook on training, dev, test. Notebook on a real-world NLP problem. Assignment of Homework 1!

Lecture 7 (20/03/2023, 2 hours): more on word embeddings and RNNs

More on word embeddings. Lookup tables. Co-occurrence matrices. GloVe. Stopwords. Static vs. contextualized embeddings. Different inputs and outputs for RNNs.
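The co-occurrence matrices underlying count-based embeddings like GloVe come from a simple windowed count. A minimal sketch, assuming whitespace tokenization and a symmetric fixed-size window:

```python
from collections import defaultdict

def cooccurrence(corpus, window=2):
    """Symmetric word-word co-occurrence counts: for each token, count
    every other token within `window` positions to either side."""
    counts = defaultdict(int)
    for sentence in corpus:
        tokens = sentence.lower().split()
        for i, w in enumerate(tokens):
            for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
                if i != j:
                    counts[(w, tokens[j])] += 1
    return dict(counts)

m = cooccurrence(["the cat sat", "the dog sat"])
print(m[("the", "sat")])  # "sat" appears within 2 tokens of "the" twice
```

GloVe then fits word vectors so that their dot products reproduce (log) statistics of exactly this kind of matrix.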

Monday, March 20, 2023

Lecture 6 (17/03/2023, 3 hours, E): word2vec, recurrent neural networks (RNNs), Long Short-Term Memory networks (LSTMs)

word2vec (CBOW and skipgram), PyTorch notebook on word2vec, recurrent neural networks, optimization for RNNs, Long Short-Term Memory (LSTM) networks.
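For the skip-gram objective, training data is just (center, context) pairs extracted with a sliding window. A minimal sketch (fixed window, no subsampling or negative sampling, which real word2vec adds):

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs for skip-gram:
    each token predicts every neighbor within `window` positions."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if i != j:
                pairs.append((center, tokens[j]))
    return pairs

print(skipgram_pairs(["we", "like", "nlp"], window=1))
# [('we', 'like'), ('like', 'we'), ('like', 'nlp'), ('nlp', 'like')]
```

CBOW flips the direction: the context words jointly predict the center word instead.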

Wednesday, March 15, 2023

Lecture 5 (13/03/2023, 2 hours, E): first hands-on with PyTorch with language detection

Recap of the Supervised Learning framework, hands on practice with PyTorch on the Language Detection Model: tensors, gradient tracking, the Dataset class, the Module class, the backward step, the training loop, evaluating a model.


Lecture 4 (10/03/2023, 2 hours, E): Machine Learning for NLP and intro to neural networks

Introduction to Supervised, Unsupervised & Reinforcement Learning. The Supervised Learning framework. From real to computational: features extraction and features vectors. Feature Engineering and inferred features. PyTorch vs Tensorflow. The perceptron model. What is Deep Learning, training weights and Backpropagation.
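The perceptron model and its weight updates fit in a few lines. A minimal sketch on a made-up, linearly separable toy dataset (feature vectors with labels in {-1, +1}):

```python
def perceptron_train(data, epochs=10):
    """Classic perceptron: on each misclassified example (y * score <= 0),
    add y * x to the weights and y to the bias."""
    dim = len(data[0][0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in data:
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:                       # mistake-driven update
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
    return w, b

# Hypothetical toy data: label +1 iff the first feature exceeds the second
data = [([2.0, 0.0], 1), ([0.0, 2.0], -1), ([3.0, 1.0], 1), ([1.0, 3.0], -1)]
w, b = perceptron_train(data)
print(w, b)
```

On separable data like this the perceptron converges in a few epochs; the deep networks the lecture moves on to replace this hard update with gradient descent via backpropagation.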

Friday, March 10, 2023

Lecture 3 (06/03/2023, 2.5 hours): classification in NLP, sentiment analysis, logistic regression

Introduction to classification in NLP. The task of Sentiment Analysis. Probabilistic classification. Logistic Regression and its use for classification. Explicit vs. implicit features. The cross-entropy loss function. 
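Logistic regression and the cross-entropy loss mentioned above reduce to two small formulas: the sigmoid turns a score into a probability, and the loss penalizes confident wrong predictions. A sketch with hypothetical, hand-picked weights (not a trained model):

```python
import math

def sigmoid(z):
    """Squash a real-valued score into a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def cross_entropy(y, p):
    """Binary cross-entropy for gold label y in {0, 1} and predicted
    probability p: -log p if y = 1, -log(1 - p) if y = 0."""
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

w, b = [1.5, -0.5], 0.1    # hypothetical learned weights and bias
x, y = [1.0, 2.0], 1       # feature vector and gold label
p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)   # z = 0.6
print(p, cross_entropy(y, p))
```

Training minimizes the average of this loss over the dataset by gradient descent; the loss goes to 0 as p approaches the gold label and blows up as it approaches the wrong one.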


 


Tuesday, March 7, 2023

Lecture 2 (3/3/2023, 3 hours): Introduction to NLP (2/2)

Introduction to NLP in Rome. Introduction to Natural Language Processing: understanding and generation. What is NLP? The Turing Test, criticisms and alternatives. Tasks in NLP and their importance (with examples). Key areas.

 




Lecture 1 (27/02/2023, 2 hours): Introduction to NLP (1/2)

We gave an introduction to the course and the field it focuses on, i.e., Natural Language Processing and its challenges.

Wednesday, February 22, 2023

New class hours! Monday and Friday

  • New class hours!!! Monday 15-17 and Friday 13-17 at DIAG, via Ariosto 25. The NLP class will start on Feb 27th.
  • If you have the former Natural Language Processing course in your curriculum and wish to attend and pass my new multilingual NLP course instead (same programme), there is no problem.



Tuesday, February 14, 2023

News: starting on Feb 27th and your curriculum

  • All courses (including multilingual NLP) will start on Feb 27th, so NLP will start on Feb 28th.
  • If you have the former Natural Language Processing course in your curriculum and wish to pass my new multilingual NLP course instead (same programme), just contact me at <surname>@diag.uniroma1.it



Monday, February 6, 2023

Ready, steady, go!!!

Welcome to the Sapienza NLP course blog 2023! The course is held at DIAG! Cool things about to happen:

  1. The course will contain lots of up-to-date content on deep learning, neural networks, and an improved hands-on with PyTorch and PyTorch Lightning!
  2. For attending students, there will be only TWO homeworks (and no additional duty), one of which will have a delivery deadline at the end of September and will replace the project. Non-attending students, instead, will have to work on three homeworks.
  3. There will be cool challenges throughout the whole course, including the possibility of writing and publishing papers. 
  4. We will start from ChatGPT!!! 
Class hours are: Tuesday 11-13 (room A6) and Friday 9-13 (room A2), DIAG, via Ariosto 25


 

IMPORTANT: The current lecture model is in-person attendance. Please get access to the Facebook group by signing up via the link below. 

IMPORTANT (bis): Note that the course has been renamed into Multilingual Natural Language Processing (if you have NLP in your plan and want to attend my course, please contact me at [surname]@diag.uniroma1.it). 


Please sign up to the NLP class!