Monday, March 24, 2014

Lecture 3: language modeling (2)

The third lecture covered language models. We saw why language models matter and how they let us approximate real language. We discussed n-gram models (unigrams, bigrams, trigrams), how their probabilities are estimated, and the issues that arise. We then covered perplexity and its close relationship to entropy, and introduced smoothing and interpolation techniques to deal with data sparsity.
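To make these ideas concrete, here is a minimal sketch (not from the lecture slides) of a bigram model with add-one (Laplace) smoothing and a perplexity computation; the toy corpus and the `<s>`/`</s>` padding symbols are illustrative assumptions.

```python
import math
from collections import Counter

def train_bigram_counts(sentences):
    """Count unigrams and bigrams, padding each sentence with <s> and </s>."""
    unigrams, bigrams = Counter(), Counter()
    for sent in sentences:
        tokens = ["<s>"] + sent + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def bigram_prob(w1, w2, unigrams, bigrams, vocab_size):
    """Add-one (Laplace) smoothed estimate of P(w2 | w1)."""
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + vocab_size)

def perplexity(sentence, unigrams, bigrams, vocab_size):
    """Perplexity = 2 to the power of the per-word cross-entropy."""
    tokens = ["<s>"] + sentence + ["</s>"]
    log_prob = 0.0
    for w1, w2 in zip(tokens, tokens[1:]):
        log_prob += math.log2(bigram_prob(w1, w2, unigrams, bigrams, vocab_size))
    n = len(tokens) - 1  # number of bigram predictions made
    return 2 ** (-log_prob / n)

# Toy corpus (hypothetical), just to exercise the functions.
corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
uni, bi = train_bigram_counts(corpus)
V = len(uni)  # vocabulary size, including <s> and </s>
print(perplexity(["the", "cat", "sat"], uni, bi, V))
```

Thanks to smoothing, the model assigns nonzero probability even to bigrams never seen in training, so perplexity stays finite on unseen word sequences; seen sequences still score lower (better) than unseen ones.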

In the second part of the class, we discussed the first homework in more detail.
