We introduced N-gram models (unigrams, bigrams, trigrams), their probability estimation, and the problems that arise in practice. We discussed perplexity and its close relationship with entropy, and we introduced smoothing and interpolation techniques to deal with data sparsity.
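To make these ideas concrete, here is a minimal sketch (not from the lecture slides or the homework): a bigram model with add-one (Laplace) smoothing, linearly interpolated with a unigram model, and a perplexity computation on a test sentence. The toy corpus, the interpolation weight lam, and all function names are illustrative assumptions.

import math
from collections import Counter

# Tiny illustrative corpus with sentence boundary markers (an assumption, not course data)
corpus = [
    ["<s>", "the", "cat", "sat", "on", "the", "mat", "</s>"],
    ["<s>", "the", "dog", "sat", "on", "the", "rug", "</s>"],
]

unigram_counts = Counter(w for sent in corpus for w in sent)
bigram_counts = Counter(
    (sent[i], sent[i + 1]) for sent in corpus for i in range(len(sent) - 1)
)
V = len(unigram_counts)              # vocabulary size
N = sum(unigram_counts.values())     # total token count

def p_unigram(w):
    # Add-one smoothed unigram probability
    return (unigram_counts[w] + 1) / (N + V)

def p_bigram(w_prev, w):
    # Add-one smoothed bigram probability P(w | w_prev)
    return (bigram_counts[(w_prev, w)] + 1) / (unigram_counts[w_prev] + V)

def p_interp(w_prev, w, lam=0.7):
    # Linear interpolation of the bigram and unigram estimates
    return lam * p_bigram(w_prev, w) + (1 - lam) * p_unigram(w)

def perplexity(sentence, lam=0.7):
    # Perplexity = exp of the average negative log-probability per predicted token
    log_prob = 0.0
    n = 0
    for i in range(1, len(sentence)):
        log_prob += math.log(p_interp(sentence[i - 1], sentence[i], lam))
        n += 1
    return math.exp(-log_prob / n)

test = ["<s>", "the", "cat", "sat", "on", "the", "rug", "</s>"]
print("Perplexity:", round(perplexity(test), 3))

Without smoothing, the unseen bigram ("cat", "sat" is seen, but e.g. "the rug" after "cat" would not be) would get zero probability and the perplexity would be infinite; add-one smoothing and interpolation keep every probability strictly positive.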
We also discussed homework 1a in detail (see the slides on the class group).