Knowledge Quiz
Test your understanding of this article
1. According to the article, what is one of the primary purposes of 'perplexity' in the context of language models?
2. What problem do 'unseen n-grams' introduce in language models, as mentioned in the article?
3. Which technique is mentioned as a way to 'patch the holes' caused by unseen n-grams?
4. What was a significant finding when applying the MLE bigram formula to the Berkeley Restaurant Project corpus?
