N-Grams Smoothing
After completing this experiment, students will be able to:
- Understand Smoothing in N-gram Models: Explain why smoothing is needed in N-gram language models and describe common smoothing techniques.
- Apply Add-One Smoothing: Calculate smoothed bigram probabilities using Add-One (Laplace) Smoothing.
- Analyze Sparse Data: Identify and address the issue of zero-probability N-grams in sparse bigram tables.
- Compare Probability Distributions: Observe and interpret the effects of smoothing on probability distributions in N-gram models.
- Practice with Interactive Simulation: Gain hands-on experience by filling in bigram probability tables and checking answers interactively.
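The Add-One (Laplace) calculation in the objectives above can be sketched as follows. This is a minimal illustration, not part of the interactive simulation itself; the toy corpus is an assumption chosen only for demonstration. Each smoothed probability is computed as P(w2 | w1) = (C(w1, w2) + 1) / (C(w1) + V), where V is the vocabulary size.

```python
from collections import Counter

# Toy corpus (assumption, for illustration only)
corpus = "i am sam sam i am i do not like green eggs".split()

# Count unigrams and bigrams from the token sequence
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
V = len(unigrams)  # vocabulary size

def smoothed_bigram_prob(w1, w2):
    """Add-One (Laplace) smoothed P(w2 | w1) = (C(w1, w2) + 1) / (C(w1) + V)."""
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + V)

# A bigram never seen in the corpus still receives a small nonzero probability
print(smoothed_bigram_prob("sam", "green"))  # unseen bigram
print(smoothed_bigram_prob("i", "am"))       # seen bigram
```

Note that the unseen bigram ("sam", "green") would have probability zero under unsmoothed maximum-likelihood estimation, but Add-One smoothing assigns it 1 / (C("sam") + V).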
Learning Focus
- Apply Add-One Smoothing to bigram tables
- Understand the impact of smoothing on language model probabilities
- Address zero-probability issues in sparse data
- Practice probability calculations in an interactive environment
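To see the impact of smoothing described in the focus points above, one can compare an unsmoothed (MLE) bigram row against its Add-One smoothed version. The corpus below is a hypothetical example; the key observations are that unseen bigrams move from zero to a small positive probability, probability mass is taken from seen bigrams, and the smoothed row still sums to 1.

```python
from collections import Counter

# Hypothetical toy corpus (assumption, for illustration only)
tokens = "the cat sat on the mat the cat ran".split()

unigrams = Counter(tokens)
bigrams = Counter(zip(tokens, tokens[1:]))
vocab = sorted(unigrams)
V = len(vocab)

w1 = "the"
row = {}
for w2 in vocab:
    mle = bigrams[(w1, w2)] / unigrams[w1]                   # unsmoothed: zero for unseen bigrams
    laplace = (bigrams[(w1, w2)] + 1) / (unigrams[w1] + V)   # Add-One smoothed
    row[w2] = laplace
    print(f"P({w2} | {w1}): MLE={mle:.3f}  Add-One={laplace:.3f}")

# The smoothed row is still a valid probability distribution
print(f"row sum = {sum(row.values()):.3f}")
```

Filling in a table like this row by row, and checking that each smoothed row sums to 1, mirrors the exercise performed in the interactive simulation.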