Hi, 

the goal of a language model is to predict which word will follow
any given sequence of words. Perplexity measures how well the LM
predicts the next word given some context. A perplexity of, say, 100
means that on average the LM guesses the right word with probability 1/100.
This number should be compared against the size of the vocabulary: PP=100
is a pretty good figure for a very large vocabulary, e.g.
1,500,000 word forms, but it does not look so good for a small vocabulary,
e.g. 15,000 words. The lower the perplexity, the more useful the LM.
In MT a better language model often means better (more fluent) translations.
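
To make the 1/100 intuition concrete, here is a minimal Python sketch
(the `perplexity` helper and the toy probabilities are illustrative,
not taken from any particular toolkit):

```python
import math

def perplexity(word_probs):
    """Perplexity over a test sequence, given the probability the model
    assigned to each word: PP = exp(-(1/N) * sum(log p_i))."""
    n = len(word_probs)
    return math.exp(-sum(math.log(p) for p in word_probs) / n)

# A model that assigns probability 1/100 to every test word has
# perplexity 100 -- it is as uncertain as a fair 100-way guess.
print(perplexity([1 / 100] * 5))
```

The geometric-mean form is why perplexity reads as an "effective branching
factor": a uniform model over a vocabulary of size V has perplexity exactly V.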

Marcello



> On 07 Jan 2015, at 09:47, Leandra Vogel <[email protected]> wrote:
> 
> hello mt folks,
> 
> can someone please explain (if possible in layman's terms) what perplexity 
> means and why it is important? 
> 
> maybe also give a simple example?  
> 
> Best, 
> Lea
> _______________________________________________
> Mt-list site list
> [email protected]
> http://lists.eamt.org/mailman/listinfo/mt-list

