Hi,

Chris Dyer wrote:
> I allow pass through of all words, with a penalty that is also learned
> by MERT.
Interesting stuff. Do you have results published on this?

> With the open-class LM, I use the -unk option in SRILM,
> which reserves a bit of probability mass for OOVs. What exactly it
> does is a bit unclear to me (it's more than just replacing singletons
> with <unk>, but that's probably a reasonable approximation).

I would assume that it does the usual discounting (Good-Turing
or Kneser-Ney), and gives the discounted probability mass to
<unk>.
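The leading idea behind that guess can be illustrated with a toy unigram sketch. In Good-Turing terms, the total probability of unseen events is estimated as N1/N (singleton types over total tokens); giving that mass to <unk> and renormalizing the seen words looks roughly like the following. This is only an illustration of the discounting idea, not what SRILM's -unk option actually implements (SRILM discounts per count class and uses backoff):

```python
from collections import Counter

def unigram_probs_with_unk(tokens):
    """Toy sketch: reserve Good-Turing-style unseen mass for <unk>.

    The probability of unseen events is estimated as N1/N, where
    N1 = number of word types seen exactly once and N = total
    token count. That mass goes to <unk>, and the observed
    relative frequencies are scaled down to compensate.
    """
    counts = Counter(tokens)
    n = sum(counts.values())
    n1 = sum(1 for c in counts.values() if c == 1)
    p_unk = n1 / n                       # mass reserved for OOVs
    scale = 1.0 - p_unk                  # renormalize seen words
    probs = {w: scale * c / n for w, c in counts.items()}
    probs["<unk>"] = p_unk
    return probs

probs = unigram_probs_with_unk("a a a b b c d".split())
```

With this corpus, "c" and "d" are singletons, so <unk> receives 2/7 of the mass and the whole distribution still sums to one.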

-phi
_______________________________________________
Moses-support mailing list
[email protected]
http://mailman.mit.edu/mailman/listinfo/moses-support