Hi,
You could do this, avoiding heap allocation:
float moses_score_unigram(lmi_data_t data, wid_t a) {
  const lm::base::Model &model = *((LanguageModel*)data)->GetKen();
  lm::ngram::State ignored;
  float score = model.Score(model.NullContextMemory(), a.m_wid, &ignored);
  // Probabilities are in log10 space, so convert with 10^score
  // (note: pow(10.0, score), not pow(score, 10.0)).
  return pow(10.0, score);
}
Also, if you're only interested in unigram scores, I strongly suggest a
separate unigram language model: Kneser-Ney smoothing, modified or
unmodified, assumes that you've backed off from a higher order to the
unigram, and that assumption doesn't hold when you query unigrams
directly.
Kenneth
On 02/14/2012 10:49 AM, Sylvain Raybaud wrote:
> float moses_score_unigram(lmi_data_t data, wid_t a) {
> const lm::base::Model* model = ((LanguageModel*)data)->GetKen();
> lm::ngram::State * inState = new lm::ngram::State();
> inState->length = 0;
> lm::ngram::State * outState = new lm::ngram::State();
>
> float score = model->Score((const void*)&inState, a.m_wid,
> (void*)&outState);
>
> delete inState;
> delete outState;
>
> return exp(score);
> }
_______________________________________________
Moses-support mailing list
[email protected]
http://mailman.mit.edu/mailman/listinfo/moses-support