Hi Marcin,

We currently have Vowpal Wabbit integrated as a discriminative phrase lexicon model (work at the Hopkins workshop on domain adaptation) and as a discriminative target word selection model (work at the 2013 MT Marathon).

This is different from sparse features. Sparse feature weights are trained on the dev set to maximize BLEU; we instead train on the training corpus (the same corpus the phrases are extracted from) to maximize classification accuracy, i.e. we try to make the classifier select the correct target phrase or word given the source context. Each of our classifiers is integrated as one "dense" feature (its output is a probability distribution, like the phrase-based p(e|f)), and the single weight of that dense feature is tuned with MERT, MIRA, or PRO along with the other dense feature weights to maximize BLEU on the dev set. The code for the phrase-based case is available in the damt_phrase branch; a release for hierarchical Moses will follow.

Cheers,
Alex

On Tue, Jan 21, 2014 at 10:17 AM, Marcin Junczys-Dowmunt <[email protected]> wrote:
> Hi,
> I remember that during the last MT Marathon someone was working on
> integrating Vowpal Wabbit into Moses, mainly for morphology handling. If
> those people are reading the list, I'd like to ask what the current
> status of this is. The idea seems somewhat similar in its applications
> to sparse features in Moses, doesn't it?
> Best,
> Marcin
> _______________________________________________
> Moses-support mailing list
> [email protected]
> http://mailman.mit.edu/mailman/listinfo/moses-support
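
[Editor's note: a purely illustrative sketch of the setup Alex describes, not the actual damt_phrase code. The phrase inventory, feature names, and helper functions below are invented for the example. VW's one-against-all (--oaa) multiclass input format, "<label> | features", is used for the training lines; the classifier's probability for the chosen target phrase then enters Moses's log-linear model as a single dense feature.]

```python
import math

# Toy inventory of candidate target phrases for one source phrase
# (invented data, for illustration only).
TARGET_PHRASES = ["house", "home", "building"]

def to_vw_example(correct_index, context_features):
    """Render one VW --oaa training line: '<label> | f1 f2 ...'.
    Labels in --oaa are 1-based class indices."""
    feats = " ".join(context_features)
    return f"{correct_index + 1} | {feats}"

# Training examples: the classifier learns to select the correct target
# phrase given source-context features (here, neighbouring source words).
examples = [
    to_vw_example(0, ["src=Haus", "left=das", "right=steht"]),
    to_vw_example(1, ["src=Haus", "left=nach", "right=gehen"]),
]

# At decoding time, the classifier's probability for the hypothesized
# target phrase becomes one dense feature value (a log probability),
# analogous to the phrase-based p(e|f) score.
def dense_feature_score(p_target_given_source):
    return math.log(p_target_given_source)

def loglinear_score(weighted_features):
    """Standard log-linear combination: sum_i lambda_i * h_i.
    The lambda for the classifier feature is what MERT/MIRA/PRO tunes."""
    return sum(w * h for w, h in weighted_features)
```

The point of the sketch is the division of labour: classification accuracy is optimized on the training corpus when fitting the VW model, while only the single feature weight lambda is tuned on dev to maximize BLEU.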
