Are there any such placeholders in your language modeling data and your
parallel training data? If not, all the models are going to treat them as
unknown words. In the case of the language model, it doesn't surprise me too
much that the placeholders all get pushed together, as that will produce fewer
discontiguous subsequences, which the language model will prefer.
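Since identical placeholders are indistinguishable to the models, one common workaround is to give each tag a unique, numbered placeholder before decoding and restore the tags afterward. This is only a sketch of that pre/post-processing idea, not a built-in Moses mechanism; the function names and the {0}/{1} placeholder format are my own:

```python
import re

def mask_tags(text):
    """Replace each HTML tag with a distinct numbered placeholder.

    Returns the masked text plus the list of original tags, so the
    mapping can be inverted after decoding.
    """
    tags = []
    def repl(match):
        tags.append(match.group(0))
        return "{%d}" % (len(tags) - 1)
    return re.sub(r"</?\w+[^>]*>", repl, text), tags

def unmask_tags(text, tags):
    """Restore the original tags from their numbered placeholders."""
    return re.sub(r"\{(\d+)\}", lambda m: tags[int(m.group(1))], text)

masked, tags = mask_tags("<a>Processor</a>")
# masked is "{0}Processor{1}": the two placeholders are now distinct
# tokens, so the models have no reason to collapse them together.
restored = unmask_tags(masked, tags)
# restored is "<a>Processor</a>" again.
```

Note the decoder can still reorder {0} and {1} relative to each other; keeping them anchored to specific words would additionally require something like Moses's XML markup on the input.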
- John Burger
MITRE
On Jul 31, 2012, at 03:05, Henry Hu wrote:
> Hi,
>
> I use a model to translate English to French. First, I replaced HTML
> tags such as <a> and <b> with the placeholder {}, like this:
>
> {}Processor{}
>
> Then I decoded. To my confusion, I got this result:
>
> {}{} processeur
>
> instead of {}processeur{}. Why did the placeholder move? How can I
> keep it in place? Thanks for any suggestions.
>
> Henry
> _______________________________________________
> Moses-support mailing list
> [email protected]
> http://mailman.mit.edu/mailman/listinfo/moses-support