Guys,

I have a question for the mathematicians that you all are :)

I have been working with and testing Moses as well as Groundhog for 
months now. When I compare results (where comparability is possible: 
same corpus, in-domain, and so on), I do not see much difference 
between the two systems.

So when I read the statement below (from 
http://www.cis.uni-muenchen.de/~fraser/intensive_nmt_2015/), I am 
confused. Is this really a new paradigm?

I have the feeling that, at the end of the day, it comes down to the 
probability of one word following another, word after word, given some 
input sequence. Whether the machine learning happens in deep neural 
networks or in huge tables, don't we get the same results if the 
underlying concepts are similar?
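To make the "word after word" intuition concrete: both approaches can be seen as assigning a sentence probability via the chain rule, multiplying conditional next-word probabilities. A toy sketch (the function names and the uniform toy distribution are mine, purely for illustration):

```python
import math

def sentence_log_prob(words, next_word_prob):
    """Sum of log p(word_t | words_<t) over the sentence -- the chain
    rule that both SMT and NMT models factorize over, in some form."""
    total = 0.0
    for t, w in enumerate(words):
        total += math.log(next_word_prob(tuple(words[:t]), w))
    return total

# A fake conditional distribution: uniform over a tiny vocabulary.
VOCAB = ["the", "cat", "sat"]
uniform = lambda history, word: 1.0 / len(VOCAB)

print(sentence_log_prob(["the", "cat", "sat"], uniform))
# 3 * log(1/3)
```

Where the two paradigms differ is in how `next_word_prob` is parameterized, not in this factorization itself.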


I would much appreciate your feedback.
Cheers,
Vincent



Neural Machine Translation (NMT) is a new paradigm in data-driven 
machine translation. Previous generation Statistical Machine Translation 
(SMT) systems are built using a collection of heuristic models, 
typically combined in a log-linear model with a small number of 
parameters. In Neural Machine Translation, the entire translation 
process is posed as an end-to-end supervised classification problem, 
where the training data consists of pairs of sentences. In SMT systems, 
word alignment is carried out and then fixed, and various sub-models 
are estimated from the word-aligned data; this is not the case in NMT, 
where fixed word-alignments are not used and the full 
sequence-to-sequence task is instead handled in one model.
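The "collection of heuristic models combined in a log-linear model" can be sketched as a weighted sum of feature scores; the feature names and values below are invented toy examples, not actual Moses features or weights:

```python
import math

def log_linear_score(features, weights):
    """SMT-style log-linear combination:
    score(e | f) = sum_k lambda_k * h_k(e, f),
    where each h_k is one sub-model's score for a candidate translation."""
    return sum(weights[k] * h for k, h in features.items())

# Toy feature scores (log-probabilities) for one candidate translation.
features = {
    "translation_model": math.log(0.4),
    "language_model": math.log(0.2),
    "word_penalty": -3.0,  # e.g. minus the target length
}
# The small set of tunable parameters the quoted text refers to.
weights = {"translation_model": 1.0, "language_model": 0.5, "word_penalty": 0.1}

print(log_linear_score(features, weights))
```

In NMT there is no such hand-built feature decomposition: a single network produces p(word_t | previous words, source) directly, and all its parameters are trained jointly.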

The course will work backwards from the current state of the art in NMT, 
which is the "ensemble" system submitted by the Bengio group in Montreal 
to the 2015 shared task on machine translation (Jean et al. 2015, see 
below, with some additional details to be published). Depending on the 
background of the participants, some basics of SMT may also be covered.
_______________________________________________
Moses-support mailing list
[email protected]
http://mailman.mit.edu/mailman/listinfo/moses-support
