I had a few good conversations over dinner with this team at AMTA in Austin
in October.
They seem to be in the interesting position where their work is good but in
danger of being superseded by neural MT just as it comes out of the gate. It
clearly has benefits over NMT and is easier to adopt, but it may not be the
winner over the long run.

Here's the link to their AMTA tutorial:
<https://amtaweb.org/wp-content/uploads/2016/11/MMT_Tutorial_FedericoTrombetti_wide-cover.pdf>

-John

On Thu, Dec 1, 2016 at 10:17 AM, Mattmann, Chris A (3010) <
chris.a.mattm...@jpl.nasa.gov> wrote:

> Wow, seems like this kind of overlaps with BigTranslate as well... thanks
> for passing along, Matt
>
> ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
> Chris Mattmann, Ph.D.
> Principal Data Scientist, Engineering Administrative Office (3010)
> Manager, Open Source Projects Formulation and Development Office (8212)
> NASA Jet Propulsion Laboratory Pasadena, CA 91109 USA
> Office: 180-503E, Mailstop: 180-503
> Email: chris.a.mattm...@nasa.gov
> WWW:  http://sunset.usc.edu/~mattmann/
> ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
> Director, Information Retrieval and Data Science Group (IRDS)
> Adjunct Associate Professor, Computer Science Department
> University of Southern California, Los Angeles, CA 90089 USA
> WWW: http://irds.usc.edu/
> ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
>
>
> On 12/1/16, 4:47 AM, "Matt Post" <p...@cs.jhu.edu> wrote:
>
>     Just came across this, and it's really cool:
>
>         https://github.com/ModernMT/MMT
>
>     See the README for some great use cases. I'm surprised I'd never heard
> of this before, as it's EU-funded and associated with the University of Edinburgh.
>
>
