Hi All,

Here is an update for the Java Inference API design doc on CWIKI: 
Currently, the MXNet Java bindings are an extension of the MXNet Scala API that allows 
users to use Java to do inference with MXNet. Users will be able to import a 
pre-trained MXNet model and run single or batch inference on it.
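To make the idea concrete, here is a rough, self-contained sketch of what such an API surface could look like. All class and method names below (Predictor, predict, predictBatch, ScalingPredictor) are hypothetical placeholders, not the actual proposed MXNet API; the "model" is a trivial stub standing in for a pre-trained network loaded from disk.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class InferenceSketch {

    // Hypothetical predictor interface: single inference delegates to the
    // batch path, so users get both with one implementation.
    interface Predictor {
        List<float[]> predictBatch(List<float[]> inputs);

        default float[] predict(float[] input) {
            return predictBatch(Arrays.asList(input)).get(0);
        }
    }

    // Stub "model" that just scales each element, standing in for a
    // pre-trained MXNet model imported from a file.
    static class ScalingPredictor implements Predictor {
        private final float factor;

        ScalingPredictor(float factor) { this.factor = factor; }

        @Override
        public List<float[]> predictBatch(List<float[]> inputs) {
            return inputs.stream()
                .map(in -> {
                    float[] out = new float[in.length];
                    for (int i = 0; i < in.length; i++) out[i] = in[i] * factor;
                    return out;
                })
                .collect(Collectors.toList());
        }
    }

    public static void main(String[] args) {
        Predictor p = new ScalingPredictor(2.0f);
        // Single inference reuses the batch path under the hood.
        System.out.println(Arrays.toString(p.predict(new float[]{1f, 2f, 3f})));
    }
}
```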

Please take a look at the design document again and feel free to leave any 
thoughts you have.


On 5/10/18, 11:08 AM, "Andrew Ayres" <andrew.f.ay...@gmail.com> wrote:

    Hi Kellen,
    Thanks for the feedback. You bring up an interesting idea about the
    dependencies. I'll add that to the list of things to look into.
    As for the threading, my current thinking is that we implement a dispatcher
    thread like the one suggested in the Scala threading discussion.
    I would definitely like to hide such complexities from the user.
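A minimal sketch of that dispatcher idea, assuming the engine is not safe to call from multiple threads: all engine calls are funneled through a single worker thread, and callers on any thread receive a Future instead of touching the engine directly. The names here (DispatcherSketch, engineForward, predict) are illustrative, not part of any actual MXNet API.

```java
import java.util.Arrays;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class DispatcherSketch {
    // Single-threaded executor acts as the dispatcher; only its one worker
    // thread ever calls into the engine.
    private final ExecutorService dispatcher = Executors.newSingleThreadExecutor();

    // Stand-in for a call into the native (not thread-safe) engine.
    private float[] engineForward(float[] input) {
        float[] out = new float[input.length];
        for (int i = 0; i < input.length; i++) out[i] = input[i] + 1f;
        return out;
    }

    // Public API: safe to call from any thread; work is queued onto the
    // dispatcher thread and the caller gets a Future for the result.
    public Future<float[]> predict(float[] input) {
        return dispatcher.submit(() -> engineForward(input));
    }

    public void shutdown() {
        dispatcher.shutdown();
    }

    public static void main(String[] args) throws Exception {
        DispatcherSketch api = new DispatcherSketch();
        float[] result = api.predict(new float[]{1f, 2f}).get();
        System.out.println(Arrays.toString(result));
        api.shutdown();
    }
}
```

The user-facing call looks synchronous if the API blocks on the Future internally, which keeps the threading complexity hidden as discussed above.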
    On Thu, May 10, 2018 at 3:22 AM, kellen sunderland <
    kellen.sunderl...@gmail.com> wrote:
    > Hey Andrew, thanks for the write-up.  I think having a Java binding will
    > be very useful for enterprise users.  The doc looks good, but two things I'm
    > curious about:
    > How are you planning to handle thread-safe inference?  It'll be great if
    > you can hide the complexity of dealing with dispatch threading from users.
    > The other thing I think a solid Java API could provide is a limited number
    > of dependencies.  There are some simple things we can do to make this happen
    > (create a statically linked, portable .so), but there's also some complexity
    > around minimizing MXNet's dependencies.  For example, we'll likely want to
    > release MKL-flavoured binaries, and we should support a few versions of
    > CUDA.  We could try to have one version that has an absolute minimum
    > of dependencies (maybe statically linking with OpenBLAS).  It might be good
    > to document exactly the packages you're planning to release, and give some
    > more details about what the dependencies for the packages would be.
    > Many thanks for looking into this, I think it'll be a big improvement for
    > many of our users.
    > -Kellen
    > On Thu, May 10, 2018, 12:57 AM Andrew Ayres <andrew.f.ay...@gmail.com>
    > wrote:
    > > Hi all,
    > >
    > > There has been a lot of interest expressed in having a Java API for 
    > > inference. The general idea is that after training a model using Python,
    > > users would like to be able to load the model for inference inside their
    > > existing production eco-system.
    > >
    > > We've begun exploring a few options for the implementation at <
    > > https://cwiki.apache.org/confluence/display/MXNET/
    > MXNet+Java+Inference+API
    > > >
    > > and would appreciate any insights/feedback.
    > >
    > > Thanks,
    > > Andrew
    > >
