Hello,

the approach taken by Spark is described here [1].

As far as I can see, this has the great disadvantage that the Java API would force Scala as a dependency onto the Java users. For a library it is always a great advantage to have few dependencies, or none at all. In our case it could be quite realistic to have a thin wrapper around the C API without needing any other dependencies (or only dependencies which can't be avoided). The JNI layer could easily be shared between the Java and Scala APIs. As far as I understand, the JNI layer in the Scala API is private anyway, so a change to it wouldn't require changing the public part of the Scala API.

What do you think?

Jörn

[1] https://cwiki.apache.org/confluence/display/SPARK/Java+API+Internals

On Wed, Aug 16, 2017 at 3:39 PM, YiZhi Liu <[email protected]> wrote:
> Hi Joern,
>
> I suggest building the Java API as a wrapper around the Scala API,
> re-using most of the procedures, following the Java API in Apache Spark.
>
> 2017-08-16 18:21 GMT+08:00 Joern Kottmann <[email protected]>:
>> Hello all,
>>
>> I would like to propose the addition of a Java API to MXNet.
>>
>> There has been some previous work done for the Scala API, and it makes
>> sense to at least share the JNI layer between the two.
>>
>> The Java API should probably be aligned with the Python API (and the
>> others which already exist), with a few changes to give it a native
>> Java feel.
>>
>> As far as I understand, there are multiple people interested in working
>> on this, and it would be good to come up with a written proposal on
>> how things should be done.
>>
>> My motivation is to get a Java API which can be used by Apache OpenNLP
>> to solve various NLP tasks using Deep Learning based approaches, and I
>> am also interested in working on MXNet.
>>
>> Jörn
>
>
>
> --
> Yizhi Liu
> DMLC member
> Technical Manager
> Qihoo 360 Inc, Shanghai, China
