lanking520 edited a comment on issue #17783: [RFC] MXNet 2.0 JVM Language development
URL: https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-595923396

I would propose Options 3 and 4. DJL is a new Java framework that builds on top of any engine. It gives Java developers a close-to-NumPy experience in Java and introduces a Java interface for training and running inference on different ML/DL models. In the engine layer, we implemented an MXNet-specific engine that lets users reach most of the up-to-date functionality:

### MXNet specific

- deep-numpy ops (MXNet): the NumPy-compatible operators introduced in MXNet 1.6
- autograd (MXNet): imperative training is supported through automatic gradient collection
- Block concept (MXNet Gluon): Java blocks for training and inference
- CachedOp (MXNet): the new symbolic inference engine

### DJL

- Full training support: we support imperative/symbolic training in Java.
- Memory collection: the NDManager was introduced to track and collect memory eagerly; it has sustained 100-hour stable runs in production.
- MKLDNN: all ops and NDArrays are built on top of MKLDNN acceleration.
- ModelZoo: the model zoo is a central hub for model storage; it contains the model files along with pre-processing and post-processing logic.

### Maintenance

- Better loading experience: all native MXNet files are uploaded to Maven, so users can easily switch between engine versions.
- JNA simplification: the JNI layers are generated, which saves maintenance time and moves more logic into the Java layer.

Given the benefits listed above, I would recommend Option 3, the DJL path, since it already covers most up-to-date MXNet features and supports all the symbolic/imperative training and inference combinations. For Option 4, I am also thinking of bringing our JNA layer back to MXNet so the community can build its own Java/Scala frontend if they don't want to use DJL.
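
To make the NDManager memory model, the NumPy-like operators, and imperative autograd concrete, here is a minimal sketch against the DJL API (`ai.djl.ndarray.NDManager`, `ai.djl.ndarray.NDArray`, `ai.djl.training.GradientCollector`). Method names follow a recent DJL release and may differ slightly from the API that existed when this comment was written; it is an illustration, not the definitive usage.

```java
import ai.djl.engine.Engine;
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.ndarray.types.Shape;
import ai.djl.training.GradientCollector;

public class NdArraySketch {
    public static void main(String[] args) {
        // The NDManager owns every NDArray created from it; closing the
        // manager (try-with-resources) releases the native MXNet memory.
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray a = manager.create(new float[] {1f, 2f, 3f, 4f}, new Shape(2, 2));
            NDArray b = manager.ones(new Shape(2, 2));

            // NumPy-style operators, dispatched to the MXNet engine
            NDArray sum = a.add(b);
            NDArray prod = a.matMul(b);
            System.out.println(sum);
            System.out.println(prod);

            // Imperative autograd: record operations, then backpropagate.
            a.setRequiresGradient(true); // older DJL releases used attachGradient()
            try (GradientCollector collector = Engine.getInstance().newGradientCollector()) {
                NDArray y = a.mul(a).sum(); // y = sum(a^2)
                collector.backward(y);
            }
            System.out.println(a.getGradient()); // dy/da = 2a
        } // all native memory tracked by `manager` is freed here
    }
}
```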
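The ModelZoo and the symbolic inference path can be exercised through DJL's Criteria API. The sketch below is likewise an assumption-laden illustration: the package names and builder methods are taken from a recent DJL release, and the image file path is a placeholder.

```java
import java.nio.file.Paths;

import ai.djl.Application;
import ai.djl.inference.Predictor;
import ai.djl.modality.Classifications;
import ai.djl.modality.cv.Image;
import ai.djl.modality.cv.ImageFactory;
import ai.djl.repository.zoo.Criteria;
import ai.djl.repository.zoo.ZooModel;

public class ZooInferenceSketch {
    public static void main(String[] args) throws Exception {
        // Describe the model we want; the zoo resolves and downloads it.
        Criteria<Image, Classifications> criteria = Criteria.builder()
                .setTypes(Image.class, Classifications.class)
                .optApplication(Application.CV.IMAGE_CLASSIFICATION)
                .optEngine("MXNet") // pick the MXNet engine explicitly
                .build();

        Image img = ImageFactory.getInstance().fromFile(Paths.get("kitten.jpg"));

        // The zoo bundles model files plus pre/post-processing (the Translator),
        // so the predictor returns Classifications rather than raw NDArrays.
        try (ZooModel<Image, Classifications> model = criteria.loadModel();
             Predictor<Image, Classifications> predictor = model.newPredictor()) {
            Classifications result = predictor.predict(img);
            System.out.println(result);
        }
    }
}
```

Because the native MXNet binaries are published as Maven artifacts under the `ai.djl.mxnet` group, switching engine versions is a dependency change in the build file rather than a manual library install.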
