Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-07-31 Thread Samuel Audet
> We are looking for a robust solution for MXNet Java developers to use, one 
> that is owned and maintained by the Apache MXNet community. I would be more 
> than happy to see you contribute the source code that generates the MXNet 
> JavaCPP package to this repo, so that we can own the maintenance and be 
> responsible to end users for the package's reliability.
> 
> At the beginning, we discussed several ways to preserve a low-level Java API 
> for MXNet that anyone who uses Java can start with. Most of the problems lay 
> in the ownership and maintenance part. I have listed the JavaCPP approach as 
> option 5 so we can see which one works best in the end.

Sounds good, thanks! If you have any specific concerns about the above, please 
let me know. JNA seems to be maintained by a single person with apparently no 
connections to the AI industry 
(https://dzone.com/articles/scratch-netbeans-itch-matthias). As part of my 
work, I already maintain APIs mainly for OpenCV, FFmpeg, ONNX Runtime, and 
TensorFlow at the moment, along with others that vary over time, and MXNet 
could eventually become part of those. I also have users paying for commercial 
support of proprietary libraries. So I think JavaCPP is the better option here, 
but I'm obviously biased. :)

-- 
You are receiving this because you were mentioned.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-667041968

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-07-29 Thread Yuan Tang
This is a great discussion. Thanks @lanking520 for initiating it. Perhaps we 
can define some key metrics here so we can compare the solutions later?

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-665775006

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-07-29 Thread Lanking
@saudet Thanks for your reply. I still have a concern about the first question.

You mentioned:
> We can go either way, but I found that contemporary projects like 
> Deeplearning4j, MXNet, PyTorch, or TensorFlow that need to develop 
> high-level APIs on top of something like JavaCPP prefer to have control over 
> everything in their own repositories, and use JavaCPP pretty much like we 
> would use cython or pybind11 with setuptools for Python.

We are looking for a robust solution for MXNet Java developers to use, one 
that is owned and maintained by the Apache MXNet community. I would be more 
than happy to see you contribute the source code that generates the MXNet 
JavaCPP package to this repo, so that we can own the maintenance and be 
responsible to end users for the package's reliability.

At the beginning, we discussed several ways to preserve a low-level Java API 
for MXNet that anyone who uses Java can start with. Most of the problems lay 
in the ownership and maintenance part. I have listed the JavaCPP approach as 
option 5 so we can see which one works best in the end.

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-665771813

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-07-25 Thread Samuel Audet
> ## What's missing
> 
> javacpp-presets-mxnet doesn't expose APIs from nnvm/c_api.h (some of the 
> current python/gluon API depends on APIs in nnvm/c_api.h)

I've added that the other day, thanks to @frankfliu for pointing this out: 
https://github.com/bytedeco/javacpp-presets/commit/976e6f7d307b3f3855f39413c494d8f482c9adf6

> See javadoc: http://bytedeco.org/javacpp-presets/mxnet/apidocs/
> 
> 1. The Java class name is “mxnet”, which does not follow Java naming 
> conventions

That's not hardcoded. We can use whatever name we want for that class.

> 2. Each pointer has a corresponding Java class, which is arguable. It's 
> necessary to expose them as strongly typed classes if they are meant to be 
> used directly by end developers. But they really should only be internal 
> implementation details of the API. It's overkill to expose them as a type 
> instead of just a pointer.

We can map everything to `Pointer`, that's not a problem either.
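For illustration, here is a minimal sketch (with hypothetical names; this is not MXNet's or JavaCPP's actual API) of the trade-off being discussed: exposing every C handle as its own strongly typed class versus mapping them all to one opaque pointer:

```java
// Hypothetical sketch contrasting the two mapping styles discussed above.
public class HandleMappingSketch {
    // Style A: every C handle surfaces as the same opaque value (what mapping
    // everything to a single Pointer type would look like); modeled as a long.
    static long createSymbolOpaque() { return 42L; }

    // Style B: a distinct strongly typed wrapper per C handle type, which
    // prevents passing, say, an NDArray handle where a Symbol is expected,
    // at the cost of generating many small classes.
    static final class SymbolHandle {
        final long address;
        SymbolHandle(long address) { this.address = address; }
    }
    static SymbolHandle createSymbolTyped() { return new SymbolHandle(42L); }

    public static void main(String[] args) {
        System.out.println(createSymbolOpaque() == createSymbolTyped().address);
    }
}
```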

> 3. All the classes (except mxnet.java) are hand-written.

No, they are not. Everything in the `src/gen` directory here is generated at 
build time:
https://github.com/bytedeco/javacpp-presets/tree/master/mxnet/src/gen/java/org/bytedeco/mxnet

> 4. API mappings are hand-coded as well.

If you're talking about this file, yes, that's the only thing that is written 
manually:
https://github.com/bytedeco/javacpp-presets/blob/master/mxnet/src/main/java/org/bytedeco/mxnet/presets/mxnet.java
(The formatting is a bit crappy since I haven't touched it in a while, but we 
can make it look prettier, like this: 
https://github.com/bytedeco/javacpp-presets/blob/master/onnxruntime/src/main/java/org/bytedeco/onnxruntime/presets/onnxruntime.java)

> ## Performance
> 
> JavaCPP native library loading takes a long time: it takes an average of _2.6 
> seconds_ to initialize libmxnet.so with JavaCPP.
> 
> Loader.load(org.bytedeco.mxnet.global.mxnet.class);

Something's wrong, that takes less than 500 ms on my laptop, and that includes 
loading OpenBLAS, OpenCV, and a lookup for CUDA and MKL, which can obviously be 
optimized... In any case, we can debug that later to see what is going wrong on 
your end.
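For what it's worth, a minimal way to time this step in isolation might look like the following sketch; the actual `Loader.load(...)` call is left commented out because it requires the JavaCPP artifacts on the classpath:

```java
// Sketch: time a single initialization step in milliseconds.
public class LoadTimingSketch {
    static long timeMillis(Runnable task) {
        long start = System.nanoTime();
        task.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        long elapsed = timeMillis(() -> {
            // Loader.load(org.bytedeco.mxnet.global.mxnet.class);
        });
        System.out.println("load took " + elapsed + " ms");
    }
}
```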

> ## Issues
> 
> The open source code on github doesn't match the binary release on maven 
> central:
> 
> * the maven group and the java package name are different.

Both the group ID and the package names are `org.bytedeco`, but in any case, if 
that gets maintained somewhere here, I imagine it would be changed to something 
like `org.apache.mxnet.xyz.internal.etc`

> * c predict API is not included in maven version

Yes it is: 
http://bytedeco.org/javacpp-presets/mxnet/apidocs/org/bytedeco/mxnet/global/mxnet.html
 
> * Example code doesn't work with maven artifacts, it can only build with 
> snapshot version locally.

https://github.com/bytedeco/javacpp-presets/tree/master/mxnet/samples works 
fine for me on Linux:
```
$ mvn -U clean compile exec:java -Djavacpp.platform.custom 
-Djavacpp.platform.host -Dexec.args=apple.jpg
...
Downloading from sonatype-nexus-snapshots: 
https://oss.sonatype.org/content/repositories/snapshots/org/bytedeco/mxnet-platform/1.7.0.rc1-1.5.4-SNAPSHOT/maven-metadata.xml
Downloaded from sonatype-nexus-snapshots: 
https://oss.sonatype.org/content/repositories/snapshots/org/bytedeco/mxnet-platform/1.7.0.rc1-1.5.4-SNAPSHOT/maven-metadata.xml
 (1.3 kB at 2.5 kB/s)
Downloading from sonatype-nexus-snapshots: 
https://oss.sonatype.org/content/repositories/snapshots/org/bytedeco/mxnet-platform/1.7.0.rc1-1.5.4-SNAPSHOT/mxnet-platform-1.7.0.rc1-1.5.4-20200725.115300-20.pom
Downloaded from sonatype-nexus-snapshots: 
https://oss.sonatype.org/content/repositories/snapshots/org/bytedeco/mxnet-platform/1.7.0.rc1-1.5.4-SNAPSHOT/mxnet-platform-1.7.0.rc1-1.5.4-20200725.115300-20.pom
 (4.7 kB at 9.3 kB/s)
Downloading from sonatype-nexus-snapshots: 
https://oss.sonatype.org/content/repositories/snapshots/org/bytedeco/javacpp-presets/1.5.4-SNAPSHOT/maven-metadata.xml
Downloaded from sonatype-nexus-snapshots: 
https://oss.sonatype.org/content/repositories/snapshots/org/bytedeco/javacpp-presets/1.5.4-SNAPSHOT/maven-metadata.xml
 (610 B at 1.5 kB/s)
Downloading from sonatype-nexus-snapshots: 
https://oss.sonatype.org/content/repositories/snapshots/org/bytedeco/javacpp-presets/1.5.4-SNAPSHOT/javacpp-presets-1.5.4-20200725.155410-6590.pom
Downloaded from sonatype-nexus-snapshots: 
https://oss.sonatype.org/content/repositories/snapshots/org/bytedeco/javacpp-presets/1.5.4-SNAPSHOT/javacpp-presets-1.5.4-20200725.155410-6590.pom
 (84 kB at 91 kB/s)
Downloading from sonatype-nexus-snapshots: 
https://oss.sonatype.org/content/repositories/snapshots/org/bytedeco/opencv-platform/4.4.0-1.5.4-SNAPSHOT/maven-metadata.xml
Downloaded from sonatype-nexus-snapshots: 
https://oss.sonatype.org/content/repositories/snapshots/org/bytedeco/opencv-platform/4.4.0-1.5.4-SNAPSHOT/maven-metadata.xml
 (1.2 kB at 2.6 kB/s)
...
```

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-07-25 Thread Samuel Audet
> @saudet Thanks for your proposal. I have a few questions I would like to ask 
> you:
> 
> 1. If we adopt the JavaCPP package, how will it be consumed? Under bytedeco 
> or Apache MXNet? Essentially, from our previous discussion, we really don't 
> want another 3rdparty check-in.

We can go either way, but I found that projects like MXNet or TensorFlow that 
need to develop high-level APIs on top of something like JavaCPP prefer to have 
control over everything in their own repositories, and use JavaCPP pretty much 
like we would use pybind and pip for Python.

I started the JavaCPP Presets because for projects such as OpenCV, FFmpeg, 
LLVM, etc, high-level APIs for other languages than C/C++ are not being 
developed as part of those projects. I also realized the Java community needed 
something like Anaconda...

> 2. Can you also do a benchmark of MXNet's API performance and possibly share 
> the reproducible code? We did test the performance of JavaCPP vs JNA vs JNI 
> and didn't see much difference (under 10%).
> 
> * MXImperativeInvokeEx
> * CachedOpForward
> 
> The above two methods are the most frequently used methods for a minimal 
> inference request; please try these two to see how the performance goes.
> 

If you're doing only batch operations, as would be the case for Python 
bindings, you're not going to see much difference, no. What you need to look at 
are things like the Indexer package, which allows us to implement fast custom 
operations in Java like this: http://bytedeco.org/news/2014/12/23/third-release/
You're not going to be able to do that with JNA or JNI without essentially 
recoding that kind of thing.
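To illustrate the point, here is a sketch of the kind of per-element custom operation meant above; it uses plain `java.nio` direct buffers as a stand-in rather than JavaCPP's actual Indexer API, but the key property is the same: each element access is a cheap memory read/write, not a native call as it would be with JNA:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

// Stand-in for an Indexer-style custom op: element-wise scaling over
// off-heap memory with no per-element native-call overhead.
public class CustomOpSketch {
    static void scaleInPlace(FloatBuffer data, float factor) {
        for (int i = 0; i < data.limit(); i++) {
            data.put(i, data.get(i) * factor);  // plain memory access
        }
    }

    // Small self-check: fill a direct buffer with 0..3, double it,
    // and return the last element.
    static float demo() {
        FloatBuffer buf = ByteBuffer.allocateDirect(4 * Float.BYTES)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        for (int i = 0; i < 4; i++) buf.put(i, i);
        scaleInPlace(buf, 2.0f);
        return buf.get(3);
    }

    public static void main(String[] args) {
        System.out.println(demo());  // 6.0
    }
}
```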

> 3. We do have some additional technical issues with JavaCPP; is there any 
> plan to fix them? (I will put them into a separate comment since the list is 
> really big.)
> 
> 4. How do you ensure the performance if the build flags are different? For 
> example, MXNet has to be built from source (with necessary modifications to 
> the source code) in order to work with JavaCPP.
> 
> 5. Regarding the dependencies issue, can we go without the additional OpenCV 
> and OpenBLAS in the package?

Yes, those are the kinds of issues that would be best dealt with by using 
JavaCPP only as a low-level tool, instead of the presets, which are basically a 
high-level distribution like Anaconda.

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-663916338

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-07-23 Thread Lanking
## What's inside of javacpp-presets-mxnet

* Native shared libraries:
  * libmxnet.so
  * libjnimxnet.so
  * libmkldnn.0.so
* MXNet Scala and Java classes
* javacpp-presets-mxnet Java API implementations
* JavaCPP-generated native bindings:
  * mxnet C_API
  * mxnet-predict C_API

## What's missing

javacpp-presets-mxnet doesn't expose APIs from nnvm/c_api.h (some of the 
current python/gluon API depends on APIs in nnvm/c_api.h)


## What are the dependencies
```
org.bytedeco.mxnet:ImageClassificationPredict:jar:1.5-SNAPSHOT
+- org.bytedeco:mxnet-platform:jar:1.4.0-1.5-SNAPSHOT:compile
|  +- org.bytedeco:opencv-platform:jar:4.0.1-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:opencv:jar:android-arm:4.0.1-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:opencv:jar:android-arm64:4.0.1-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:opencv:jar:android-x86:4.0.1-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:opencv:jar:android-x86_64:4.0.1-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:opencv:jar:ios-arm64:4.0.1-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:opencv:jar:ios-x86_64:4.0.1-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:opencv:jar:linux-x86:4.0.1-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:opencv:jar:linux-x86_64:4.0.1-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:opencv:jar:linux-armhf:4.0.1-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:opencv:jar:linux-ppc64le:4.0.1-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:opencv:jar:macosx-x86_64:4.0.1-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:opencv:jar:windows-x86:4.0.1-1.5-SNAPSHOT:compile
|  |  \- org.bytedeco:opencv:jar:windows-x86_64:4.0.1-1.5-SNAPSHOT:compile
|  +- org.bytedeco:openblas-platform:jar:0.3.5-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:openblas:jar:android-arm:0.3.5-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:openblas:jar:android-arm64:0.3.5-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:openblas:jar:android-x86:0.3.5-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:openblas:jar:android-x86_64:0.3.5-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:openblas:jar:ios-arm64:0.3.5-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:openblas:jar:ios-x86_64:0.3.5-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:openblas:jar:linux-x86:0.3.5-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:openblas:jar:linux-x86_64:0.3.5-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:openblas:jar:linux-armhf:0.3.5-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:openblas:jar:linux-ppc64le:0.3.5-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:openblas:jar:macosx-x86_64:0.3.5-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:openblas:jar:windows-x86:0.3.5-1.5-SNAPSHOT:compile
|  |  \- org.bytedeco:openblas:jar:windows-x86_64:0.3.5-1.5-SNAPSHOT:compile
|  +- org.bytedeco:mkl-dnn-platform:jar:0.18.1-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:mkl-dnn:jar:linux-x86_64:0.18.1-1.5-SNAPSHOT:compile
|  |  +- org.bytedeco:mkl-dnn:jar:macosx-x86_64:0.18.1-1.5-SNAPSHOT:compile
|  |  \- org.bytedeco:mkl-dnn:jar:windows-x86_64:0.18.1-1.5-SNAPSHOT:compile
|  \- org.bytedeco:mxnet:jar:1.4.0-1.5-SNAPSHOT:compile
\- org.bytedeco:mxnet:jar:macosx-x86_64:1.4.0-1.5-SNAPSHOT:compile
   +- org.bytedeco:opencv:jar:4.0.1-1.5-SNAPSHOT:compile
   +- org.bytedeco:openblas:jar:0.3.5-1.5-SNAPSHOT:compile
   +- org.bytedeco:mkl-dnn:jar:0.18.1-1.5-SNAPSHOT:compile
   +- org.bytedeco:javacpp:jar:1.5-SNAPSHOT:compile
   +- org.slf4j:slf4j-simple:jar:1.7.25:compile
   |  \- org.slf4j:slf4j-api:jar:1.7.25:compile
   \- org.scala-lang:scala-library:jar:2.11.12:compile
```


## Build the project from source

It took me 40 minutes to build the project on my Mac, and I had to apply some 
hacks to get it to build.

* It downloads the MXNet source code and applies some hacks to it.
* It uses its own set of compiler flags to build libmxnet.so.
* It also builds the MXNet Scala project.

## Classes

See javadoc: http://bytedeco.org/javacpp-presets/mxnet/apidocs/


1. The Java class name is “mxnet”, which does not follow Java naming 
conventions.
2. Each pointer has a corresponding Java class, which is arguable. It's 
necessary to expose them as strongly typed classes if they are meant to be used 
directly by end developers. But they really should only be internal 
implementation details of the API. It's overkill to expose them as a type 
instead of just a pointer.
3. All the classes (except mxnet.java) are hand-written.
4. API mappings are hand-coded as well.



## Performance

JavaCPP native library loading takes a long time: it takes an average of *2.6 
seconds* to initialize libmxnet.so with JavaCPP.

```
Loader.load(org.bytedeco.mxnet.global.mxnet.class);
```



## Issues

The open-source code on GitHub doesn't match the binary release on Maven 
Central:

* the Maven group and the Java package name are different.
* the C predict API is not included in the Maven version.
* the example code doesn't work with Maven artifacts; it can only be built 
with a snapshot version locally.





-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-663138354

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-07-23 Thread Lanking
@saudet Thanks for your proposal. I have three questions I would like to ask 
you:

1. If we adopt the JavaCPP package, how will it be consumed? Under bytedeco or 
Apache MXNet? Essentially, from our previous discussion, we really don't want 
another 3rdparty check-in.

2. Can you also do a benchmark of MXNet's API performance and possibly share 
the reproducible code? We did test the performance of JavaCPP vs JNA vs JNI 
and didn't see much difference (under 10%).

3. We do have some additional technical issues with JavaCPP; is there any plan 
to fix them? (I will put them into a separate comment since the list is really 
big.)

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-663137329

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-07-23 Thread Carin Meier
@saudet @szha - I think this would be a good path forward (from the Clojure 
perspective).

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-663085890

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-07-23 Thread Sheng Zha
@saudet this looks awesome! An 18% improvement in throughput is quite 
significant for switching the way of integration for a frontend binding. I 
think we should definitely start with this offering. @lanking520 @gigasquid 
what do you think?

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-663064169

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-07-23 Thread Samuel Audet
Hi, instead of JNA, I would be happy to provide bindings for the C API and 
maintain packages based on the JavaCPP Presets here:
https://github.com/bytedeco/javacpp-presets/tree/master/mxnet
JavaCPP adds no overhead, unlike JNA, and is often faster than manually written 
JNI. Plus JavaCPP provides more tools than JNA to automate the process of 
parsing header files as well as packaging native libraries in JAR files. I have 
been maintaining modules for TensorFlow based on JavaCPP, and we actually got a 
boost in performance when compared to the original JNI code:
https://github.com/tensorflow/java/pull/18#issuecomment-579600568
I would be able to do the same for MXNet and maintain the result in a 
repository of your choice. Let me know if this sounds interesting! BTW, the 
developers of DJL also seem open to switching from JNA to JavaCPP, even though 
it is not a huge priority. Still, standardizing how native bindings are created 
and loaded with other libraries for which JavaCPP is pretty much already the 
standard (such as OpenCV, TensorFlow, CUDA, FFmpeg, LLVM, Tesseract) could go a 
long way in alleviating concerns about stability.

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-662994965

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-04-28 Thread Sheng Zha
My understanding is that DJL depends on MXNet, so if you want to bring JNA from 
DJL into MXNet, it will create a circular dependency as a 3rdparty module. In 
terms of stability, I was referring to the development of the code base rather 
than the performance.

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-620815186

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-04-28 Thread Lanking
There is no hand-written code for JNA; everything is generated. This ensures a 
standard, minimal layer over C and avoids errors and mistakes.

About JNA, you can find more information here: 
[jnarator](https://github.com/awslabs/djl/tree/master/mxnet/jnarator). We built 
an entire project for the JNA generation pipeline. All we need is a header file 
from MXNet to build everything. The dependencies required by the Gradle build 
are minimal, as you can see 
[here](https://github.com/awslabs/djl/blob/master/mxnet/jnarator/build.gradle#L5-L15).
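As a toy illustration of the idea (this is not the actual jnarator code; the parsing and type mapping here are deliberately simplified), a generator of this kind reads C declarations from a header and emits Java-side signatures, mapping opaque C handle types to a generic pointer:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Toy binding generator: parse a simple C declaration and emit a Java-side
// signature, mapping every parameter type to a generic "Pointer" stand-in.
public class ToyBindingGenerator {
    static final Pattern DECL =
            Pattern.compile("(\\w+)\\s+(\\w+)\\s*\\(([^)]*)\\);");

    static String toJavaSignature(String cDecl) {
        Matcher m = DECL.matcher(cDecl.trim());
        if (!m.matches()) throw new IllegalArgumentException(cDecl);
        StringBuilder params = new StringBuilder();
        for (String p : m.group(3).split(",")) {
            String[] parts = p.trim().split("\\s+");
            if (parts.length < 2) continue;  // skip empty parameter lists
            if (params.length() > 0) params.append(", ");
            params.append("Pointer ").append(parts[parts.length - 1]);
        }
        return m.group(1) + " " + m.group(2) + "(" + params + ");";
    }

    public static void main(String[] args) {
        System.out.println(toJavaSignature("int MXNDArrayFree(NDArrayHandle handle);"));
        // int MXNDArrayFree(Pointer handle);
    }
}
```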
  

To address the concern about stability, we tested DJL MXNet with a 100-hour 
inference run on a server, and it remained stable. The training experience is 
also smooth; a 48-hour multi-GPU run was stable as well. The performance is 
very close to Python's with large models, and it may bring a huge boost if the 
model is at or below "squeezenet level".

@frankfliu can bring more information about the JNA layer.

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-620811482

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-04-28 Thread Sheng Zha
@lanking520 would it create a circular dependency? And how stable is the JNA 
layer; what changes are expected? It would be great if you could share a 
pointer to the JNA code to help clarify these concerns.

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-620803313

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-04-28 Thread Lanking
@szha For option 4, I would recommend consuming the JNA layer as a submodule 
from DJL. I am not sure whether this recommendation counts as "adding a 
dependency in mxnet".

There are two key reasons supporting this:

1. DJL moves really fast, and we can quickly change the JNA layer whenever 
needed, compared to the merging speed in MXNet.

2. Consuming it as a submodule means the MXNet community doesn't have to take 
on much of the maintenance. The DJL team will regularly provide a JAR for MXNet 
users to consume.

We can also contribute the code back to the MXNet repo, since it is open 
source. But we may still keep a copy in our repo for fast iteration, which may 
cause diverged versions of the JNA layer.

Overall, my recommendation on option 4 leads toward consuming the DJL JNA layer 
as a submodule.

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-620750376

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-04-23 Thread Leonard Lausen
Another data point is that we currently only support OpenJDK 8, but the JVM 
languages are broken with OpenJDK 11, which is used on Ubuntu 18.04, for 
example. See https://github.com/apache/incubator-mxnet/issues/18153

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-618745562

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-04-12 Thread Leonard Lausen
Another data point is that all of our Scala tests fail randomly with 
`src/c_api/c_api_profile.cc:141: Check failed: 
!thread_profiling_data.calls_.empty():`, so there seem to be some underlying 
issues.

https://github.com/apache/incubator-mxnet/issues/17067

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-612726875

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-03-16 Thread Sheng Zha
+1 for option 1 and 2

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-599800518

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-03-09 Thread Carin Meier
For the Clojure package, it is a lot easier to interop with Java than with 
Scala, so if the base that everything is using is Java, it will be better for 
Clojure.

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-596779618

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-03-09 Thread Lanking
Reopened #17783.

-- 
https://github.com/apache/incubator-mxnet/issues/17783#event-3112408479

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-03-09 Thread Lanking
Closed #17783.

-- 
https://github.com/apache/incubator-mxnet/issues/17783#event-3112408330

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-03-09 Thread Lanking
> It is going to be closer to a complete rewrite. On the other hand, making a 
> new Scala API would be imperative instead of symbolic and I think there are 
> going to be a lot of operator changes to better match numpy in 2.0. I don't 
> think the migration costs for a Scala 2.0 would be that much less anyway
> 
> For users who don't want a full rewrite, they can continue using an old 
> release or whatever new releases we make on the v1.x branch.

I would be cautious about calling it a complete rewrite, especially for 
inference use cases. They can still use similar 

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-596779343

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-03-09 Thread Zach Kimberg
It is going to be closer to a complete rewrite. On the other hand, making a new 
Scala API would be imperative instead of symbolic and I think there are going 
to be a lot of operator changes to better match numpy in 2.0. I don't think the 
migration costs for a Scala 2.0 would be that much less anyway

For users who don't want a full rewrite, they can continue using an old release 
or whatever new releases we make on the v1.x branch.

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-596777456

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-03-09 Thread Carin Meier
@lanking520 thanks for the clarification above. A further question: how do you 
envision a current Scala MXNet user migrating their code? Is it going to be 
mostly reusable, or is it going to be a complete rewrite for them?

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-596714532

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-03-06 Thread Yuan Tang
I propose options 1 and 2, since it took us a lot of effort to bring MXNet to 
Scala originally, and there are already adopters of the Scala API in industry 
(some may not have been disclosed yet). But I am open to other options. I am 
not familiar with DJL, though I assume @frankfliu and @lanking520 are the 
masters behind DJL.

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-595945011

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-03-06 Thread Lanking
I would propose options 3 and 4. I will add more comments here to explain why.

-- 
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-595923396