andrewfayres commented on issue #12772: [MXNET-984] Add Java NDArray and 
introduce Java Operator Builder class
URL: https://github.com/apache/incubator-mxnet/pull/12772#issuecomment-429208229
 
 
   A few comments: what this is doing is essentially Spark's approach (see their 
[JavaDoubleRDD](https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/api/java/JavaDoubleRDD.scala)).
 They have a JavaDoubleRDD class whose methods simply call into their Scala RDD 
and handle the conversions internally. You can also see this in their Java 
[examples](https://github.com/apache/spark/tree/master/examples/src/main/java/org/apache/spark/examples):
 in all of the examples I've looked at, the Java code never uses any part of the 
core library outside of the api.java package.
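   To make the pattern concrete, here is a minimal sketch in Java of the wrapper idea being described. All names here (`CoreVector`, `JavaVector`) are hypothetical stand-ins, not actual Spark or MXNet classes: the "core" class plays the role of the Scala RDD, and the wrapper plays the role of JavaDoubleRDD, delegating every call and converting results into idiomatic Java types so callers never touch the core class directly.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical "core" class, standing in for a Scala-style API that
// works in its own primitive-specialized types.
class CoreVector {
    private final double[] data;
    CoreVector(double[] data) { this.data = data; }
    double[] raw() { return data; }
    CoreVector map(java.util.function.DoubleUnaryOperator f) {
        double[] out = new double[data.length];
        for (int i = 0; i < data.length; i++) out[i] = f.applyAsDouble(data[i]);
        return new CoreVector(out);
    }
}

// Java-friendly wrapper: delegates to the core class and converts the
// results into idiomatic Java types (a boxed List<Double> here), so
// Java callers never see CoreVector at all.
class JavaVector {
    private final CoreVector core;
    public JavaVector(List<Double> values) {
        double[] arr = new double[values.size()];
        for (int i = 0; i < arr.length; i++) arr[i] = values.get(i);
        this.core = new CoreVector(arr);
    }
    private JavaVector(CoreVector core) { this.core = core; }
    public JavaVector times(double k) {   // forwards to CoreVector.map
        return new JavaVector(core.map(x -> x * k));
    }
    public List<Double> toList() {        // conversion handled internally
        return Arrays.stream(core.raw()).boxed().collect(Collectors.toList());
    }
}
```

The point is that the conversion boilerplate lives in one place, inside the wrapper, rather than being repeated at every call site in user code.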
   
   If we don't provide a Java-friendly wrapper for every class we want to 
expose to Java developers, it will lead to a rather confusing experience. 
They'll have to learn which parts of the Scala package are and aren't meant 
for them, while also knowing when to use the javaapi package instead.
   
   Take the IO wrapper, for example: all the Java wrapper exposes is a 
Java-friendly version of DataDesc. If we don't wrap it and instead expect 
Java developers to use the Scala one, they'll also be exposed to all the 
other classes in IO that aren't intended for them (i.e., aren't Java 
friendly). And any methods we add to the Scala classes to make them Java 
friendly will pollute the Scala API with stuff that Scala developers don't 
need.
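   As a sketch of why the wrapper avoids that pollution, consider a descriptor class like DataDesc. The class names and default values below (`CoreDataDesc`, `JavaDataDesc`, the `float32`/`NCHW` defaults) are assumptions for illustration, not MXNet's actual API: the core class stays strict and explicit, while the Java-side conveniences (varargs, default arguments) live only in the wrapper.

```java
// Hypothetical core descriptor, standing in for the Scala DataDesc:
// it requires every field to be supplied explicitly, which suits Scala
// (where default arguments already exist) but is awkward from Java.
class CoreDataDesc {
    final String name;
    final int[] shape;
    final String dtype;
    final String layout;
    CoreDataDesc(String name, int[] shape, String dtype, String layout) {
        this.name = name; this.shape = shape;
        this.dtype = dtype; this.layout = layout;
    }
}

// Java-friendly wrapper: supplies varargs and defaults so Java callers
// get a convenient constructor, without adding Java-only overloads to
// the core Scala-facing class.
class JavaDataDesc {
    private final CoreDataDesc core;
    public JavaDataDesc(String name, int... shape) {
        // "float32" / "NCHW" are assumed defaults for this sketch.
        this.core = new CoreDataDesc(name, shape, "float32", "NCHW");
    }
    public String describe() {
        return core.name + " " + java.util.Arrays.toString(core.shape)
             + " " + core.dtype + " " + core.layout;
    }
}
```

Scala callers keep using the strict core class; Java callers get the ergonomic surface, and neither API leaks into the other.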
