Re: disposing all ndarray in a given context

2017-10-18 Thread Joern Kottmann
Have a look at this code: https://github.com/apache/incubator-mxnet/blob/master/scala-package/core/src/main/scala/ml/dmlc/mxnet/optimizer/AdaDelta.scala There they have the same problem and use disposeDepsExcept to release resources. Jörn On Tue, Oct 17, 2017 at 4:18 PM, TongKe Xue
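As a minimal sketch of the pattern referenced above (assuming the ml.dmlc.mxnet Scala NDArray API; applyUpdate, lr and the arithmetic are illustrative, not the actual AdaDelta update):

```scala
import ml.dmlc.mxnet.NDArray

// Hypothetical update step: every NDArray used to construct `updated`
// is disposed afterwards, except the arrays we explicitly keep alive.
def applyUpdate(weight: NDArray, grad: NDArray, lr: Float): NDArray = {
  val scaledGrad = grad * lr         // intermediate NDArray
  val updated = weight - scaledGrad  // the result we want to keep
  // Dispose the dependencies of `updated` (here: scaledGrad),
  // while keeping the inputs grad and weight alive.
  updated.disposeDepsExcept(grad, weight)
  updated
}
```

The caller still owns `updated` and is responsible for disposing it once it is no longer needed.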

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Tianqi Chen
> - “More hardware backends to mxnet” – MXNet users get the same benefit of HW support implementing ONNX import on top of MXNet symbolic, right? Support for nnvm compiler compilation comes directly from going into nnvm/top. This includes supporting interesting operators that ONNX does not yet

Re: mxnet Scala Convolution

2017-10-18 Thread Rahul Huilgol
Hi TongKe, These are operators defined in the C++ backend under src/operator. For example, Convolution is implemented here: https://github.com/apache/incubator-mxnet/blob/master/src/operator/convolution.cc . The operators are registered using nnvm, which helps automatically generate the frontend functions.
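For context, here is a minimal sketch of what calling such a generated frontend looks like from Scala, in the style of the AlexNet example (the names and hyperparameters below are illustrative assumptions, not the exact sample code):

```scala
import ml.dmlc.mxnet.Symbol

object ConvolutionSketch {
  // Build a tiny graph using the generated frontend for the C++ Convolution
  // operator: the method takes a name, then positional Symbol args, then
  // keyword arguments in a trailing Map.
  def build(): Symbol = {
    val data = Symbol.Variable("data")
    val conv1 = Symbol.Convolution("conv1")()(Map(
      "data" -> data, "kernel" -> "(3, 3)", "num_filter" -> 32))
    Symbol.Activation("relu1")()(Map("data" -> conv1, "act_type" -> "relu"))
  }
}
```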

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Mu Li
Hi Hagay, As mentioned in my previous thread, "MXNet has lots of users already using the Symbolic API, which hopefully means that it is a mature API that is not likely to have breaking changes or major issues" actually indicates that NNVM is stable, because MXNet uses NNVM's symbolic.h directly, see

[BUILD FAILED] Branch master build 545

2017-10-18 Thread Apache Jenkins Server
Build for MXNet branch master has broken. Please view the build at https://builds.apache.org/job/incubator-mxnet/job/master/545/

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Chris Olivier
My $0.02: NNVM is not currently an Apache module. It’s under the dmlc umbrella, whose direction and governance are unclear. For this reason, I am inclined to support the new effort being placed in Apache MXNet. -Chris On Wed, Oct 18, 2017 at 5:19 PM Tianqi Chen wrote: > > >

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Tianqi Chen
We plan to incubate nnvm and make it Apache eventually. NNVM, as it is now, has adopted the Apache model, as MXNet did originally. My suggestion is mainly about evolving Apache MXNet to become healthier and cleaner in the longer term, with fewer lines of code while supporting more

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Mu Li
I don't get the point. MXNet relies on NNVM. In fact, the Symbol object in MXNet is defined on NNVM. On Wed, Oct 18, 2017 at 6:09 PM, Chris Olivier wrote: > My $0.02: > > NNVM is not currently an Apache module. It’s under dmlc umbrella, whose > direction and governance

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Tianqi Chen
Hi Chris: There is no intention to move things away from mxnet. The reduction in lines of code comes from having a better design in general; usually, you write less redundant code by benefiting from a better design. As I may quote: "the best design is achieved not when there is nothing to add, but

Re: mxnet Scala Convolution

2017-10-18 Thread TongKe Xue
Hi Rahul, Thanks for explaining the high level design + pointing to the implementation details. Besides reading the C++ code and mentally translating the Scala calls, is there a way to get a list of all generated Scala functions? I have looked at: 1.

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Tianqi Chen
To better answer Hagay's question, I would like to dive a bit deeper into the relation between MXNet, NNVM and model exchange formats like ONNX. There are two major trends in deep learning systems now: - Common serializable formats, like ONNX and CoreML, that define the model exchange format.

Re: mxnet Scala Convolution

2017-10-18 Thread YiZhi Liu
Hi TongKe, The symbols you are looking for are auto-generated by Scala macros. Please refer to scala-package/macros. 2017-10-19 0:40 GMT+00:00 TongKe Xue : > Hi Rahul, > > Thanks for explaining the high level design + pointing to the > implementation details. > > Besides reading
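One way to list the generated Scala functions without reading the C++ code is plain JVM reflection on the Symbol companion object. A sketch, assuming the mxnet Scala jar is on the classpath (ListSymbolOps is just an illustrative name):

```scala
// Enumerate the methods available on ml.dmlc.mxnet.Symbol's companion
// object at runtime; the macro-generated operator wrappers show up here
// alongside the hand-written members.
object ListSymbolOps {
  def main(args: Array[String]): Unit = {
    val companion = Class.forName("ml.dmlc.mxnet.Symbol$")
    companion.getDeclaredMethods
      .map(_.getName)
      .distinct
      .sorted
      .foreach(println)
  }
}
```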

Re: mxnet Scala Convolution

2017-10-18 Thread TongKe Xue
My earlier question was a bit messy. To rephrase my question: 1. Scala AlexNet sample code calls Symbol.Convolution: https://github.com/apache/incubator-mxnet/blob/master/scala-package/examples/src/main/scala/ml/dmlc/mxnetexamples/visualization/AlexNet.scala#L30 2. Symbol.scala does not

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Lupesko, Hagay
Roshani – this is an exciting initiative; ONNX support on MXNet will enable more users to ramp up on MXNet, which is great. Tianqi – a few questions and thoughts about your note: - “More hardware backends to mxnet” – MXNet users get the same benefit of HW support implementing ONNX import on top

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Tianqi Chen
I think the point here is that the API stays the same, and the discussion is only about how we should implement it. Tianqi On Wed, Oct 18, 2017 at 6:43 PM, Dom Divakaruni < dominic.divakar...@gmail.com> wrote: > I imagine users would want to interact with MXNet as they normally do to > consume or

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Dom Divakaruni
I imagine users would want to interact with MXNet as they normally do to consume or export an ONNX format. How would that work with NNVM? Not sure users care about the implementation, as long as it doesn’t add another layer of complexity to the workflow. Regards, Dom > On Oct 18, 2017, at

[BUILD FAILED] Branch master build 543

2017-10-18 Thread Apache Jenkins Server
Build for MXNet branch master has broken. Please view the build at https://builds.apache.org/job/incubator-mxnet/job/master/543/

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Tianqi Chen
I am strongly recommending going through nnvm/top. One major reason is that support for the nnvm/top layer does NOT ONLY mean compatibility of the model format with ONNX. These are the major benefits: - More hardware backends for mxnet, including OpenCL, Metal, Raspberry Pi, and the web browser. These

Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Roshani Nagmote
Hi guys, I am working on supporting ONNX pre-trained models in Apache MXNet and would like to seek your opinion on the choice of implementation. I have also created a GitHub issue. Supporting ONNX in MXNet

mxnet Scala Convolution

2017-10-18 Thread TongKe Xue
Hi, I am reading: https://mxnet.incubator.apache.org/api/scala/symbol.html and I see Symbol.Variable and Symbol.Convolution. When I look at Symbol.scala, I see Symbol.Variable at: https://github.com/apache/incubator-mxnet/blob/master/scala-package/core/src/main/scala/ml/dmlc/mxnet/Symbol.scala#L982