Re: disposing all ndarray in a given context

2017-10-18 Thread Joern Kottmann
Have a look at this code: https://github.com/apache/incubator-mxnet/blob/master/scala-package/core/src/main/scala/ml/dmlc/mxnet/optimizer/AdaDelta.scala There they have the same problem and use disposeDepsExcept to release resources. Jörn On Tue, Oct 17, 2017 at 4:18 PM, TongKe Xue wrote: > Fol
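The disposeDepsExcept pattern Jörn points at can be sketched in plain Python. This is a minimal illustration under assumed names, not the MXNet API: the TrackedArray class and dispose_deps_except method are hypothetical stand-ins for NDArray's native-memory bookkeeping.

```python
# Sketch of "dispose every dependency except the ones to keep", in the
# spirit of AdaDelta.scala's disposeDepsExcept. All names are illustrative.

class TrackedArray:
    """Toy array wrapper that records the arrays it was derived from."""
    def __init__(self, data, deps=()):
        self.data = data
        self.deps = list(deps)   # arrays this result depends on
        self.disposed = False

    def dispose(self):
        self.disposed = True     # would release native memory in real MXNet

    def dispose_deps_except(self, *keep):
        """Dispose every transitive dependency except the ones to keep."""
        keep_ids = {id(k) for k in keep}
        stack, seen = list(self.deps), set()
        while stack:
            dep = stack.pop()
            if id(dep) in seen:
                continue
            seen.add(id(dep))
            stack.extend(dep.deps)
            if id(dep) not in keep_ids:
                dep.dispose()

# Usage: the intermediate b is freed, the input a survives.
a = TrackedArray([1.0, 2.0])
b = TrackedArray([2.0, 4.0], deps=[a])
c = TrackedArray([4.0, 8.0], deps=[b])
c.dispose_deps_except(a)
```

The point of the pattern is that an optimizer update can free every temporary produced along the way while keeping the weights and state it still needs.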

[BUILD FAILED] Branch master build 543

2017-10-18 Thread Apache Jenkins Server
The build for MXNet branch master has failed. Please view the build at https://builds.apache.org/job/incubator-mxnet/job/master/543/

Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Roshani Nagmote
Hi guys, I am working on supporting ONNX pre-trained models in Apache MXNet and would like to seek your opinion on the choice of implementation. I have also created a GitHub issue. Supporting ONNX in MXNet will

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Dominic Divakaruni
Very happy you are doing this, Roshani! On Wed, Oct 18, 2017 at 1:41 PM, Roshani Nagmote wrote: > Hi guys, > > > I am working on supporting ONNX pre-trained > models in Apache MXNet and would like to seek your opinion on the choice of > implementation. I also have c

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Tianqi Chen
I strongly recommend going through nnvm/top. One major reason here is that support for the nnvm/top layer does NOT ONLY mean compatibility of the model format with onnx. These are the major benefits: - More hardware backends for mxnet, including opencl, metal, Raspberry Pi, web browser. These t

mxnet Scala Convolution

2017-10-18 Thread TongKe Xue
Hi, I am reading: https://mxnet.incubator.apache.org/api/scala/symbol.html I see Symbol.Variable, Symbol.Convolution When I look at Symbol.scala, I see Symbol.Variable at: https://github.com/apache/incubator-mxnet/blob/master/scala-package/core/src/main/scala/ml/dmlc/mxnet/Symbol.scala#L982 How

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Lupesko, Hagay
Roshani – this is an exciting initiative; ONNX support on MXNet will enable more users to ramp up on MXNet, which is great. Tianqi – a few questions and thoughts about your note: - “More hardware backends to mxnet” – MXNet users get the same benefit of HW support implementing ONNX import on top

Re: mxnet Scala Convolution

2017-10-18 Thread TongKe Xue
My earlier question was a bit messy. To rephrase my question: 1. Scala AlexNet sample code calls Symbol.Convolution: https://github.com/apache/incubator-mxnet/blob/master/scala-package/examples/src/main/scala/ml/dmlc/mxnetexamples/visualization/AlexNet.scala#L30 2. Symbol.scala does not contain

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Tianqi Chen
> > - “More hardware backends to mxnet” – MXNet users get the same benefit of > HW support implementing ONNX import on top of MXNet symbolic, right? > The support of nnvm compiler compilation comes directly from going into nnvm/top. This includes supporting interesting operators onnx does not yet sup

Re: mxnet Scala Convolution

2017-10-18 Thread Rahul Huilgol
Hi TongKe, These are operators defined in the c++ backend under src/operator. For example convolution is here https://github.com/apache/incubator-mxnet/blob/master/src/operator/convolution.cc . The operators are registered using nnvm, which helps automatically generate the frontend functions. Thi
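Rahul's point, that operators registered in the C++ backend drive auto-generated frontend functions, can be sketched in plain Python. The OP_REGISTRY table and Symbol class below are illustrative stand-ins, not MXNet's actual registration machinery (which lives in src/operator and the Scala macros package):

```python
# Sketch: generate frontend functions from a backend operator registry.
# Everything here is a toy; real MXNet registers C++ kernels via nnvm.

OP_REGISTRY = {
    # operator name -> callable standing in for the backend kernel
    "Convolution": lambda x, kernel: f"conv({x}, kernel={kernel})",
    "Activation":  lambda x, act_type: f"{act_type}({x})",
}

class Symbol:
    """Toy frontend namespace whose methods come from the registry."""

def _make_frontend(name, kernel):
    def frontend(*args, **kwargs):
        return kernel(*args, **kwargs)   # dispatch to the "backend"
    frontend.__name__ = name
    return frontend

# One loop produces every frontend function; nothing is hand-written.
for _name, _kernel in OP_REGISTRY.items():
    setattr(Symbol, _name, staticmethod(_make_frontend(_name, _kernel)))

print(Symbol.Convolution("data", kernel=(3, 3)))
```

This mirrors why Symbol.Convolution does not appear in Symbol.scala as source text: the functions are emitted from the registry at build time rather than written by hand.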

Re: mxnet Scala Convolution

2017-10-18 Thread TongKe Xue
Hi Rahul, Thanks for explaining the high level design + pointing to the implementation details. Besides reading the C++ code and mentally translating the Scala calls, is there a way to get a list of all generated Scala functions? I have looked at: 1. https://mxnet.incubator.apache.org/api

[BUILD FAILED] Branch master build 545

2017-10-18 Thread Apache Jenkins Server
The build for MXNet branch master has failed. Please view the build at https://builds.apache.org/job/incubator-mxnet/job/master/545/

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Chris Olivier
My $0.02: NNVM is not currently an Apache module. It’s under the dmlc umbrella, whose direction and governance are unclear. For this reason, I am inclined to support the new effort being placed in Apache MXNet. -Chris On Wed, Oct 18, 2017 at 5:19 PM Tianqi Chen wrote: > > > > - “More hardware backends

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Mu Li
I don't get the point. MXNet relies on NNVM. In fact, the Symbol object in MXNet is defined on NNVM. On Wed, Oct 18, 2017 at 6:09 PM, Chris Olivier wrote: > My $0.02: > > NNVM is not currently an Apache module. It’s under dmlc umbrella, whose > direction and governance is unclear. For this rea

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Tianqi Chen
We plan to incubate nnvm and make it Apache eventually. NNVM as it is now has adopted the Apache model, as MXNet did originally. My suggestion is mainly about evolving Apache MXNet to become healthier and cleaner in the longer term, with fewer lines of code while supporting more features,

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Mu Li
Hi Hagay, As mentioned in my previous thread, "MXNet has lots of users already using the Symbolic API, which hopefully means that it is a mature API that is not likely to have breaking changes or major issues" actually indicates that NNVM is stable, because MXNet uses NNVM's symbolic.h directly, see https:/

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Dom Divakaruni
I imagine users would want to interact with MXNet as they normally do to consume or export an ONNX format. How would that work with NNVM? Not sure users care about the implementation, as long as it doesn’t add another layer of complexity to the workflow. Regards, Dom > On Oct 18, 2017, at 6:

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Tianqi Chen
I think the point here is that API stays the same, and the discussion is only about how we should implement it. Tianqi On Wed, Oct 18, 2017 at 6:43 PM, Dom Divakaruni < dominic.divakar...@gmail.com> wrote: > I imagine users would want to interact with MXNet as they normally do to > consume or ex

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Tianqi Chen
To better answer Hagay's question, I would like to dive a bit deeper into the relation between MXNet, NNVM, and model exchange formats like ONNX. There are two major trends in deep learning systems now: - Common serializable formats, like ONNX and CoreML, that define the model exchange format.
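As a rough illustration of the first trend, a serializable model exchange format reduces to ops, attributes, and edges in a framework-neutral container. The JSON layout below is purely illustrative; real ONNX defines a protobuf schema, not this structure.

```python
# Toy "exchange format": a graph of named nodes with ops, attributes,
# and input edges, serialized to a framework-neutral string.
import json

graph = {
    "nodes": [
        {"name": "data",  "op": "null",        "inputs": []},
        {"name": "conv1", "op": "Convolution", "inputs": ["data"],
         "attrs": {"kernel": [3, 3], "num_filter": 64}},
        {"name": "relu1", "op": "Activation",  "inputs": ["conv1"],
         "attrs": {"act_type": "relu"}},
    ],
    "outputs": ["relu1"],
}

serialized = json.dumps(graph)       # what an exporter writes to disk
restored = json.loads(serialized)    # what an importing framework reads
print(restored["nodes"][1]["op"])
```

The value of such a format is exactly what the thread says: any framework that can read and write this container can exchange models with any other, independent of how either one executes the graph.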

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Chris Olivier
Reduce the code base of mxnet? By increasing the scope of the dmlc modules? Is the intent to make mxnet a thin language wrapper around a group of dmlc modules? On Wed, Oct 18, 2017 at 6:58 PM Tianqi Chen wrote: > To better answer Hagay's question, I would like to dive down a bit deeper > on the relatio

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Tianqi Chen
Hi Chris: There is no intention to move things away from mxnet. The reduction in lines of code comes from having a better design in general; usually you write less redundant code by benefiting from a better design. As I may quote: "the best design is achieved not when there is nothing to add, but w

Re: mxnet Scala Convolution

2017-10-18 Thread YiZhi Liu
Hi TongKe, The symbols you are looking for are auto-generated by Scala macros. Please refer to scala-package/macros. 2017-10-19 0:40 GMT+00:00 TongKe Xue : > Hi Rahul, > > Thanks for explaining the high level design + pointing to the > implementation details. > > Besides reading the C++ code and

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Lupesko, Hagay
Tianqi: Thanks for detailing the trends. I fully agree that ONNX is just a graph serialization format – nothing more, nothing less. I also think we all agree that this simple mechanism holds lots of value to DL users since it allows them to move between frameworks easily (e.g. train with MXNet,

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Tianqi Chen
OK, I guess there is some miscommunication here. We only need to do a "canonization" step in the Python API that goes through a symbol-to-symbol translation layer. It can be done purely in Python, and there is no need to go "down" into C++ to do this. For example, the current nnvm.from_mxnet API ta
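The symbol-to-symbol "canonization" Tianqi describes can be sketched as a pure-Python graph rewrite. The Node class and OP_TABLE mapping below are hypothetical stand-ins; the real nnvm.from_mxnet translates far more than op names, but the shape of the pass is the same.

```python
# Sketch: rewrite a symbolic graph from one op vocabulary into another,
# entirely in Python, without touching the C++ backend. Names are toys.

OP_TABLE = {                    # source op name -> target-style name
    "Convolution": "conv2d",
    "FullyConnected": "dense",
}

class Node:
    def __init__(self, op, inputs=()):
        self.op = op
        self.inputs = list(inputs)

def canonize(node, memo=None):
    """Recursively rebuild the DAG with translated op names."""
    memo = {} if memo is None else memo
    if id(node) in memo:        # preserve sharing between branches
        return memo[id(node)]
    new = Node(OP_TABLE.get(node.op, node.op),
               [canonize(i, memo) for i in node.inputs])
    memo[id(node)] = new
    return new

data = Node("null")
net = Node("FullyConnected", [Node("Convolution", [data])])
out = canonize(net)
print(out.op, out.inputs[0].op)   # dense conv2d
```

Because the pass only produces a new symbol graph from an old one, it composes with any frontend: the user-facing API is untouched, which is the point being made in the thread.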

Re: Request for suggestions- Supporting onnx in mxnet

2017-10-18 Thread Tianqi Chen
I want to offer one last thing in terms of technical details. I mentioned two trends in deep learning systems; there is one last thing that was omitted: how should we build a good deployment end for deep learning models? There is always a paradox to this problem: - On one hand, the deployment end