mxnet slack channel

2017-11-18 Thread TongKe Xue
Requesting an invitation to the Slack channel.

Thanks,
--TongKe


Re: Join slack channel

2017-11-07 Thread TongKe Xue
Hi,

  I would also like to join the Slack channel. I'm trying to use mxnet
in production. (Its Java API seems the most production-ready among DL
libraries.)

--TongKe

On Tue, Nov 7, 2017 at 1:11 AM, leonardo espinosa wrote:
> Hi there,
>
> I'm using MXNet for teaching and research, and I'd love to join
> the Slack channel to ask some questions.
>
> Thanks,
>
> ---
> Dr. Leonardo Espinosa
> aboutMe: http://www.espinosaleal.me


Re: mxnet ndarray inference in js

2017-11-06 Thread TongKe Xue
Hi Hagay,

  Good point. The high level problem is:

  I want to run mxnet training on the GPU and inference on the CPU --
in browsers / JavaScript in particular.

  On the training side, I'm working mostly with NDArray, doing my own
gradient calculations and optimization. I would like a client-side
library to which I can port my mxnet NDArray graph and start running
inference.

  I hope this motivates the issue.

--TongKe

On Mon, Nov 6, 2017 at 10:58 AM, Lupesko, Hagay <lupe...@gmail.com> wrote:
> TongKe,
>
> What’s the use case you are after?
> Answering this question may help us help you.
>
> Hagay
>
> On 11/2/17, 12:10, "TongKe Xue" <tk...@tkxue.org> wrote:
>
> Hi,
>
>   I'm looking for a js library compatible with mxnet/ndarray.
>
> 1. I am aware of https://github.com/dmlc/mxnet.js/
> However:
> a. that appears to be all of mxnet, not ndarray
> b. that appears to only support Python models, whereas I'm using 
> Java/Scala
>
> 2. I am aware of https://github.com/scijs/ndarray
> However:
> this appears to be an independent library; I would have to create a
> unifying API over both
>
>   Back to my original question -- is there some JS API that is
> directly compatible with mxnet's ndarray interface?
>
> Thanks,
> --TongKe


implementing numpy.cumsum in mxnet scala?

2017-11-04 Thread TongKe Xue
Hi,

** 1. Here is the documentation of numpy.cumsum:

https://docs.scipy.org/doc/numpy-1.13.0/reference/generated/numpy.cumsum.html

** 2. The key example is as follows:

>>> a = np.array([[1, 2, 3], [4, 5, 6]])
>>> np.cumsum(a, axis=0)  # sum over rows for each of the 3 columns
array([[1, 2, 3],
       [5, 7, 9]])
>>> np.cumsum(a, axis=1)  # sum over columns for each of the 2 rows
array([[ 1,  3,  6],
       [ 4,  9, 15]])

** 3. I realize this can be easily implemented via for loops. However,
for NDArrays on the GPU context, I'd prefer to use vectorized ops.

In mxnet, what is the best way to do a cumulative sum along an axis?

The final goal is to compute the integral image of a 2d matrix
( https://en.wikipedia.org/wiki/Summed-area_table )

Thus, solutions that compute the integral image without going through
a cumulative sum are fine too.
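
The most promising vectorized formulation I have so far is a matrix
multiply against a triangular matrix of ones -- a sketch (NDArray.array
is per the Scala docs; the generated NDArray.dot binding is my
assumption about what nnvm produces):

    import ml.dmlc.mxnet._

    // cumsum along axis 0 of an (n x m) matrix `a`, computed as tri * a,
    // where tri is the (n x n) lower-triangular matrix of ones. A single
    // matmul keeps it vectorized on the GPU, at the cost of
    // materializing the O(n^2) tri matrix.
    def cumsumAxis0(a: NDArray): NDArray = {
      val n = a.shape(0)
      val ones = Array.tabulate(n, n)((i, j) => if (j <= i) 1f else 0f).flatten
      val tri = NDArray.array(ones, Shape(n, n), a.context)
      NDArray.dot(tri, a)
    }

    // Integral image: cumsum along axis 0, then along axis 1 -- i.e.
    // multiply on the right by an upper-triangular ones matrix.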

--TongKe


Re: mxnet ndarray inference in js

2017-11-02 Thread TongKe Xue
If I'm understanding mxnet.js correctly, any mxnet model can be
converted to mxnet.js via:

https://github.com/dmlc/mxnet.js/blob/master/tools/model2json.py

and then executed on mxnet.js
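
To make that concrete, here is roughly how I'd produce the checkpoint
pair I assume model2json.py consumes, from the Scala side (sym.toJson
and NDArray.save are in the Scala docs; the file naming and the
"arg:"/"aux:" key prefixes mirror the Python checkpoint convention):

    import java.io.PrintWriter
    import ml.dmlc.mxnet._

    // Sketch: write "<prefix>-symbol.json" plus a params file, the same
    // two artifacts a Python save_checkpoint produces.
    def exportCheckpoint(prefix: String, sym: Symbol,
                         argParams: Map[String, NDArray],
                         auxParams: Map[String, NDArray]): Unit = {
      val pw = new PrintWriter(s"$prefix-symbol.json")
      try pw.write(sym.toJson) finally pw.close()
      val tagged = argParams.map { case (k, v) => (s"arg:$k", v) } ++
                   auxParams.map { case (k, v) => (s"aux:$k", v) }
      NDArray.save(s"$prefix-0001.params", tagged)
    }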


Question: is this 'mxnet json model' format documented anywhere?

On Thu, Nov 2, 2017 at 12:10 PM, TongKe Xue <tk...@tkxue.org> wrote:
> Hi,
>
>   I'm looking for a js library compatible with mxnet/ndarray.
>
> 1. I am aware of https://github.com/dmlc/mxnet.js/
> However:
> a. that appears to be all of mxnet, not ndarray
> b. that appears to only support Python models, whereas I'm using Java/Scala
>
> 2. I am aware of https://github.com/scijs/ndarray
> However:
> this appears to be an independent library; I would have to create a
> unifying API over both
>
>   Back to my original question -- is there some JS API that is
> directly compatible with mxnet's ndarray interface?
>
> Thanks,
> --TongKe


mxnet java prebuilt

2017-10-29 Thread TongKe Xue
Hi,

  Looking at https://mvnrepository.com/artifact/ml.dmlc.mxnet?sort=newest
it appears there have been no new Maven packages since Jun 3, 2017.

  For using the mxnet Scala API, is the standard solution to:

1. always build from github / source via
https://mxnet.incubator.apache.org/get_started/build_from_source.html

or

2. use some other repository?
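
For reference, this is the sort of sbt pin I'd otherwise expect to use --
the group/artifact/version below are illustrative only (the
ml.dmlc.mxnet artifacts are split by platform, Scala version, and
CPU/GPU flavor):

    // build.sbt -- illustrative; 0.10.0 is the newest I can find on Central
    libraryDependencies += "ml.dmlc.mxnet" % "mxnet-full_2.11-linux-x86_64-gpu" % "0.10.0"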


Thanks,
--TongKe


Does mxnet NDArray have autodiff?

2017-10-20 Thread TongKe Xue
Hi!

  [I'm referring to the Scala API]

  1. mxnet Symbol has autodiff:
https://mxnet.incubator.apache.org/tutorials/r/symbol.html

  2. from Googling around, I can't find NDArray autodiff anywhere

  3. however, NDArray tracks dependencies -- which, in theory, should
be enough for doing autodiff

  Does NDArray have autodiff somewhere?
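
  In the meantime, the only route I can see is wrapping the computation
in Symbols and pulling gradients from an Executor -- a sketch (the
simpleBind / gradDict signatures are my reading of the Scala API docs):

    import ml.dmlc.mxnet._

    // Gradient of y = sum(x * x) via the Symbol API, since NDArray
    // itself exposes no autodiff. simpleBind allocates gradient
    // buffers; backward() fills them. Expect dL/dx = 2x, i.e. all 2.0
    // for an input of ones.
    val x = Symbol.Variable("x")
    val y = Symbol.sum()()(Map("data" -> (x * x)))
    val exec = y.simpleBind(Context.cpu(), "write",
                            shapeDict = Map("x" -> Shape(2, 3)))
    exec.argDict("x").set(1f)
    exec.forward(isTrain = true)
    exec.backward(NDArray.ones(Shape(1)))  // seed dL/dy = 1
    println(exec.gradDict("x").toArray.mkString(", "))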

Thanks,
--TongKe


Re: mxnet Scala Convolution

2017-10-18 Thread TongKe Xue
Hi Rahul,

  Thanks for explaining the high-level design and pointing me to the
implementation details.

  Besides reading the C++ code and mentally translating it into the
corresponding Scala calls, is there a way to get a list of all the
generated Scala functions?

  I have looked at:

1. https://mxnet.incubator.apache.org/api/scala/symbol.html
shows a few examples, but is not exhaustive

2. https://mxnet.incubator.apache.org/api/scala/docs/index.html#ml.dmlc.mxnet.Symbol
appears more comprehensive, but I find neither Convolution nor Softmax there.


More specifically, my question is: nnvm generates a bunch of Scala
bindings for the C++ operators. How do I get a list of all these
bindings (name, input types, output type)?
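
The closest I've gotten is plain JVM reflection over the companion
objects, since the bindings are generated at compile time -- a sketch:

    import ml.dmlc.mxnet.Symbol

    // The nnvm-generated operator bindings are ordinary compiled
    // methods on the Symbol (and NDArray) companion objects, so
    // reflection can enumerate them.
    val opNames = Symbol.getClass.getMethods.map(_.getName).distinct.sorted
    opNames.filter(_.head.isUpper).foreach(println)
    // e.g. Activation, Convolution, FullyConnected, SoftmaxOutput, ...

But that gives names only, not typed signatures -- hence the question.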


Thanks!
--TongKe


On Wed, Oct 18, 2017 at 5:28 PM, Rahul Huilgol <rahulhuil...@gmail.com> wrote:
> Hi TongKe,
>
> These are operators defined in the C++ backend under src/operator; for
> example, convolution is here:
> https://github.com/apache/incubator-mxnet/blob/master/src/operator/convolution.cc
> The operators are registered using nnvm, which helps automatically
> generate the frontend functions.
>
> This tutorial on how to add a backend operator
> <https://github.com/apache/incubator-mxnet/blob/master/docs/how_to/add_op_in_backend.md>
> contains information on how to register such operators, which would help
> you understand the above file.
> An excerpt from there (for the quadratic operator): "If you use python, when
> you type import mxnet as mx, two python functions for invoking your backend
> implementation are generated on the fly: one is for imperative programming
> registered as mxnet.ndarray.quadratic or mx.nd.quadratic for short; the
> other one is for symbolic programming registered under module
> mxnet.symbol.quadratic or mx.sym.quadratic for short."
>
> I'd think the Scala package works similarly.
>
> Regards,
> Rahul
>
>
>
>
> On Wed, Oct 18, 2017 at 5:06 PM, TongKe Xue <tk...@tkxue.org> wrote:
>
>> My earlier question was a bit messy.
>>
>> To rephrase my question:
>>
>> 1. Scala AlexNet sample code calls Symbol.Convolution:
>>
>> https://github.com/apache/incubator-mxnet/blob/master/scala-package/examples/src/main/scala/ml/dmlc/mxnetexamples/visualization/AlexNet.scala#L30
>>
>> 2. Symbol.scala does not contain the string "Convolution"
>>
>> https://github.com/apache/incubator-mxnet/blob/master/scala-package/core/src/main/scala/ml/dmlc/mxnet/Symbol.scala#L982
>>
>> Question: where/how is Symbol.Convolution defined?
>>
>> On Wed, Oct 18, 2017 at 4:10 PM, TongKe Xue <tk...@tkxue.org> wrote:
>> > Hi,
>> >
>> > I am reading: https://mxnet.incubator.apache.org/api/scala/symbol.html
>> >
>> > I see Symbol.Variable, Symbol.Convolution
>> >
>> > When I look at Symbol.scala, I see Symbol.Variable at:
>> > https://github.com/apache/incubator-mxnet/blob/master/scala-package/core/src/main/scala/ml/dmlc/mxnet/Symbol.scala#L982
>> >
>> > However, I can't find where Convolution, SoftMax, FullyConnected, ...
>> > are defined.
>> >
>> > Where are these Symbols defined?
>> >
>> > (I have also tried: grep "Convolution" . -R | grep scala | grep def --
>> > but found nothing).
>> >
>> > Thanks,
>> > --TongKe
>>
>
>
>
> --
> Rahul Huilgol


Re: mxnet Scala Convolution

2017-10-18 Thread TongKe Xue
My earlier question was a bit messy.

To rephrase my question:

1. Scala AlexNet sample code calls Symbol.Convolution:

https://github.com/apache/incubator-mxnet/blob/master/scala-package/examples/src/main/scala/ml/dmlc/mxnetexamples/visualization/AlexNet.scala#L30

2. Symbol.scala does not contain the string "Convolution"

https://github.com/apache/incubator-mxnet/blob/master/scala-package/core/src/main/scala/ml/dmlc/mxnet/Symbol.scala#L982

Question: where/how is Symbol.Convolution defined?

On Wed, Oct 18, 2017 at 4:10 PM, TongKe Xue <tk...@tkxue.org> wrote:
> Hi,
>
> I am reading: https://mxnet.incubator.apache.org/api/scala/symbol.html
>
> I see Symbol.Variable, Symbol.Convolution
>
> When I look at Symbol.scala, I see Symbol.Variable at:
> https://github.com/apache/incubator-mxnet/blob/master/scala-package/core/src/main/scala/ml/dmlc/mxnet/Symbol.scala#L982
>
> However, I can't find where Convolution, SoftMax, FullyConnected, ...
> are defined.
>
> Where are these Symbols defined?
>
> (I have also tried: grep "Convolution" . -R | grep scala | grep def --
> but found nothing).
>
> Thanks,
> --TongKe


mxnet Scala Convolution

2017-10-18 Thread TongKe Xue
Hi,

I am reading: https://mxnet.incubator.apache.org/api/scala/symbol.html

I see Symbol.Variable, Symbol.Convolution

When I look at Symbol.scala, I see Symbol.Variable at:
https://github.com/apache/incubator-mxnet/blob/master/scala-package/core/src/main/scala/ml/dmlc/mxnet/Symbol.scala#L982

However, I can't find where Convolution, SoftMax, FullyConnected, ...
are defined.

Where are these Symbols defined?

(I have also tried: grep "Convolution" . -R | grep scala | grep def --
but found nothing).

Thanks,
--TongKe


disposing all ndarray in a given context

2017-10-16 Thread TongKe Xue
Quoting: 
https://github.com/apache/incubator-mxnet/blob/master/scala-package/core/src/main/scala/ml/dmlc/mxnet/NDArray.scala#L545-L546

* WARNING: it is your responsibility to clear this object through dispose().
* NEVER rely on the GC strategy

Is there a way to say "dispose all NDArrays of this context"?
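
If nothing built-in exists, my fallback would be to route every
allocation through a small tracker and dispose per context -- a sketch
(dispose() and context are from the NDArray docs quoted above; the pool
itself, and comparing Contexts with ==, are my own assumptions):

    import ml.dmlc.mxnet._
    import scala.collection.mutable.ArrayBuffer

    object NDArrayPool {
      private val live = ArrayBuffer.empty[NDArray]

      // Register an array at creation time so it can be bulk-disposed later.
      def track(arr: NDArray): NDArray = synchronized { live += arr; arr }

      // Dispose every tracked array living on `ctx`; keep the rest.
      def disposeAll(ctx: Context): Unit = synchronized {
        val (doomed, kept) = live.partition(_.context == ctx)
        doomed.foreach(_.dispose())
        live.clear()
        live ++= kept
      }
    }

    // usage:
    //   val a = NDArrayPool.track(NDArray.ones(Shape(2, 3), Context.gpu(0)))
    //   NDArrayPool.disposeAll(Context.gpu(0))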