[GitHub] lanking520 commented on issue #10387: Flaky test(scala): test_arange

2018-05-22 Thread GitBox
lanking520 commented on issue #10387: Flaky test(scala): test_arange
URL: 
https://github.com/apache/incubator-mxnet/issues/10387#issuecomment-391241137
 
 
   Any more reports? I cannot reproduce the flakiness; I ran the test 100 times 
on a single Ubuntu machine and all runs passed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] lanking520 commented on issue #10549: scala-package 1.1.0 build instruction Windows VS2015

2018-05-22 Thread GitBox
lanking520 commented on issue #10549: scala-package 1.1.0 build instruction 
Windows VS2015
URL: 
https://github.com/apache/incubator-mxnet/issues/10549#issuecomment-391240454
 
 
   Unfortunately, Scala is not supported on Windows for now. We are prioritizing 
Windows support for Scala this month, as many users have requested it, and will 
try to provide it as soon as possible. However, more time is still needed for a 
stable release.




[GitHub] HyperGroups commented on issue #10998: Question: problem with import model generated by Wolfram Mathematica?

2018-05-22 Thread GitBox
HyperGroups commented on issue #10998: Question: problem with import model 
generated by Wolfram Mathematica?
URL: 
https://github.com/apache/incubator-mxnet/issues/10998#issuecomment-391239612
 
 
   One solution is to add a ':' check in the code, though it may not be able to 
solve all cases.
   i.e., the modified code:

   ```
   symbol = sym.load('%s-symbol.json' % prefix)
   save_dict = nd.load('%s-%04d.params' % (prefix, epoch))
   arg_params = {}
   aux_params = {}
   for k, v in save_dict.items():
       if ':' in k:
           tp, name = k.split(':', 1)
           if tp == 'arg':
               arg_params[name] = v
           if tp == 'aux':
               aux_params[name] = v
       else:
           arg_params[k] = v
   ```
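The key-splitting logic above can be exercised without MXNet on a toy parameter dict (the helper name `split_params` is illustrative, not part of the MXNet API):

```python
def split_params(save_dict):
    """Split a loaded params dict into arg/aux groups, tolerating keys
    that lack the usual 'arg:'/'aux:' prefix (e.g. Mathematica exports)."""
    arg_params, aux_params = {}, {}
    for k, v in save_dict.items():
        if ':' in k:
            tp, name = k.split(':', 1)
            if tp == 'arg':
                arg_params[name] = v
            elif tp == 'aux':
                aux_params[name] = v
        else:
            # No prefix: treat as an ordinary argument parameter.
            arg_params[k] = v
    return arg_params, aux_params

args, auxs = split_params({'arg:fc1_weight': 1, 'aux:bn_mean': 2, 'conv0_weight': 3})
print(args)  # {'fc1_weight': 1, 'conv0_weight': 3}
print(auxs)  # {'bn_mean': 2}
```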




[GitHub] lanking520 commented on issue #10436: FeedForward.scala and NDArrayIter.scala leak memory by not disposing of NDArrays

2018-05-22 Thread GitBox
lanking520 commented on issue #10436: FeedForward.scala and NDArrayIter.scala 
leak memory by not disposing of NDArrays
URL: 
https://github.com/apache/incubator-mxnet/issues/10436#issuecomment-391238772
 
 
   Thanks for your message! We are currently working on a draft proposal to 
solve this issue and will send it to the dev list. The current disposal 
mechanism is a great inconvenience to users. In the past, we tried to move 
disposal into the finalizer stage, but it sometimes crashed for unknown 
reasons. We will follow up on this thread with our progress.
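As a language-agnostic sketch of the inconvenience (Python for brevity; in the Scala package disposal is `NDArray.dispose()`), a hypothetical scoped helper could release native memory without pairing every allocation with a manual call:

```python
from contextlib import contextmanager

class NDArray:
    """Stand-in for a native-backed array whose memory the JVM/CPython GC
    cannot see; not the real MXNet NDArray."""
    def __init__(self, name):
        self.name = name
        self.is_disposed = False

    def dispose(self):
        # Release the (simulated) native handle.
        self.is_disposed = True

@contextmanager
def disposing(*arrays):
    # Hypothetical scoped helper: dispose every array on exit, so callers
    # need not remember an explicit dispose() per allocation.
    try:
        yield arrays
    finally:
        for a in arrays:
            a.dispose()

with disposing(NDArray('a'), NDArray('b')) as (a, b):
    result = a.name + b.name
print(result)         # ab
print(a.is_disposed)  # True
```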




[GitHub] lanking520 commented on issue #10574: Couldn't work well sbt compile using jar file

2018-05-22 Thread GitBox
lanking520 commented on issue #10574: Couldn't work well sbt compile using jar 
file
URL: 
https://github.com/apache/incubator-mxnet/issues/10574#issuecomment-391234559
 
 
   Just to follow up: are you still facing the same issue? We have since 
changed the namespace to the Apache org.




[GitHub] solin319 opened a new pull request #11030: add cpu_pinned in __init__

2018-05-22 Thread GitBox
solin319 opened a new pull request #11030: add cpu_pinned in __init__
URL: https://github.com/apache/incubator-mxnet/pull/11030
 
 
   This makes mx.cpu_pinned usable.
   




[GitHub] lispc commented on issue #10996: Is "Check failed: ctx == vcontext[nid]..." an internal assertion?

2018-05-22 Thread GitBox
lispc commented on issue #10996: Is  "Check failed: ctx == vcontext[nid]..." an 
internal assertion?
URL: 
https://github.com/apache/incubator-mxnet/issues/10996#issuecomment-391226473
 
 
   Finally I solved the problem by adding a ```with mx.AttrScope(ctx_group=...)``` 
block before ```mx.sym.Variable(name=...)```.
   It is not an internal bug, but I think a clearer error message would help; 
the current one is hard to understand and to act on when fixing the problem.




[GitHub] szha commented on issue #10401: Gluon nn BatchNorm layer cannot ignore the beta

2018-05-22 Thread GitBox
szha commented on issue #10401: Gluon nn BatchNorm layer cannot ignore the beta
URL: 
https://github.com/apache/incubator-mxnet/issues/10401#issuecomment-391216378
 
 
   Hi @Will0622. I wasn't able to reproduce this issue with simple code 
samples. Could you provide a snippet for reproducing the issue?




[GitHub] zhanghang1989 commented on issue #10852: [MXNET-411] Add ROI Align

2018-05-22 Thread GitBox
zhanghang1989 commented on issue #10852: [MXNET-411] Add ROI Align
URL: https://github.com/apache/incubator-mxnet/pull/10852#issuecomment-391214697
 
 
   Finally it passes the CI :) @piiswrong 




[GitHub] burness commented on issue #10594: [example/sparse/factorization_machine/]add auc and fix the typo in re…

2018-05-22 Thread GitBox
burness commented on issue #10594: [example/sparse/factorization_machine/]add 
auc and fix the typo in re…
URL: https://github.com/apache/incubator-mxnet/pull/10594#issuecomment-391213259
 
 
   I will do some fixes later.




[GitHub] szha closed issue #10329: Can't print params on gpu with collect_params

2018-05-22 Thread GitBox
szha closed issue #10329: Can't print params on gpu with collect_params
URL: https://github.com/apache/incubator-mxnet/issues/10329
 
 
   




[GitHub] liuzx32 opened a new issue #11029: The example of train_mnist.py can't exit after completes.

2018-05-22 Thread GitBox
liuzx32 opened a new issue #11029: The example of train_mnist.py can't exit 
after completes.
URL: https://github.com/apache/incubator-mxnet/issues/11029
 
 
   Run `../example/image-classification/train_mnist.py --network lenet 
--kv-store dist_sync`; the process can't exit after the task completes.




[GitHub] pengzhao-intel commented on issue #11026: Test/mkl dnn act

2018-05-22 Thread GitBox
pengzhao-intel commented on issue #11026: Test/mkl dnn act
URL: https://github.com/apache/incubator-mxnet/pull/11026#issuecomment-391201141
 
 
   @juliusshufan, is the MKL activation test on your test plan? You can work 
with @azai91 on this test.




[incubator-mxnet-site] branch asf-site updated: Bump the publish timestamp.

2018-05-22 Thread zhasheng
This is an automated email from the ASF dual-hosted git repository.

zhasheng pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet-site.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new 51aa78b  Bump the publish timestamp.
51aa78b is described below

commit 51aa78ba49af94f1458f87ca62192ca52cfb57bd
Author: mxnet-ci 
AuthorDate: Wed May 23 02:14:47 2018 +

Bump the publish timestamp.
---
 date.txt | 1 +
 1 file changed, 1 insertion(+)

diff --git a/date.txt b/date.txt
new file mode 100644
index 000..43b19a7
--- /dev/null
+++ b/date.txt
@@ -0,0 +1 @@
+Wed May 23 02:14:47 UTC 2018

-- 
To stop receiving notification emails like this one, please contact
zhash...@apache.org.


[GitHub] vrakesh opened a new issue #11028: Pre-trained Shufflenet model fails during inference on mxnet-mkl==1.2.0

2018-05-22 Thread GitBox
vrakesh opened a new issue #11028: Pre-trained  Shufflenet model fails during 
inference on mxnet-mkl==1.2.0
URL: https://github.com/apache/incubator-mxnet/issues/11028
 
 
   ## Description
   Running inference with a pre-trained ShuffleNet model produces an error on 
mxnet-mkl 1.2.0; the error does not occur on older versions or on mxnet==1.2.0.
   
   
   
   ## Environment info (Required)
   python diagnose.py 
   --Python Info--
   Version  : 3.6.5
   Compiler : GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.39.2)
   Build: ('default', 'Apr 25 2018 14:26:36')
   Arch : ('64bit', '')
   Pip Info---
   Version  : 10.0.1
   Directory: /Users/Workspace/devenv/lib/python3.6/site-packages/pip
   --MXNet Info---
   Version  : 1.2.0
   Directory: /Users/Workspace/devenv/lib/python3.6/site-packages/mxnet
   Commit Hash   : 297c64fd2ee404612aa3ecc880b940fb2538039c
   --System Info--
   Platform : Darwin-16.7.0-x86_64-i386-64bit
   system   : Darwin
   node : localhost
   release  : 16.7.0
   version  : Darwin Kernel Version 16.7.0: Tue Jan 30 11:27:06 PST 2018; 
root:xnu-3789.73.11~1/RELEASE_X86_64
   --Hardware Info--
   machine  : x86_64
   processor: i386
   b'machdep.cpu.extfeatures: SYSCALL XD 1GBPAGE EM64T LAHF LZCNT PREFETCHW 
RDTSCP TSCI'
   b'machdep.cpu.leaf7_features: SMEP ERMS RDWRFSGS TSC_THREAD_OFFSET BMI1 AVX2 
BMI2 INVPCID SMAP RDSEED ADX IPT SGX FPU_CSDS MPX CLFSOPT'
   b'machdep.cpu.features: FPU VME DE PSE TSC MSR PAE MCE CX8 APIC SEP MTRR PGE 
MCA CMOV PAT PSE36 CLFSH DS ACPI MMX FXSR SSE SSE2 SS HTT TM PBE SSE3 PCLMULQDQ 
DTES64 MON DSCPL VMX EST TM2 SSSE3 FMA CX16 TPR PDCM SSE4.1 SSE4.2 x2APIC MOVBE 
POPCNT AES PCID XSAVE OSXSAVE SEGLIM64 TSCTMR AVX1.0 RDRAND F16C'
   b'machdep.cpu.brand_string: Intel(R) Core(TM) i7-7700HQ CPU @ 2.80GHz'
   
   Package used (Python/R/Scala/Julia):
   I'm using python
   
   
   
   ## Error Message:
   File "shuffle.py", line 27, in 
   output = mx_model.get_outputs()[0].asnumpy()
 File 
"/Users/rakvas/Workspace/devenv/lib/python3.6/site-packages/mxnet/ndarray/ndarray.py",
 line 1876, in asnumpy
   ctypes.c_size_t(data.size)))
 File 
"/Users/rakvas/Workspace/devenv/lib/python3.6/site-packages/mxnet/base.py", 
line 149, in check_call
   raise MXNetError(py_str(_LIB.MXGetLastError()))
   mxnet.base.MXNetError: [18:05:45] 
src/operator/tensor/../tensor/elemwise_unary_op.h:301: Check failed: 
inputs[0].dptr_ == outputs[0].dptr_ (0x7fa99f41b000 vs. 0x7fa99f435200) 
   
   Stack trace returned 10 entries:
   [bt] (0) 0   libmxnet.so 0x00010fef8684 
libmxnet.so + 26244
   [bt] (1) 1   libmxnet.so 0x00010fef843f 
libmxnet.so + 25663
   [bt] (2) 2   libmxnet.so 0x000110316021 
libmxnet.so + 4341793
   [bt] (3) 3   libmxnet.so 0x000110ff4993 
MXNDListFree + 139843
   [bt] (4) 4   libmxnet.so 0x000111015705 
MXNDListFree + 274357
   [bt] (5) 5   libmxnet.so 0x000110fe526c 
MXNDListFree + 76572
   [bt] (6) 6   libmxnet.so 0x000110fe7fe1 
MXNDListFree + 88209
   [bt] (7) 7   libmxnet.so 0x000110fe7ef7 
MXNDListFree + 87975
   [bt] (8) 8   libmxnet.so 0x000110fe5fe5 
MXNDListFree + 80021
   [bt] (9) 9   libsystem_pthread.dylib 0x7fffa9e2a93b 
_pthread_body + 180
   
   ## Minimum reproducible example
   reproducible test package 
https://s3.amazonaws.com/shufflenet-package/shufflenet.zip
   
   
   ## Steps to reproduce
   1. Extract the zip
   2. Run `python shuffle.py` under different mxnet version installations 
   

[GitHub] hetong007 opened a new pull request #11027: Add standard ResNet data augmentation for ImageRecordIter

2018-05-22 Thread GitBox
hetong007 opened a new pull request #11027: Add standard ResNet data 
augmentation for ImageRecordIter
URL: https://github.com/apache/incubator-mxnet/pull/11027
 
 
   ## Description ##
   Add standard ResNet data augmentation for ImageRecordIter.
   
   Specifically, this augmentation performs a "random resized crop": it crops 
the image with a random aspect ratio and a random area, then resizes the crop 
to the desired size.
   
   - If `random_resized_crop=True`, existing `max_random_scale` and 
`min_random_scale` will be ignored.
   - If `random_resized_crop=False`, newly-added `max_random_area` and 
`min_random_area` will be ignored.
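   The sampling these flags control can be sketched in plain Python (a sketch of the standard Inception-style "random resized crop", not the PR's C++ implementation; defaults and the function name are illustrative):

```python
import math
import random

def sample_resized_crop(width, height, min_area=0.08, max_area=1.0,
                        min_ratio=3 / 4, max_ratio=4 / 3, max_tries=10):
    """Sample a crop window (x, y, w, h) with a random area fraction and a
    random aspect ratio; fall back to a center square crop if no valid
    window is found within max_tries."""
    for _ in range(max_tries):
        area = random.uniform(min_area, max_area) * width * height
        ratio = random.uniform(min_ratio, max_ratio)
        w = int(round(math.sqrt(area * ratio)))
        h = int(round(math.sqrt(area / ratio)))
        if w <= width and h <= height:
            x = random.randint(0, width - w)
            y = random.randint(0, height - h)
            return x, y, w, h
    # Fallback: center crop of the largest square.
    side = min(width, height)
    return (width - side) // 2, (height - side) // 2, side, side

x, y, w, h = sample_resized_crop(640, 480)
# The crop is then resized to the desired output size (e.g. 224x224).
```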
   
   ## Checklist ##
   ### Essentials ###
   Please feel free to remove inapplicable items for your PR.
   - [X] Changes are complete (i.e. I finished coding on this PR)
   - [X] All changes have test coverage:
   - Unit tests are added for small changes to verify correctness (e.g. adding 
a new operator)
   - [X] Code is well-documented: 
   - For user-facing API changes, API doc string has been updated. 
   - For new C++ functions in header files, their functionalities and arguments 
are documented. 
   - [X] To the best of my knowledge, examples are either not affected by this 
change, or have been fixed to be compatible with this change
   
   




[incubator-mxnet] 01/01: add Builder and varargs which are java-friendly

2018-05-22 Thread liuyizhi

liuyizhi pushed a commit to branch v1.2.0-java
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git

commit c887376f24df1e3ce941f600c69b38026e71771d
Author: Yizhi Liu 
AuthorDate: Mon May 21 13:48:31 2018 -0700

add Builder and varargs which are java-friendly
---
 .../core/src/main/scala/org/apache/mxnet/IO.scala  | 56 +-
 .../src/main/scala/org/apache/mxnet/NDArray.scala  |  3 +-
 .../src/main/scala/org/apache/mxnet/Shape.scala|  4 ++
 .../src/main/scala/org/apache/mxnet/Symbol.scala   |  1 -
 .../scala/org/apache/mxnet/module/BaseModule.scala | 30 
 .../scala/org/apache/mxnet/module/Module.scala | 43 -
 .../main/scala/org/apache/mxnet/NDArrayMacro.scala | 52 ++--
 7 files changed, 138 insertions(+), 51 deletions(-)

diff --git a/scala-package/core/src/main/scala/org/apache/mxnet/IO.scala 
b/scala-package/core/src/main/scala/org/apache/mxnet/IO.scala
index 7a9c1a7..123e2f8 100644
--- a/scala-package/core/src/main/scala/org/apache/mxnet/IO.scala
+++ b/scala-package/core/src/main/scala/org/apache/mxnet/IO.scala
@@ -19,9 +19,10 @@ package org.apache.mxnet
 
 import org.apache.mxnet.Base._
 import org.apache.mxnet.DType.DType
-import org.apache.mxnet.io.{MXDataPack, MXDataIter}
+import org.apache.mxnet.io.{MXDataIter, MXDataPack}
 import org.slf4j.LoggerFactory
 
+import scala.annotation.varargs
 import scala.collection.immutable.ListMap
 import scala.collection.mutable.ListBuffer
 
@@ -140,6 +141,7 @@ class DataBatch(val data: IndexedSeq[NDArray],
 // (must match the order of input data/label)
 private val providedData: ListMap[String, Shape] = null,
 private val providedLabel: ListMap[String, Shape] = null) {
+
   /**
* Dispose its data and labels
* The object shall never be used after it is disposed.
@@ -160,6 +162,58 @@ class DataBatch(val data: IndexedSeq[NDArray],
   def provideLabel: ListMap[String, Shape] = providedLabel
 }
 
+object DataBatch {
+  class Builder() {
+private var data: IndexedSeq[NDArray] = null
+private var label: IndexedSeq[NDArray] = null
+private var index: IndexedSeq[Long] = null
+private var pad: Int = 0
+private var bucketKey: AnyRef = null
+private var providedData: ListMap[String, Shape] = ListMap.empty
+private var providedLabel: ListMap[String, Shape] = ListMap.empty
+
+@varargs def setData(data: NDArray*): Builder = {
+  this.data = data.toIndexedSeq
+  this
+}
+
+@varargs def setLabel(label: NDArray*): Builder = {
+  this.label = label.toIndexedSeq
+  this
+}
+
+@varargs def setIndex(index: Long*): Builder = {
+  this.index = index.toIndexedSeq
+  this
+}
+
+def setPad(pad: Int): Builder = {
+  this.pad = pad
+  this
+}
+
+def setBucketKey(bucketKey: AnyRef): Builder = {
+  this.bucketKey = bucketKey
+  this
+}
+
+def provideData(name: String, shape: Shape): Builder = {
+  providedData = providedData.updated(name, shape)
+  this
+}
+
+def provideLabel(name: String, shape: Shape): Builder = {
+  providedLabel = providedLabel.updated(name, shape)
+  this
+}
+
+def build(): DataBatch = {
+  new DataBatch(data, label, index, pad,
+bucketKey, providedData, providedLabel)
+}
+  }
+}
+
 /**
  * DataIter object in mxnet.
  */
diff --git a/scala-package/core/src/main/scala/org/apache/mxnet/NDArray.scala 
b/scala-package/core/src/main/scala/org/apache/mxnet/NDArray.scala
index 416f2d7..e8c687e 100644
--- a/scala-package/core/src/main/scala/org/apache/mxnet/NDArray.scala
+++ b/scala-package/core/src/main/scala/org/apache/mxnet/NDArray.scala
@@ -48,6 +48,7 @@ object NDArray {
 }
   }
 
+  //  private[mxnet] def genericNDArrayFunctionInvoke(
   /**
* Used by NDArrayMacro.
* Invoke this function by passing in parameters.
@@ -57,7 +58,7 @@ object NDArray {
* @param kwargs Key-value arguments of input scalars
* @return The result NDArrays of result of computation.
*/
-  private[mxnet] def genericNDArrayFunctionInvoke(
+  def genericNDArrayFunctionInvoke(
 funcName: String, args: Seq[Any], kwargs: Map[String, Any] = null): 
NDArrayFuncReturn = {
 val function = functions(funcName)
 val ndArgs = ArrayBuffer.empty[NDArray]
diff --git a/scala-package/core/src/main/scala/org/apache/mxnet/Shape.scala 
b/scala-package/core/src/main/scala/org/apache/mxnet/Shape.scala
index e632ade..6891762 100644
--- a/scala-package/core/src/main/scala/org/apache/mxnet/Shape.scala
+++ b/scala-package/core/src/main/scala/org/apache/mxnet/Shape.scala
@@ -17,6 +17,8 @@
 
 package org.apache.mxnet
 
+import scala.annotation.varargs
+
 /**
  * Shape of [[NDArray]] or other data
  */
@@ -28,6 +30,7 @@ class Shape(dims: Traversable[Int]) extends Serializable {
   }
 
   def apply(dim: Int): Int = shape(dim)
+  def

[incubator-mxnet] branch v1.2.0-java updated (9a7719e -> c887376)

2018-05-22 Thread liuyizhi

liuyizhi pushed a change to branch v1.2.0-java
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


 discard 9a7719e  add Builder and varargs which are easy for java to use
 new c887376  add Builder and varargs which are java-friendly

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (9a7719e)
\
 N -- N -- N   refs/heads/v1.2.0-java (c887376)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:

-- 
To stop receiving notification emails like this one, please contact
liuyi...@apache.org.


[GitHub] eric-haibin-lin commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
eric-haibin-lin commented on a change in pull request #11001: [MXNET-374] 
handle row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190097102
 
 

 ##
 File path: python/mxnet/gluon/parameter.py
 ##
 @@ -396,11 +485,25 @@ def data(self, ctx=None):
 ---
 NDArray on ctx
 """
+if self._stype != 'default':
+raise ValueError("Cannot return a copy of Parameter '%s' on ctx %s 
via data() " \
 
 Review comment:
   Maybe I should change it to RuntimeError? There's a UserWarning, but I'm not 
aware of a UserError.




[GitHub] eric-haibin-lin commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
eric-haibin-lin commented on a change in pull request #11001: [MXNET-374] 
handle row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190085297
 
 

 ##
 File path: python/mxnet/gluon/parameter.py
 ##
 @@ -271,12 +313,18 @@ def _init_grad(self):
 self._grad = [ndarray.zeros(shape=i.shape, dtype=i.dtype, 
ctx=i.context,
 stype=self._grad_stype) for i in 
self._data]
 
-autograd.mark_variables(self.list_data(), self.list_grad(), 
self.grad_req)
+autograd.mark_variables(self._check_and_get(self._data, list),
+self._grad, self.grad_req)
 
 def _reduce(self):
 """Reduce data from multiple context."""
-block = self.list_data()
-data = ndarray.add_n(*(w.copyto(context.cpu()) for w in block)) / 
len(block)
+if self._stype == 'default':
+block = self.list_data()
+data = ndarray.add_n(*(w.copyto(context.cpu()) for w in block)) / 
len(block)
+else:
+# fetch all rows for 'row_sparse' param
+all_row_ids = ndarray.arange(0, self.shape[0], dtype='int64', 
ctx=context.cpu())
+data = self.row_sparse_data(all_row_ids)
 
 Review comment:
   Currently, when Gluon sees a row_sparse weight, it always creates a kvstore 
and sets update_on_kvstore=True.




[GitHub] eric-haibin-lin commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
eric-haibin-lin commented on a change in pull request #11001: [MXNET-374] 
handle row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190082519
 
 

 ##
 File path: python/mxnet/gluon/block.py
 ##
 @@ -443,8 +443,17 @@ class HybridBlock(Block):
 
 Refer `Hybrid tutorial `_ to 
see
 the end-to-end usage.
+
 """
 def __init__(self, prefix=None, params=None):
+# check if any parameter is row_sparse
+if isinstance(params, ParameterDict):
 
 Review comment:
   Removed. Will the checks in `param.list_data()` and `param.data()` be 
sufficient? 




[GitHub] eric-haibin-lin commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
eric-haibin-lin commented on a change in pull request #11001: [MXNET-374] 
handle row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190087027
 
 

 ##
 File path: python/mxnet/gluon/trainer.py
 ##
 @@ -191,6 +224,8 @@ def step(self, batch_size, ignore_stale_grad=False):
 """
 if not self._kv_initialized:
 self._init_kvstore()
+if self._params_to_init:
 
 Review comment:
   I moved the logic of kv.init(param) from `_init_kvstore` to `_init_params`. 
`_params_to_init` refers to params that are not yet initialized on the kvstore.




[GitHub] zhanghang1989 commented on issue #10799: Feature Request for SoftmaxCrossEntropyLoss with Ignore labels

2018-05-22 Thread GitBox
zhanghang1989 commented on issue #10799: Feature Request for 
SoftmaxCrossEntropyLoss with Ignore labels
URL: 
https://github.com/apache/incubator-mxnet/issues/10799#issuecomment-391185587
 
 
   Another component needed is an operator returning ``NDArray.size``.




[GitHub] szha commented on issue #10799: Feature Request for SoftmaxCrossEntropyLoss with Ignore labels

2018-05-22 Thread GitBox
szha commented on issue #10799: Feature Request for SoftmaxCrossEntropyLoss 
with Ignore labels
URL: 
https://github.com/apache/incubator-mxnet/issues/10799#issuecomment-391184720
 
 
   For this feature, we will need two more sparse operators for best efficiency:
   1. `where(cond, x, y)` in which `cond` is sparse, x/y can be scalar, and 
return value is dense.
   2. `eq(x, y)` where the return value is sparse.
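   The intended semantics can be sketched in plain Python, representing a sparse vector as its set of nonzero indices (`eq_sparse`/`where_sparse` are illustrative names, not MXNet operators):

```python
def eq_sparse(xs, y):
    """eq(x, y): compare a dense vector against a scalar; the result is
    'sparse' -- only the matching indices are materialized."""
    return {i for i, v in enumerate(xs) if v == y}

def where_sparse(cond_idx, x, y, length):
    """where(cond, x, y): cond is sparse (a set of true indices),
    x/y are scalars, and the return value is dense."""
    return [x if i in cond_idx else y for i in range(length)]

# Masking out an 'ignore' label (here 0) before a loss computation:
labels = [1, 0, 2, 0, 3]
ignore = eq_sparse(labels, 0)                        # {1, 3}
mask = where_sparse(ignore, 0.0, 1.0, len(labels))   # [1.0, 0.0, 1.0, 0.0, 1.0]
print(mask)
```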




[incubator-mxnet] branch v1.2.0 updated: [maven-release-plugin] prepare for next development iteration

2018-05-22 Thread nswamy

nswamy pushed a commit to branch v1.2.0
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/v1.2.0 by this push:
 new 8b487ef  [maven-release-plugin] prepare for next development iteration
8b487ef is described below

commit 8b487ef3b2ea01f654b54e300dcf0215818723dc
Author: Naveen Swamy 
AuthorDate: Tue May 22 17:20:59 2018 -0700

[maven-release-plugin] prepare for next development iteration
---
 scala-package/assembly/osx-x86_64-cpu/pom.xml | 8 
 scala-package/assembly/pom.xml| 2 +-
 scala-package/core/pom.xml| 6 +++---
 scala-package/examples/pom.xml| 6 +++---
 scala-package/infer/pom.xml   | 4 ++--
 scala-package/init-native/osx-x86_64/pom.xml  | 4 ++--
 scala-package/init-native/pom.xml | 2 +-
 scala-package/init/pom.xml| 2 +-
 scala-package/macros/pom.xml  | 6 +++---
 scala-package/native/osx-x86_64-cpu/pom.xml   | 4 ++--
 scala-package/native/pom.xml  | 2 +-
 scala-package/pom.xml | 4 ++--
 scala-package/spark/pom.xml   | 4 ++--
 13 files changed, 27 insertions(+), 27 deletions(-)

diff --git a/scala-package/assembly/osx-x86_64-cpu/pom.xml 
b/scala-package/assembly/osx-x86_64-cpu/pom.xml
index f02e431..2b3c28c 100644
--- a/scala-package/assembly/osx-x86_64-cpu/pom.xml
+++ b/scala-package/assembly/osx-x86_64-cpu/pom.xml
@@ -4,7 +4,7 @@
   
 org.apache.mxnet
 mxnet-full-parent_2.11
-1.2.0
+1.2.1-SNAPSHOT
 ../pom.xml
   
 
@@ -16,18 +16,18 @@
 
   org.apache.mxnet
   mxnet-core_${scala.binary.version}
-  1.2.0
+  1.2.1-SNAPSHOT
 
 
   org.apache.mxnet
   libmxnet-scala-osx-x86_64-cpu
-  1.2.0
+  1.2.1-SNAPSHOT
   jnilib
 
 
   org.apache.mxnet
   mxnet-infer_${scala.binary.version}
-  1.2.0
+  1.2.1-SNAPSHOT
 
   
 
diff --git a/scala-package/assembly/pom.xml b/scala-package/assembly/pom.xml
index 8d2df25..456f453 100644
--- a/scala-package/assembly/pom.xml
+++ b/scala-package/assembly/pom.xml
@@ -4,7 +4,7 @@
   
 org.apache.mxnet
 mxnet-parent_2.11
-1.2.0
+1.2.1-SNAPSHOT
 ../pom.xml
   
 
diff --git a/scala-package/core/pom.xml b/scala-package/core/pom.xml
index 711a5fb..fb5ea2b 100644
--- a/scala-package/core/pom.xml
+++ b/scala-package/core/pom.xml
@@ -4,7 +4,7 @@
   
 org.apache.mxnet
 mxnet-parent_2.11
-1.2.0
+1.2.1-SNAPSHOT
 ../pom.xml
   
 
@@ -76,13 +76,13 @@
 
   org.apache.mxnet
   mxnet-init_${scala.binary.version}
-  1.2.0
+  1.2.1-SNAPSHOT
   provided
 
 
   org.apache.mxnet
   mxnet-macros_${scala.binary.version}
-  1.2.0
+  1.2.1-SNAPSHOT
   provided
 
   
diff --git a/scala-package/examples/pom.xml b/scala-package/examples/pom.xml
index 721f882..5ae86f4 100644
--- a/scala-package/examples/pom.xml
+++ b/scala-package/examples/pom.xml
@@ -4,7 +4,7 @@
   
 org.apache.mxnet
 mxnet-parent_2.11
-1.2.0
+1.2.1-SNAPSHOT
 ../pom.xml
   
 
@@ -150,13 +150,13 @@
 
   org.apache.mxnet
   mxnet-core_${scala.binary.version}
-  1.2.0
+  1.2.1-SNAPSHOT
   provided
 
 
   org.apache.mxnet
   mxnet-infer_${scala.binary.version}
-  1.2.0
+  1.2.1-SNAPSHOT
   provided
 
 
diff --git a/scala-package/infer/pom.xml b/scala-package/infer/pom.xml
index 56a3507..daf4b47 100644
--- a/scala-package/infer/pom.xml
+++ b/scala-package/infer/pom.xml
@@ -4,7 +4,7 @@
 
 mxnet-parent_2.11
 org.apache.mxnet
-1.2.0
+1.2.1-SNAPSHOT
 ../pom.xml
 
 
@@ -76,7 +76,7 @@
 
 org.apache.mxnet
 mxnet-core_${scala.binary.version}
-1.2.0
+1.2.1-SNAPSHOT
 provided
 
 
diff --git a/scala-package/init-native/osx-x86_64/pom.xml 
b/scala-package/init-native/osx-x86_64/pom.xml
index 846116e..97193f4 100644
--- a/scala-package/init-native/osx-x86_64/pom.xml
+++ b/scala-package/init-native/osx-x86_64/pom.xml
@@ -4,7 +4,7 @@
   
 org.apache.mxnet
 mxnet-scala-init-native-parent
-1.2.0
+1.2.1-SNAPSHOT
 ../pom.xml
   
 
@@ -18,7 +18,7 @@
 
   org.apache.mxnet
   mxnet-init_${scala.binary.version}
-  1.2.0
+  1.2.1-SNAPSHOT
   jar
   compile
 
diff --git a/scala-package/init-native/pom.xml 
b/scala-package/init-native/pom.xml
index 3a56d58..55c724d 100644
--- a/scala-package/init-native/pom.xml
+++ b/scala-package/init-native/pom.xml
@@ -4,7 +4,7 @@
   
 org.apache.mxnet
 mxnet-parent_2.11
-1.2.0
+1.2.1-SNAPSHOT
 ../pom.xml
   
 
diff --git a/scala-package/init/pom.xml b/scala-package/init/pom.xml
index 35ab07d..198bd89 100644
--- a/scala-package/init/pom.xml
+++ 

[GitHub] lanking520 commented on issue #10787: [MXNET-357] New Scala API Design (NDArray)

2018-05-22 Thread GitBox
lanking520 commented on issue #10787: [MXNET-357] New Scala API Design (NDArray)
URL: https://github.com/apache/incubator-mxnet/pull/10787#issuecomment-391181697
 
 
   Add 1 line change in Sync with this PR: #11021 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[incubator-mxnet] 01/05: Prepare Scala package for publishing to Maven

2018-05-22 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch v1.2.0
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git

commit d6524c57546d14469937a0391917a9eaec1b0dbd
Author: Naveen Swamy 
AuthorDate: Thu May 17 04:27:10 2018 -0700

Prepare Scala package for publishing to Maven

Update Organization/description

add infer package to full package

don't publish to maven any package except the full platform dependent 
packages
---
 Makefile|  8 -
 scala-package/assembly/linux-x86_64-cpu/pom.xml |  5 
 scala-package/assembly/linux-x86_64-gpu/pom.xml |  5 
 scala-package/assembly/osx-x86_64-cpu/pom.xml   |  5 
 scala-package/assembly/pom.xml  |  7 +
 scala-package/core/pom.xml  |  7 +
 scala-package/examples/pom.xml  |  7 +
 scala-package/infer/pom.xml |  7 +
 scala-package/init-native/linux-x86_64/pom.xml  |  7 +
 scala-package/init-native/osx-x86_64/pom.xml|  7 +
 scala-package/init-native/pom.xml   | 13 +
 scala-package/init/pom.xml  | 13 +
 scala-package/macros/pom.xml| 12 
 scala-package/native/linux-x86_64-cpu/pom.xml   |  7 +
 scala-package/native/linux-x86_64-gpu/pom.xml   |  7 +
 scala-package/native/osx-x86_64-cpu/pom.xml |  7 +
 scala-package/native/pom.xml| 13 +
 scala-package/pom.xml   | 39 ++---
 scala-package/spark/pom.xml | 12 
 19 files changed, 164 insertions(+), 24 deletions(-)

diff --git a/Makefile b/Makefile
index 951b29b..46cbccf 100644
--- a/Makefile
+++ b/Makefile
@@ -592,9 +592,15 @@ scalainstall:
-Dcflags="$(CFLAGS)" -Dldflags="$(LDFLAGS)" \
-Dlddeps="$(LIB_DEP) $(ROOTDIR)/lib/libmxnet.a")
 
+scalarelease:
+   (cd $(ROOTDIR)/scala-package; \
+   mvn release:clean release:prepare -DautoVersionSubmodules=true \
+   -Papache-release,$(SCALA_PKG_PROFILE),$(SCALA_VERSION_PROFILE) \
+   -Darguments=""-DskipTests\ -Dcflags=\""$(CFLAGS)\""\ 
-Dcxx=\""$(CXX)\""\ -Dldflags=\""$(LDFLAGS)\""\ -Dlddeps=\""$(LIB_DEP) 
$(ROOTDIR)/lib/libmxnet.a\)
+
 scaladeploy:
(cd $(ROOTDIR)/scala-package; \
-   mvn deploy 
-Prelease,$(SCALA_PKG_PROFILE),$(SCALA_VERSION_PROFILE) -DskipTests 
-Dcxx="$(CXX)" \
+   mvn deploy 
-Papache-release,$(SCALA_PKG_PROFILE),$(SCALA_VERSION_PROFILE) \-DskipTests 
-Dcxx="$(CXX)" \
-Dcflags="$(CFLAGS)" -Dldflags="$(LDFLAGS)" \
-Dlddeps="$(LIB_DEP) $(ROOTDIR)/lib/libmxnet.a")
 
diff --git a/scala-package/assembly/linux-x86_64-cpu/pom.xml 
b/scala-package/assembly/linux-x86_64-cpu/pom.xml
index aeabd4f..9db9444 100644
--- a/scala-package/assembly/linux-x86_64-cpu/pom.xml
+++ b/scala-package/assembly/linux-x86_64-cpu/pom.xml
@@ -26,6 +26,11 @@
   1.2.0-SNAPSHOT
   so
 
+
+  org.apache.mxnet
+  mxnet-infer_${scala.binary.version}
+  1.2.0-SNAPSHOT
+
   
 
   
diff --git a/scala-package/assembly/linux-x86_64-gpu/pom.xml 
b/scala-package/assembly/linux-x86_64-gpu/pom.xml
index a9bb6e5..c8e0f13 100644
--- a/scala-package/assembly/linux-x86_64-gpu/pom.xml
+++ b/scala-package/assembly/linux-x86_64-gpu/pom.xml
@@ -26,6 +26,11 @@
   1.2.0-SNAPSHOT
   so
 
+
+  org.apache.mxnet
+  mxnet-infer_${scala.binary.version}
+  1.2.0-SNAPSHOT
+
   
 
   
diff --git a/scala-package/assembly/osx-x86_64-cpu/pom.xml 
b/scala-package/assembly/osx-x86_64-cpu/pom.xml
index b06a7c6..4c9c4c5 100644
--- a/scala-package/assembly/osx-x86_64-cpu/pom.xml
+++ b/scala-package/assembly/osx-x86_64-cpu/pom.xml
@@ -26,6 +26,11 @@
   1.2.0-SNAPSHOT
   jnilib
 
+
+  org.apache.mxnet
+  mxnet-infer_${scala.binary.version}
+  1.2.0-SNAPSHOT
+
   
 
   
diff --git a/scala-package/assembly/pom.xml b/scala-package/assembly/pom.xml
index bc8a5c0..191310a 100644
--- a/scala-package/assembly/pom.xml
+++ b/scala-package/assembly/pom.xml
@@ -39,6 +39,13 @@
 
   
 org.apache.maven.plugins
+maven-deploy-plugin
+
+  true
+
+  
+  
+org.apache.maven.plugins
 maven-source-plugin
 
   
diff --git a/scala-package/core/pom.xml b/scala-package/core/pom.xml
index 63cebb7..e6a3daa 100644
--- a/scala-package/core/pom.xml
+++ b/scala-package/core/pom.xml
@@ -38,6 +38,13 @@
 
   
 org.apache.maven.plugins
+maven-deploy-plugin
+
+  true
+
+  
+  
+org.apache.maven.plugins
 maven-jar-plugin
 
   
diff --git a/scala-package/examples/

[incubator-mxnet] 05/05: [maven-release-plugin] prepare release mxnet-parent_2.11-1.2.0

2018-05-22 Thread nswamy

nswamy pushed a commit to branch v1.2.0
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git

commit d208391afbabd9443c556b053cccb26a288ab681
Author: Naveen Swamy 
AuthorDate: Tue May 22 17:20:37 2018 -0700

[maven-release-plugin] prepare release mxnet-parent_2.11-1.2.0
---
 scala-package/assembly/osx-x86_64-cpu/pom.xml | 6 +++---
 scala-package/assembly/pom.xml| 2 +-
 scala-package/core/pom.xml| 6 +++---
 scala-package/examples/pom.xml| 6 +++---
 scala-package/infer/pom.xml   | 4 ++--
 scala-package/init-native/osx-x86_64/pom.xml  | 4 ++--
 scala-package/init-native/pom.xml | 2 +-
 scala-package/init/pom.xml| 2 +-
 scala-package/macros/pom.xml  | 6 +++---
 scala-package/native/osx-x86_64-cpu/pom.xml   | 4 ++--
 scala-package/native/pom.xml  | 2 +-
 scala-package/pom.xml | 3 ++-
 scala-package/spark/pom.xml   | 4 ++--
 13 files changed, 26 insertions(+), 25 deletions(-)

diff --git a/scala-package/assembly/osx-x86_64-cpu/pom.xml 
b/scala-package/assembly/osx-x86_64-cpu/pom.xml
index 1553c98..f02e431 100644
--- a/scala-package/assembly/osx-x86_64-cpu/pom.xml
+++ b/scala-package/assembly/osx-x86_64-cpu/pom.xml
@@ -4,7 +4,7 @@
   
 org.apache.mxnet
 mxnet-full-parent_2.11
-1.2.0-SNAPSHOT
+1.2.0
 ../pom.xml
   
 
@@ -16,7 +16,7 @@
 
   org.apache.mxnet
   mxnet-core_${scala.binary.version}
-  1.2.0-SNAPSHOT
+  1.2.0
 
 
   org.apache.mxnet
@@ -27,7 +27,7 @@
 
   org.apache.mxnet
   mxnet-infer_${scala.binary.version}
-  1.2.0-SNAPSHOT
+  1.2.0
 
   
 
diff --git a/scala-package/assembly/pom.xml b/scala-package/assembly/pom.xml
index 2c2a4fc..8d2df25 100644
--- a/scala-package/assembly/pom.xml
+++ b/scala-package/assembly/pom.xml
@@ -4,7 +4,7 @@
   
 org.apache.mxnet
 mxnet-parent_2.11
-1.2.0-SNAPSHOT
+1.2.0
 ../pom.xml
   
 
diff --git a/scala-package/core/pom.xml b/scala-package/core/pom.xml
index 509e732..711a5fb 100644
--- a/scala-package/core/pom.xml
+++ b/scala-package/core/pom.xml
@@ -4,7 +4,7 @@
   
 org.apache.mxnet
 mxnet-parent_2.11
-1.2.0-SNAPSHOT
+1.2.0
 ../pom.xml
   
 
@@ -76,13 +76,13 @@
 
   org.apache.mxnet
   mxnet-init_${scala.binary.version}
-  1.2.0-SNAPSHOT
+  1.2.0
   provided
 
 
   org.apache.mxnet
   mxnet-macros_${scala.binary.version}
-  1.2.0-SNAPSHOT
+  1.2.0
   provided
 
   
diff --git a/scala-package/examples/pom.xml b/scala-package/examples/pom.xml
index 1d187ca..721f882 100644
--- a/scala-package/examples/pom.xml
+++ b/scala-package/examples/pom.xml
@@ -4,7 +4,7 @@
   
 org.apache.mxnet
 mxnet-parent_2.11
-1.2.0-SNAPSHOT
+1.2.0
 ../pom.xml
   
 
@@ -150,13 +150,13 @@
 
   org.apache.mxnet
   mxnet-core_${scala.binary.version}
-  1.2.0-SNAPSHOT
+  1.2.0
   provided
 
 
   org.apache.mxnet
   mxnet-infer_${scala.binary.version}
-  1.2.0-SNAPSHOT
+  1.2.0
   provided
 
 
diff --git a/scala-package/infer/pom.xml b/scala-package/infer/pom.xml
index 929a45e..56a3507 100644
--- a/scala-package/infer/pom.xml
+++ b/scala-package/infer/pom.xml
@@ -4,7 +4,7 @@
 
 mxnet-parent_2.11
 org.apache.mxnet
-1.2.0-SNAPSHOT
+1.2.0
 ../pom.xml
 
 
@@ -76,7 +76,7 @@
 
 org.apache.mxnet
 mxnet-core_${scala.binary.version}
-1.2.0-SNAPSHOT
+1.2.0
 provided
 
 
diff --git a/scala-package/init-native/osx-x86_64/pom.xml 
b/scala-package/init-native/osx-x86_64/pom.xml
index dd09a36..846116e 100644
--- a/scala-package/init-native/osx-x86_64/pom.xml
+++ b/scala-package/init-native/osx-x86_64/pom.xml
@@ -4,7 +4,7 @@
   
 org.apache.mxnet
 mxnet-scala-init-native-parent
-1.2.0-SNAPSHOT
+1.2.0
 ../pom.xml
   
 
@@ -18,7 +18,7 @@
 
   org.apache.mxnet
   mxnet-init_${scala.binary.version}
-  1.2.0-SNAPSHOT
+  1.2.0
   jar
   compile
 
diff --git a/scala-package/init-native/pom.xml 
b/scala-package/init-native/pom.xml
index e7834a0..3a56d58 100644
--- a/scala-package/init-native/pom.xml
+++ b/scala-package/init-native/pom.xml
@@ -4,7 +4,7 @@
   
 org.apache.mxnet
 mxnet-parent_2.11
-1.2.0-SNAPSHOT
+1.2.0
 ../pom.xml
   
 
diff --git a/scala-package/init/pom.xml b/scala-package/init/pom.xml
index df24618..35ab07d 100644
--- a/scala-package/init/pom.xml
+++ b/scala-package/init/pom.xml
@@ -4,7 +4,7 @@
   
 org.apache.mxnet
 mxnet-parent_2.11
-1.2.0-SNAPSHOT
+1.2.0
 
   
 
diff --git a/scala-package/macros/pom.xml b/scala-package/macros/pom.xml
index 1b33561..e692d95 100644
--- a/scala-package/m

[incubator-mxnet] 03/05: remove sontype config from main pom

2018-05-22 Thread nswamy

nswamy pushed a commit to branch v1.2.0
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git

commit e12b24a6e81392627681e3bf5c031ed37fb2281e
Author: Naveen Swamy 
AuthorDate: Mon May 21 23:24:40 2018 -0700

remove sontype config from main pom
---
 Makefile  |  1 +
 scala-package/pom.xml | 23 ---
 2 files changed, 1 insertion(+), 23 deletions(-)

diff --git a/Makefile b/Makefile
index 98e1756..9004dbe 100644
--- a/Makefile
+++ b/Makefile
@@ -597,6 +597,7 @@ scalarelease-dryrun:
mvn release:clean release:prepare -DdryRun=true 
-DautoVersionSubmodules=true \
-Papache-release,$(SCALA_PKG_PROFILE),$(SCALA_VERSION_PROFILE) \
-Darguments=""-DskipTests\ -Dcflags=\""$(CFLAGS)\""\ 
-Dcxx=\""$(CXX)\""\ -Dldflags=\""$(LDFLAGS)\""\ -Dlddeps=\""$(LIB_DEP) 
$(ROOTDIR)/lib/libmxnet.a\)
+
 scalarelease:
(cd $(ROOTDIR)/scala-package; \
mvn release:clean release:prepare -DautoVersionSubmodules=true \
diff --git a/scala-package/pom.xml b/scala-package/pom.xml
index 2243fb3..653196d 100644
--- a/scala-package/pom.xml
+++ b/scala-package/pom.xml
@@ -68,29 +68,6 @@
   
 
   
-  
-org.apache.maven.plugins
-maven-gpg-plugin
-
-  
-sign-artifacts
-verify
-
-  sign
-
-  
-
-  
-  
-org.sonatype.plugins
-nexus-staging-maven-plugin
-true
-
-  ossrh
-  https://oss.sonatype.org/
-  true
-
-  
 
   
 

-- 
To stop receiving notification emails like this one, please contact
nsw...@apache.org.


[incubator-mxnet] 02/05: add scalarelease-dryrun target to Makefile

2018-05-22 Thread nswamy

nswamy pushed a commit to branch v1.2.0
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git

commit 6ee198b23d9fcffd90e3acd213df05d5d4f82225
Author: Naveen Swamy 
AuthorDate: Mon May 21 23:12:27 2018 -0700

add scalarelease-dryrun target to Makefile
---
 Makefile | 5 +
 1 file changed, 5 insertions(+)

diff --git a/Makefile b/Makefile
index 46cbccf..98e1756 100644
--- a/Makefile
+++ b/Makefile
@@ -592,6 +592,11 @@ scalainstall:
-Dcflags="$(CFLAGS)" -Dldflags="$(LDFLAGS)" \
-Dlddeps="$(LIB_DEP) $(ROOTDIR)/lib/libmxnet.a")
 
+scalarelease-dryrun:
+   (cd $(ROOTDIR)/scala-package; \
+   mvn release:clean release:prepare -DdryRun=true 
-DautoVersionSubmodules=true \
+   -Papache-release,$(SCALA_PKG_PROFILE),$(SCALA_VERSION_PROFILE) \
+   -Darguments=""-DskipTests\ -Dcflags=\""$(CFLAGS)\""\ 
-Dcxx=\""$(CXX)\""\ -Dldflags=\""$(LDFLAGS)\""\ -Dlddeps=\""$(LIB_DEP) 
$(ROOTDIR)/lib/libmxnet.a\)
 scalarelease:
(cd $(ROOTDIR)/scala-package; \
mvn release:clean release:prepare -DautoVersionSubmodules=true \

-- 
To stop receiving notification emails like this one, please contact
nsw...@apache.org.


[incubator-mxnet] branch v1.2.0 updated (297c64f -> d208391)

2018-05-22 Thread nswamy

nswamy pushed a change to branch v1.2.0
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 297c64f  add apache-release as parent to pom file (#10941)
 new d6524c5  Prepare Scala package for publishing to Maven
 new 6ee198b  add scalarelease-dryrun target to Makefile
 new e12b24a  remove sontype config from main pom
 new 54abb5d  inherited=false in maven-deploy-plugin
 new d208391  [maven-release-plugin] prepare release mxnet-parent_2.11-1.2.0

The 5 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 Makefile| 14 +-
 scala-package/assembly/linux-x86_64-cpu/pom.xml |  5 ++
 scala-package/assembly/linux-x86_64-gpu/pom.xml |  5 ++
 scala-package/assembly/osx-x86_64-cpu/pom.xml   | 15 +++---
 scala-package/assembly/pom.xml  | 15 --
 scala-package/core/pom.xml  | 17 ---
 scala-package/examples/pom.xml  | 17 ---
 scala-package/infer/pom.xml | 15 --
 scala-package/init-native/linux-x86_64/pom.xml  |  7 +++
 scala-package/init-native/osx-x86_64/pom.xml| 15 --
 scala-package/init-native/pom.xml   | 19 ++--
 scala-package/init/pom.xml  | 19 ++--
 scala-package/macros/pom.xml| 22 ++---
 scala-package/native/linux-x86_64-cpu/pom.xml   |  7 +++
 scala-package/native/linux-x86_64-gpu/pom.xml   |  7 +++
 scala-package/native/osx-x86_64-cpu/pom.xml | 15 --
 scala-package/native/pom.xml| 19 ++--
 scala-package/pom.xml   | 65 -
 scala-package/spark/pom.xml | 20 ++--
 19 files changed, 211 insertions(+), 107 deletions(-)



[incubator-mxnet] 04/05: inherited=false in maven-deploy-plugin

2018-05-22 Thread nswamy

nswamy pushed a commit to branch v1.2.0
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git

commit 54abb5d7103056496a85bef71f0dc661b201cfd8
Author: Naveen Swamy 
AuthorDate: Mon May 21 23:35:46 2018 -0700

inherited=false in maven-deploy-plugin
---
 scala-package/assembly/osx-x86_64-cpu/pom.xml |  6 ++
 scala-package/assembly/pom.xml|  6 ++
 scala-package/core/pom.xml|  4 +---
 scala-package/examples/pom.xml|  4 +---
 scala-package/infer/pom.xml   |  4 +---
 scala-package/init-native/osx-x86_64/pom.xml  |  4 +---
 scala-package/init-native/pom.xml |  4 +---
 scala-package/init/pom.xml|  4 +---
 scala-package/macros/pom.xml  |  4 +---
 scala-package/native/osx-x86_64-cpu/pom.xml   |  4 +---
 scala-package/native/pom.xml  |  4 +---
 scala-package/pom.xml | 10 +++---
 scala-package/spark/pom.xml   |  4 +---
 13 files changed, 21 insertions(+), 41 deletions(-)

diff --git a/scala-package/assembly/osx-x86_64-cpu/pom.xml 
b/scala-package/assembly/osx-x86_64-cpu/pom.xml
index 4c9c4c5..1553c98 100644
--- a/scala-package/assembly/osx-x86_64-cpu/pom.xml
+++ b/scala-package/assembly/osx-x86_64-cpu/pom.xml
@@ -1,7 +1,5 @@
 
-http://maven.apache.org/POM/4.0.0";
- xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance";
- xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
+http://maven.apache.org/POM/4.0.0"; 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"; 
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
   4.0.0
   
 org.apache.mxnet
@@ -23,7 +21,7 @@
 
   org.apache.mxnet
   libmxnet-scala-osx-x86_64-cpu
-  1.2.0-SNAPSHOT
+  1.2.0
   jnilib
 
 
diff --git a/scala-package/assembly/pom.xml b/scala-package/assembly/pom.xml
index 191310a..2c2a4fc 100644
--- a/scala-package/assembly/pom.xml
+++ b/scala-package/assembly/pom.xml
@@ -1,7 +1,5 @@
 
-http://maven.apache.org/POM/4.0.0";
- xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance";
- xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
+http://maven.apache.org/POM/4.0.0"; 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"; 
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
   4.0.0
   
 org.apache.mxnet
@@ -54,7 +52,7 @@
   jar-no-fork
 
 
-  true>
+  true
 
   
 
diff --git a/scala-package/core/pom.xml b/scala-package/core/pom.xml
index e6a3daa..509e732 100644
--- a/scala-package/core/pom.xml
+++ b/scala-package/core/pom.xml
@@ -1,7 +1,5 @@
 
-http://maven.apache.org/POM/4.0.0";
- xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance";
- xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
+http://maven.apache.org/POM/4.0.0"; 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"; 
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
   4.0.0
   
 org.apache.mxnet
diff --git a/scala-package/examples/pom.xml b/scala-package/examples/pom.xml
index 76229ef..1d187ca 100644
--- a/scala-package/examples/pom.xml
+++ b/scala-package/examples/pom.xml
@@ -1,7 +1,5 @@
 
-http://maven.apache.org/POM/4.0.0";
- xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance";
- xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
+http://maven.apache.org/POM/4.0.0"; 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"; 
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
   4.0.0
   
 org.apache.mxnet
diff --git a/scala-package/infer/pom.xml b/scala-package/infer/pom.xml
index bc18ca1..929a45e 100644
--- a/scala-package/infer/pom.xml
+++ b/scala-package/infer/pom.xml
@@ -1,7 +1,5 @@
 
-http://maven.apache.org/POM/4.0.0";
- xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance";
- xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
+http://maven.apache.org/POM/4.0.0"; 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"; 
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
 4.0.0
 
 mxnet-parent_2.11
diff --git a/scala-package/init-native/osx-x86_64/pom.xml 
b/scala-package/init-native/osx-x86_64/pom.xml
index 85b1e4e..dd09a36 100644
--- a/scala-package/init-native/osx-x86_64/pom.xml
+++ b/scala-package/init-native/osx-x86_64/pom.xml
@@ -1,7 +1,5 @@
 
-http:/

[GitHub] lanking520 commented on issue #10991: [MXNET-386] API Docs Generation

2018-05-22 Thread GitBox
lanking520 commented on issue #10991: [MXNET-386] API Docs Generation
URL: https://github.com/apache/incubator-mxnet/pull/10991#issuecomment-391181183
 
 
   Add 1 line change in Sync with this PR: 
https://github.com/apache/incubator-mxnet/pull/11021
   




[GitHub] eric-haibin-lin commented on issue #10589: req doesn't work for pretrained models in Gluon

2018-05-22 Thread GitBox
eric-haibin-lin commented on issue #10589: req doesn't work for pretrained 
models in Gluon
URL: 
https://github.com/apache/incubator-mxnet/issues/10589#issuecomment-391179209
 
 
   Fixed now on master




[GitHub] szha commented on issue #10589: req doesn't work for pretrained models in Gluon

2018-05-22 Thread GitBox
szha commented on issue #10589: req doesn't work for pretrained models in Gluon
URL: 
https://github.com/apache/incubator-mxnet/issues/10589#issuecomment-391179228
 
 
   This has already been fixed thanks to @eric-haibin-lin 




[GitHub] eric-haibin-lin closed issue #10589: req doesn't work for pretrained models in Gluon

2018-05-22 Thread GitBox
eric-haibin-lin closed issue #10589: req doesn't work for pretrained models in 
Gluon
URL: https://github.com/apache/incubator-mxnet/issues/10589
 
 
   




[GitHub] szha commented on issue #10609: Gluon code fails in the normal mode but succeeds in the hybrid mode.

2018-05-22 Thread GitBox
szha commented on issue #10609: Gluon code fails in the normal mode but 
succeeds in the hybrid mode.
URL: 
https://github.com/apache/incubator-mxnet/issues/10609#issuecomment-391177297
 
 
   ```python
   from mxnet import gluon
   import mxnet as mx
   gru = gluon.rnn.GRUCell(5)
   gru.initialize(ctx=mx.cpu(0))
   data =  mx.nd.random.uniform(shape=(2, 3))
   state = [mx.nd.random.uniform(shape=(2, 5))]
   out = gru(data, state)
   ```
   `state` is supposed to be a list. The above code works




[GitHub] szha closed issue #10609: Gluon code fails in the normal mode but succeeds in the hybrid mode.

2018-05-22 Thread GitBox
szha closed issue #10609: Gluon code fails in the normal mode but succeeds in 
the hybrid mode.
URL: https://github.com/apache/incubator-mxnet/issues/10609
 
 
   




[GitHub] zhanghang1989 commented on issue #10799: Feature Request for SoftmaxCrossEntropyLoss with Ignore labels

2018-05-22 Thread GitBox
zhanghang1989 commented on issue #10799: Feature Request for 
SoftmaxCrossEntropyLoss with Ignore labels
URL: 
https://github.com/apache/incubator-mxnet/issues/10799#issuecomment-391176538
 
 
   Agree. This is just a PoC for feature request :)




[GitHub] szha commented on issue #10799: Feature Request for SoftmaxCrossEntropyLoss with Ignore labels

2018-05-22 Thread GitBox
szha commented on issue #10799: Feature Request for SoftmaxCrossEntropyLoss 
with Ignore labels
URL: 
https://github.com/apache/incubator-mxnet/issues/10799#issuecomment-391176241
 
 
   @zhanghang1989 this does not work because valid_label_map.size is used. 
`hybrid_forward` doesn't support using shape information if the block is to be 
hybridized.
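
A toy, pure-Python illustration of the limitation described above (not MXNet code; the class names `EagerArray` and `Symbol` are invented stand-ins for this sketch): shape-dependent logic such as `.size` works when operating on concrete arrays imperatively, but breaks once inputs become deferred symbolic placeholders, which is what hybridization does.

```python
class EagerArray:
    """Stand-in for an NDArray: holds concrete data, so .size is known."""
    def __init__(self, data):
        self.data = data

    @property
    def size(self):
        return len(self.data)


class Symbol:
    """Stand-in for a deferred symbolic input: no concrete shape yet."""
    # Deliberately has no .size attribute.
    pass


def loss_with_shape_info(x):
    # Mirrors using valid_label_map.size inside hybrid_forward:
    # fine on concrete data, impossible once x is symbolic.
    return x.size * 2


print(loss_with_shape_info(EagerArray([1, 2, 3])))  # imperative mode: works
try:
    loss_with_shape_info(Symbol())                  # "hybridized" mode: fails
except AttributeError as err:
    print("symbolic input has no shape info:", err)
```

This is why shape-dependent control flow has to be avoided (or precomputed outside `hybrid_forward`) in blocks that are meant to be hybridized.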




[GitHub] szha commented on issue #10910: Feature Request: loading gluon trainer states with different contexts from saved states

2018-05-22 Thread GitBox
szha commented on issue #10910: Feature Request: loading gluon trainer states 
with different contexts from saved states
URL: 
https://github.com/apache/incubator-mxnet/issues/10910#issuecomment-391176004
 
 
There's a copy of the state in each of the contexts, and the number of saved 
copies might differ from the number of contexts provided by the user. In the 
case you describe, the only solution is to throw away some states.
   
   @piiswrong what do you suggest we provide that would allow override of 
number of contexts for optimizer states?




[GitHub] rahul003 commented on a change in pull request #10391: [MXNET-139] Tutorial for mixed precision training with float16

2018-05-22 Thread GitBox
rahul003 commented on a change in pull request #10391: [MXNET-139] Tutorial for 
mixed precision training with float16
URL: https://github.com/apache/incubator-mxnet/pull/10391#discussion_r190086295
 
 

 ##
 File path: docs/tutorials/python/float16.md
 ##
 @@ -0,0 +1,280 @@
+# Mixed precision training using float16
+
+The computational resources required for training deep neural networks have 
been increasing of late because of the complexity of the architectures and the 
size of models. Mixed precision training allows us to reduce the resources 
required by using lower precision arithmetic. In this approach we train using 
16-bit floating point (half precision) while using 32-bit floating point 
(single precision) for output buffers of float16 computation. This combination 
of single and half precision gives rise to the name mixed precision. It allows 
us to achieve the same accuracy as training with single precision, while 
decreasing the required memory and the training or inference time.
+
+The float16 data type is a 16-bit floating point representation according to 
the IEEE 754 standard. It has a dynamic range where the precision can go from 
0.000596046 (highest, for values closest to 0) to 32 (lowest, for values in 
the range 32768-65536). Despite the decreased precision when compared to single 
precision (float32), float16 computation can be much faster on supported 
hardware. The motivation for using float16 for deep learning comes from the 
idea that deep neural network architectures have natural resilience to errors 
due to backpropagation. Half precision is typically sufficient for training 
neural networks. This means that on hardware with specialized support for 
float16 computation we can greatly improve the speed of training and inference. 
This speedup results from faster matrix multiplication, savings on memory 
bandwidth, and reduced communication costs. It also reduces the size of the 
model, allowing us to train larger models and use larger batch sizes. 
+
+The Volta range of Graphics Processing Units (GPUs) from Nvidia have Tensor 
Cores which perform efficient float16 computation. A tensor core allows 
accumulation of half precision products into single or half precision outputs. 
For the rest of this tutorial we assume that we are working with Nvidia's 
Tensor Cores on a Volta GPU.
+
+In this tutorial we will walk through how one can train deep learning neural 
networks with mixed precision on supported hardware. We will first see how to 
use float16 and then some techniques on achieving good performance and accuracy.
+
+## Prerequisites
+
+- Volta range of Nvidia GPUs
+- Cuda 9 or higher
+- CUDNN v7 or higher
+
+## Using the Gluon API
+
+With Gluon, we need to take care of two things to convert a model to support 
float16.
+1. Cast the Gluon Block to float16, which casts the parameters of its layers 
and changes the type of input it expects.
+2. If necessary, cast the data to float16 to match the input type expected by 
the blocks.
+
+### Training
+Let us look at an example of training a Resnet50 model on the Caltech101 
dataset with float16. 
+First, let us get the imports out of the way.
+
+
+```python
+import os
+import tarfile
+import multiprocessing
+import time
+import numpy as np
+import mxnet as mx
+from mxnet import nd, autograd, gluon
+from mxnet.gluon.model_zoo import vision as models
+from mxnet.metric import Accuracy
+from mxnet.gluon.data.vision.datasets import ImageFolderDataset
+```
+
+Let us start by fetching the Caltech101 dataset and extracting it. 
+
+
+```python
+url = "https://s3.us-east-2.amazonaws.com/mxnet-public/101_ObjectCategories.tar.gz"
+dataset_name = "101_ObjectCategories"
+data_folder = "data"
+if not os.path.isdir(data_folder):
+    os.makedirs(data_folder)
+tar_path = mx.gluon.utils.download(url, path='data')
+if (not os.path.isdir(os.path.join(data_folder, "101_ObjectCategories")) or
+        not os.path.isdir(os.path.join(data_folder, "101_ObjectCategories_test"))):
+    tar = tarfile.open(tar_path, "r:gz")
+    tar.extractall(data_folder)
+    tar.close()
+    print('Data extracted')
+training_path = os.path.join(data_folder, dataset_name)
+testing_path = os.path.join(data_folder, "{}_test".format(dataset_name))
+
+Now we have the images in two folders, one for training and the other for 
testing. Let us next create a Gluon Dataset from each of these folders, and 
then a Gluon DataLoader from each dataset. Let us also define a transform 
function so that each image loaded is resized, cropped and transposed. 
+
+
+```python
+EDGE = 224
+SIZE = (EDGE, EDGE)
+NUM_WORKERS = multiprocessing.cpu_count()
+# Lower batch size if you run out of memory on your GPU
+BATCH_SIZE = 64
+
+def transform(image, label):
+    resized = mx.image.resize_short(image, EDGE)
+    cropped, crop_info = mx.image.center_crop(resized, SIZE)
+    transposed = nd.transpose(cropped, (2,0,1))
+    return transposed, label
+
+dataset_train = ImageFolderDataset(root=training_path, transform=transform)
+```

[GitHub] szha commented on issue #10990: Gluon hybridize fails to detect an input

2018-05-22 Thread GitBox
szha commented on issue #10990: Gluon hybridize fails to detect an input
URL: 
https://github.com/apache/incubator-mxnet/issues/10990#issuecomment-391173857
 
 
   @piiswrong it would be great if we could have a way to explicitly warn about 
this though


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] szha commented on issue #10990: Gluon hybridize fails to detect an input

2018-05-22 Thread GitBox
szha commented on issue #10990: Gluon hybridize fails to detect an input
URL: 
https://github.com/apache/incubator-mxnet/issues/10990#issuecomment-391173401
 
 
   ```python
   # split into set lists. each list is of length max_set_size
   split_item_emb = F.split(item_emb, axis=0, num_outputs=len(set_sizes))
   
   # stack the lists to get a set dimension. need to sum each set separately
   if len(split_item_emb) > 1:
 batch_item_emb = F.concat(*split_item_emb, dim=0)
   else:
 batch_item_emb = split_item_emb
   ```
   seems to be equivalent to
   ```python
   batch_item_emb = item_emb
   ```
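   The equivalence szha points out can be checked with a quick numpy analogue 
(`np.split`/`np.concatenate` standing in for `F.split`/`F.concat`):

```python
import numpy as np

x = np.arange(12).reshape(4, 3)
parts = np.split(x, 4, axis=0)      # analogous to F.split(..., axis=0)
y = np.concatenate(parts, axis=0)   # analogous to F.concat(*parts, dim=0)
assert np.array_equal(x, y)         # split + concat along axis 0 is identity
```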




[GitHub] zhanghang1989 commented on issue #10852: [MXNET-411] Add ROI Align

2018-05-22 Thread GitBox
zhanghang1989 commented on issue #10852: [MXNET-411] Add ROI Align
URL: https://github.com/apache/incubator-mxnet/pull/10852#issuecomment-391152600
 
 
   I should have addressed most of the reviews. Please let me know if there are 
any further comments. Thanks! Related Gluon-CV PR 
https://github.com/dmlc/gluon-cv/pull/140




[GitHub] szha commented on issue #10990: Gluon hybridize fails to detect an input

2018-05-22 Thread GitBox
szha commented on issue #10990: Gluon hybridize fails to detect an input
URL: 
https://github.com/apache/incubator-mxnet/issues/10990#issuecomment-391172778
 
 
   The problem is in the following line:
   
https://gist.github.com/altosaar/6c29d8ac505ae1cea03ca65f193e7832#file-gluon-hybridize-error-py-L17
   `split_item_emb = F.split(item_emb, axis=0, num_outputs=len(set_sizes))`
   
   `__len__` behaves differently between symbol and ndarray.
   
   In general, `hybrid_forward` doesn't support having shape-specific logic.




[GitHub] piiswrong commented on issue #11025: added ravel/unravel operators

2018-05-22 Thread GitBox
piiswrong commented on issue #11025: added ravel/unravel operators
URL: https://github.com/apache/incubator-mxnet/pull/11025#issuecomment-391172111
 
 
   Could you make sure the behavior is exactly the same as numpy's for the same configuration?
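   For reference, the numpy behavior the new operators would need to match 
looks like this (a quick sketch):

```python
import numpy as np

# ravel_multi_index: map (row, col) pairs to flat indices for shape (3, 4)
flat = np.ravel_multi_index(([1, 2], [0, 3]), (3, 4))
assert flat.tolist() == [4, 11]          # 1*4+0 = 4, 2*4+3 = 11

# unravel_index: the inverse mapping
rows, cols = np.unravel_index([4, 11], (3, 4))
assert rows.tolist() == [1, 2] and cols.tolist() == [0, 3]
```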




[GitHub] haojin2 commented on issue #10973: Flaky test_deconvolution

2018-05-22 Thread GitBox
haojin2 commented on issue #10973: Flaky test_deconvolution
URL: 
https://github.com/apache/incubator-mxnet/issues/10973#issuecomment-391166983
 
 
   Also encountered at: 
http://jenkins.mxnet-ci.amazon-ml.com/blue/organizations/jenkins/incubator-mxnet/detail/PR-11021/5/pipeline




[incubator-mxnet] 01/01: add Builder and varargs which are easy for java to use

2018-05-22 Thread liuyizhi
This is an automated email from the ASF dual-hosted git repository.

liuyizhi pushed a commit to branch v1.2.0-java
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git

commit 9a7719e545918e7acbbe23c5684621f7307c3fca
Author: Yizhi Liu 
AuthorDate: Mon May 21 13:48:31 2018 -0700

add Builder and varargs which are easy for java to use
---
 .../core/src/main/scala/org/apache/mxnet/IO.scala  | 56 +-
 .../src/main/scala/org/apache/mxnet/NDArray.scala  |  3 +-
 .../src/main/scala/org/apache/mxnet/Shape.scala|  4 ++
 .../src/main/scala/org/apache/mxnet/Symbol.scala   |  1 -
 .../scala/org/apache/mxnet/module/BaseModule.scala | 30 
 .../scala/org/apache/mxnet/module/Module.scala | 43 -
 .../main/scala/org/apache/mxnet/NDArrayMacro.scala | 52 ++--
 7 files changed, 138 insertions(+), 51 deletions(-)

diff --git a/scala-package/core/src/main/scala/org/apache/mxnet/IO.scala 
b/scala-package/core/src/main/scala/org/apache/mxnet/IO.scala
index 7a9c1a7..123e2f8 100644
--- a/scala-package/core/src/main/scala/org/apache/mxnet/IO.scala
+++ b/scala-package/core/src/main/scala/org/apache/mxnet/IO.scala
@@ -19,9 +19,10 @@ package org.apache.mxnet
 
 import org.apache.mxnet.Base._
 import org.apache.mxnet.DType.DType
-import org.apache.mxnet.io.{MXDataPack, MXDataIter}
+import org.apache.mxnet.io.{MXDataIter, MXDataPack}
 import org.slf4j.LoggerFactory
 
+import scala.annotation.varargs
 import scala.collection.immutable.ListMap
 import scala.collection.mutable.ListBuffer
 
@@ -140,6 +141,7 @@ class DataBatch(val data: IndexedSeq[NDArray],
 // (must match the order of input data/label)
 private val providedData: ListMap[String, Shape] = null,
 private val providedLabel: ListMap[String, Shape] = null) {
+
   /**
* Dispose its data and labels
* The object shall never be used after it is disposed.
@@ -160,6 +162,58 @@ class DataBatch(val data: IndexedSeq[NDArray],
   def provideLabel: ListMap[String, Shape] = providedLabel
 }
 
+object DataBatch {
+  class Builder() {
+private var data: IndexedSeq[NDArray] = null
+private var label: IndexedSeq[NDArray] = null
+private var index: IndexedSeq[Long] = null
+private var pad: Int = 0
+private var bucketKey: AnyRef = null
+private var providedData: ListMap[String, Shape] = ListMap.empty
+private var providedLabel: ListMap[String, Shape] = ListMap.empty
+
+@varargs def setData(data: NDArray*): Builder = {
+  this.data = data.toIndexedSeq
+  this
+}
+
+@varargs def setLabel(label: NDArray*): Builder = {
+  this.label = label.toIndexedSeq
+  this
+}
+
+@varargs def setIndex(index: Long*): Builder = {
+  this.index = index.toIndexedSeq
+  this
+}
+
+def setPad(pad: Int): Builder = {
+  this.pad = pad
+  this
+}
+
+def setBucketKey(bucketKey: AnyRef): Builder = {
+  this.bucketKey = bucketKey
+  this
+}
+
+def provideData(name: String, shape: Shape): Builder = {
+  providedData = providedData.updated(name, shape)
+  this
+}
+
+def provideLabel(name: String, shape: Shape): Builder = {
+  providedLabel = providedLabel.updated(name, shape)
+  this
+}
+
+def build(): DataBatch = {
+  new DataBatch(data, label, index, pad,
+bucketKey, providedData, providedLabel)
+}
+  }
+}
+
 /**
  * DataIter object in mxnet.
  */
diff --git a/scala-package/core/src/main/scala/org/apache/mxnet/NDArray.scala 
b/scala-package/core/src/main/scala/org/apache/mxnet/NDArray.scala
index 416f2d7..e8c687e 100644
--- a/scala-package/core/src/main/scala/org/apache/mxnet/NDArray.scala
+++ b/scala-package/core/src/main/scala/org/apache/mxnet/NDArray.scala
@@ -48,6 +48,7 @@ object NDArray {
 }
   }
 
+  //  private[mxnet] def genericNDArrayFunctionInvoke(
   /**
* Used by NDArrayMacro.
* Invoke this function by passing in parameters.
@@ -57,7 +58,7 @@ object NDArray {
* @param kwargs Key-value arguments of input scalars
* @return The result NDArrays of result of computation.
*/
-  private[mxnet] def genericNDArrayFunctionInvoke(
+  def genericNDArrayFunctionInvoke(
 funcName: String, args: Seq[Any], kwargs: Map[String, Any] = null): 
NDArrayFuncReturn = {
 val function = functions(funcName)
 val ndArgs = ArrayBuffer.empty[NDArray]
diff --git a/scala-package/core/src/main/scala/org/apache/mxnet/Shape.scala 
b/scala-package/core/src/main/scala/org/apache/mxnet/Shape.scala
index e632ade..6891762 100644
--- a/scala-package/core/src/main/scala/org/apache/mxnet/Shape.scala
+++ b/scala-package/core/src/main/scala/org/apache/mxnet/Shape.scala
@@ -17,6 +17,8 @@
 
 package org.apache.mxnet
 
+import scala.annotation.varargs
+
 /**
  * Shape of [[NDArray]] or other data
  */
@@ -28,6 +30,7 @@ class Shape(dims: Traversable[Int]) extends Serializable {
   }
 
   def apply(dim: Int): Int = shape(dim)

[incubator-mxnet] branch v1.2.0-java updated (1e0d064 -> 9a7719e)

2018-05-22 Thread liuyizhi
This is an automated email from the ASF dual-hosted git repository.

liuyizhi pushed a change to branch v1.2.0-java
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


 discard 1e0d064  add Builder and @varargs which are easy for java to use
 new 9a7719e  add Builder and varargs which are easy for java to use

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (1e0d064)
\
 N -- N -- N   refs/heads/v1.2.0-java (9a7719e)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:

-- 
To stop receiving notification emails like this one, please contact
liuyi...@apache.org.


[GitHub] anirudh2290 commented on a change in pull request #11026: Test/mkl dnn act

2018-05-22 Thread GitBox
anirudh2290 commented on a change in pull request #11026: Test/mkl dnn act
URL: https://github.com/apache/incubator-mxnet/pull/11026#discussion_r190075769
 
 

 ##
 File path: src/operator/quadratics_op-inl.h
 ##
 @@ -0,0 +1,31 @@
+//
 
 Review comment:
   Is this intentional?




[GitHub] azai91 opened a new pull request #11026: Test/mkl dnn act

2018-05-22 Thread GitBox
azai91 opened a new pull request #11026: Test/mkl dnn act
URL: https://github.com/apache/incubator-mxnet/pull/11026
 
 
   ## Description ##
   (Brief description on what this PR is about)
   
   ## Checklist ##
   ### Essentials ###
   Please feel free to remove inapplicable items for your PR.
   - [ ] The PR title starts with [MXNET-$JIRA_ID], where $JIRA_ID refers to 
the relevant [JIRA issue](https://issues.apache.org/jira/projects/MXNET/issues) 
created (except PRs with tiny changes)
   - [ ] Changes are complete (i.e. I finished coding on this PR)
   - [ ] All changes have test coverage:
   - Unit tests are added for small changes to verify correctness (e.g. adding 
a new operator)
   - Nightly tests are added for complicated/long-running ones (e.g. changing 
distributed kvstore)
   - Build tests will be added for build configuration changes (e.g. adding a 
new build option with NCCL)
   - [ ] Code is well-documented: 
   - For user-facing API changes, API doc string has been updated. 
   - For new C++ functions in header files, their functionalities and arguments 
are documented. 
   - For new examples, README.md is added to explain what the example does, 
the source of the dataset, expected performance on test set and reference to 
the original paper if applicable
   - Check the API doc at 
http://mxnet-ci-doc.s3-accelerate.dualstack.amazonaws.com/PR-$PR_ID/$BUILD_ID/index.html
   - [ ] To my best knowledge, examples are either not affected by this 
change, or have been fixed to be compatible with this change
   
   ### Changes ###
   - [ ] Feature1, tests, (and when applicable, API doc)
   - [ ] Feature2, tests, (and when applicable, API doc)
   
   ## Comments ##
   - If this change is a backward incompatible change, why must this change be 
made.
   - Interesting edge cases to note here
   




[GitHub] asmushetzel opened a new pull request #11025: added ravel/unravel operators

2018-05-22 Thread GitBox
asmushetzel opened a new pull request #11025: added ravel/unravel operators
URL: https://github.com/apache/incubator-mxnet/pull/11025
 
 
   ## Description ##
   This resolves #10203 by implementing ravel_multi_index and unravel_index 
operators. The operators perform the same functionality as the corresponding 
operators in numpy.
   This enables writing fully symbolic operations that involve ravel/unravel of 
indices. 
   
   ## Checklist ##
   ### Essentials ###
   Please feel free to remove inapplicable items for your PR.
   - [X] Changes are complete (i.e. I finished coding on this PR)
   - [X] All changes have test coverage:
   - [X] Unit tests are added for small changes to verify correctness (e.g. 
adding a new operator)
   - [X] Code is well-documented: 
   - [X] For user-facing API changes, API doc string has been updated. 
   - [X] To my best knowledge, examples are either not affected by this 
change, or have been fixed to be compatible with this change
   




[incubator-mxnet] branch master updated: Add live object detection from camera device. (#9808)

2018-05-22 Thread zhreshold
This is an automated email from the ASF dual-hosted git repository.

zhreshold pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 85668cc  Add live object detection from camera device. (#9808)
85668cc is described below

commit 85668cc36a40591b9f721d89eb99c69fae6aa295
Author: Pedro Larroy <928489+lar...@users.noreply.github.com>
AuthorDate: Wed May 23 00:08:44 2018 +0200

Add live object detection from camera device. (#9808)
---
 example/ssd/README.md  |  8 
 example/ssd/dataset/cv2Iterator.py | 56 +
 example/ssd/demo.py| 84 +++---
 example/ssd/detect/detector.py | 36 ++--
 example/ssd/init.sh|  3 +-
 5 files changed, 177 insertions(+), 10 deletions(-)

diff --git a/example/ssd/README.md b/example/ssd/README.md
index 0b97092..55387c5 100644
--- a/example/ssd/README.md
+++ b/example/ssd/README.md
@@ -17,6 +17,7 @@ remarkable traits of MXNet.
 Due to the permission issue, this example is maintained in this 
[repository](https://github.com/zhreshold/mxnet-ssd) separately. You can use 
the link regarding specific per example 
[issues](https://github.com/zhreshold/mxnet-ssd/issues).
 
 ### What's new
+* Added live camera capture and detection display (run with --camera flag)
 * Added multiple trained models.
 * Added a much simpler way to compose network from mainstream classification 
networks (resnet, inception...) and [Guide](symbol/README.md).
 * Update to the latest version according to caffe version, with 5% mAP 
increase.
@@ -84,6 +85,13 @@ python demo.py --cpu --network resnet50 --data-shape 512
 ```
 * Check `python demo.py --help` for more options.
 
+### Live Camera detection
+
+Use `init.sh` to download the trained model.
+You can use `./demo.py --camera` to use a video capture device with opencv 
such as a webcam. This
+will open a window that will display the camera output together with the 
detections. You can play
+with the detection threshold to get more or less detections.
+
 ### Train the model
 This example only covers training on Pascal VOC dataset. Other datasets should
 be easily supported by adding subclass derived from class `Imdb` in 
`dataset/imdb.py`.
diff --git a/example/ssd/dataset/cv2Iterator.py 
b/example/ssd/dataset/cv2Iterator.py
new file mode 100644
index 000..469faea
--- /dev/null
+++ b/example/ssd/dataset/cv2Iterator.py
@@ -0,0 +1,56 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import mxnet as mx
+import numpy as np
+import cv2
+
+
+class CameraIterator():
+"""
+An iterator that captures frames with opencv or the specified capture
+"""
+def __init__(self, capture=cv2.VideoCapture(0), frame_resize=None):
+self._capture = capture
+self._frame_resize = frame_resize
+if frame_resize:
+assert isinstance(frame_resize, tuple) and (len(frame_resize) == 2), 
"frame_resize should be a tuple of (x,y)"
+self._frame_shape = (1, 3, frame_resize[0], frame_resize[1])
+else:
+self._frame_shape = (1, 3,
+int(self._capture.get(cv2.CAP_PROP_FRAME_WIDTH)),
+int(self._capture.get(cv2.CAP_PROP_FRAME_HEIGHT)))
+
+def __iter__(self):
+return self
+
+def __next__(self):
+ret, frame = self._capture.read()
+if cv2.waitKey(1) & 0xFF == ord('q') or ret is not True:
+raise StopIteration
+if self._frame_resize:
+frame = cv2.resize(frame, (self._frame_resize[0], 
self._frame_resize[1]))
+return frame
+
+def __enter__(self):
+pass
+
+def __exit__(self, exc_type, exc_value, traceback):
+self.close()
+
+def close(self):
+self._capture.release()
diff --git a/example/ssd/demo.py b/example/ssd/demo.py
index 0480bdd..4ae8b35 100755
--- a/example/ssd/demo.py
+++ b/example/ssd/demo.py
@@ -25,6 +25,9 @@ import os
 import sys
 from detect.detector import Detector
 from symbol.symbol_factory import get_symbol
+from dataset.cv2Iterator import CameraIterator
+import logging
+import cv2
 
 def get_detector(

[GitHub] zhreshold closed pull request #9808: [MXNET-460] Ssd camera demo

2018-05-22 Thread GitBox
zhreshold closed pull request #9808: [MXNET-460] Ssd camera demo
URL: https://github.com/apache/incubator-mxnet/pull/9808
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/example/ssd/README.md b/example/ssd/README.md
index 0b970923e44..55387c5fd2d 100644
--- a/example/ssd/README.md
+++ b/example/ssd/README.md
@@ -17,6 +17,7 @@ remarkable traits of MXNet.
 Due to the permission issue, this example is maintained in this 
[repository](https://github.com/zhreshold/mxnet-ssd) separately. You can use 
the link regarding specific per example 
[issues](https://github.com/zhreshold/mxnet-ssd/issues).
 
 ### What's new
+* Added live camera capture and detection display (run with --camera flag)
 * Added multiple trained models.
 * Added a much simpler way to compose network from mainstream classification 
networks (resnet, inception...) and [Guide](symbol/README.md).
 * Update to the latest version according to caffe version, with 5% mAP 
increase.
@@ -84,6 +85,13 @@ python demo.py --cpu --network resnet50 --data-shape 512
 ```
 * Check `python demo.py --help` for more options.
 
+### Live Camera detection
+
+Use `init.sh` to download the trained model.
+You can use `./demo.py --camera` to use a video capture device with opencv 
such as a webcam. This
+will open a window that will display the camera output together with the 
detections. You can play
+with the detection threshold to get more or less detections.
+
 ### Train the model
 This example only covers training on Pascal VOC dataset. Other datasets should
 be easily supported by adding subclass derived from class `Imdb` in 
`dataset/imdb.py`.
diff --git a/example/ssd/dataset/cv2Iterator.py 
b/example/ssd/dataset/cv2Iterator.py
new file mode 100644
index 000..469faeac828
--- /dev/null
+++ b/example/ssd/dataset/cv2Iterator.py
@@ -0,0 +1,56 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import mxnet as mx
+import numpy as np
+import cv2
+
+
+class CameraIterator():
+"""
+An iterator that captures frames with opencv or the specified capture
+"""
+def __init__(self, capture=cv2.VideoCapture(0), frame_resize=None):
+self._capture = capture
+self._frame_resize = frame_resize
+if frame_resize:
+assert isinstance(frame_resize, tuple) and (len(frame_resize) == 2), 
"frame_resize should be a tuple of (x,y)"
+self._frame_shape = (1, 3, frame_resize[0], frame_resize[1])
+else:
+self._frame_shape = (1, 3,
+int(self._capture.get(cv2.CAP_PROP_FRAME_WIDTH)),
+int(self._capture.get(cv2.CAP_PROP_FRAME_HEIGHT)))
+
+def __iter__(self):
+return self
+
+def __next__(self):
+ret, frame = self._capture.read()
+if cv2.waitKey(1) & 0xFF == ord('q') or ret is not True:
+raise StopIteration
+if self._frame_resize:
+frame = cv2.resize(frame, (self._frame_resize[0], 
self._frame_resize[1]))
+return frame
+
+def __enter__(self):
+pass
+
+def __exit__(self, exc_type, exc_value, traceback):
+self.close()
+
+def close(self):
+self._capture.release()
diff --git a/example/ssd/demo.py b/example/ssd/demo.py
index 0480bdd658b..4ae8b350742 100755
--- a/example/ssd/demo.py
+++ b/example/ssd/demo.py
@@ -25,6 +25,9 @@
 import sys
 from detect.detector import Detector
 from symbol.symbol_factory import get_symbol
+from dataset.cv2Iterator import CameraIterator
+import logging
+import cv2
 
 def get_detector(net, prefix, epoch, data_shape, mean_pixels, ctx, num_class,
  nms_thresh=0.5, force_nms=True, nms_topk=400):
@@ -72,6 +75,8 @@ def parse_args():
 type=str, nargs='?')
 parser.add_argument('--epoch', dest='epoch', help='epoch of trained model',
 default=0, type=int)
+parser.add_argument('--batch-size', dest='batch_size', help='batch size',
+default=1, type=int)
 parser.add_argument('--prefix', dest='pre

[GitHub] szha closed issue #10766: Bug Cannot save/load params with Gluon model

2018-05-22 Thread GitBox
szha closed issue #10766: Bug Cannot save/load params with Gluon model
URL: https://github.com/apache/incubator-mxnet/issues/10766
 
 
   




[GitHub] szha commented on issue #10766: Bug Cannot save/load params with Gluon model

2018-05-22 Thread GitBox
szha commented on issue #10766: Bug Cannot save/load params with Gluon model
URL: 
https://github.com/apache/incubator-mxnet/issues/10766#issuecomment-391155042
 
 
   Should have already been fixed.




[GitHub] zhanghang1989 commented on issue #10852: [MXNET-411] Add ROI Align

2018-05-22 Thread GitBox
zhanghang1989 commented on issue #10852: [MXNET-411] Add ROI Align
URL: https://github.com/apache/incubator-mxnet/pull/10852#issuecomment-391152600
 
 
   I should have addressed the comments. Please let me know if there are any 
further notes. Thanks




[GitHub] zhanghang1989 commented on a change in pull request #10536: [MXNET-317] Add Data Parallel

2018-05-22 Thread GitBox
zhanghang1989 commented on a change in pull request #10536: [MXNET-317] Add 
Data Parallel
URL: https://github.com/apache/incubator-mxnet/pull/10536#discussion_r190064254
 
 

 ##
 File path: python/mxnet/gluon/contrib/parallel.py
 ##
 @@ -0,0 +1,343 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+# pylint: disable=broad-except, redefined-builtin
+"""Synchronized DataParallel"""
+import threading
+from ... import autograd
+from ...ndarray import NDArray
+from ..utils import split_and_load
+
+__all__ = ['DataParallelModel', 'DataParallelCriterion', 'Barrier']
+
+
+class Barrier(object):
+"""Shared NDArray for cross device operation.
+
+A cross device operation that allows synchronized push and pull. It can be 
used in
+Cross-GPU Synchronized Batch Normalization and Sparse Blocks.
+
+Parameters
+--
+counter : int
+Number of devices.
+operation : callable
+The cross device operation to apply (e.g. AllReduce).
+"""
+def __init__(self, counter, operation):
+self.mutex = threading.Lock()
+self.all_tasks_done = threading.Condition(self.mutex)
+self.counter = counter
+self.op = operation
+self._clear()
+
+def push(self, x):
+"""Push an NDArray from one of the devices.
+Input:
+x (NDArray)
+
+Output:
+idx (int), the output index
+"""
+with self.mutex:
+if self.push_tasks == 0:
+self._clear()
+self.list.append(x)
+idx = len(self.list) - 1
+self.push_tasks -= 1
+
+with self.all_tasks_done:
+if self.push_tasks == 0:
+self.all_tasks_done.notify_all()
+while self.push_tasks:
+self.all_tasks_done.wait()
+
+self._sync_op()
+return idx
+
+def pull(self, idx):
+"""Pull the output to each device
+Input:
+idx (int)
+
+Output:
+out (NDArray)
+"""
+return self.out[idx]
+
+def _sync_op(self):
+with self.mutex:
+if self.reduce_tasks == 1:
+assert(len(self.list) == self.counter)
+self.out = self.op(*self.list)
+if isinstance(self.out, (list, tuple)):
+for xi in self.out:
+xi.wait_to_read()
+else:
+self.out.wait_to_read()
+self.reduce_tasks -= 1
+else:
+self.reduce_tasks -= 1
+
+with self.all_tasks_done:
+if self.reduce_tasks == 0:
+self.all_tasks_done.notify_all()
+while self.reduce_tasks:
+self.all_tasks_done.wait()
+
+def _clear(self):
+self.list = []
+self.push_tasks = self.counter
+self.reduce_tasks = self.counter
+
+def __len__(self):
+return len(self.list)
+
+def __repr__(self):
+return 'ParallelState'
+
+
+class DataParallelModel(object):
+"""Data parallelism
+
+Hide the difference between single and multiple GPUs from the user.
+Inputs and outputs are both lists of NDArrays in different contexts.
+In the forward pass, the module is replicated on each device,
+and each replica handles a portion of the input. During the backwards
+pass, gradients from each replica are summed into the original module.
+
+Parameters
+--
+module : object
+Network to be parallelized.
+ctx_list : list
+A list of contexts
+sync : bool
+enable synchronization (default: False).
+
+
+Inputs:
+- **inputs**: list of input (NDArrays)
+
+Outputs:
+- **outputs**: list of output (NDArrays)
+
+Example::
+>>> ctx = [mx.gpu(0), mx.gpu(1)]
+>>> net = DataParallelModel(model, ctx_list=ctx)
+>>> y = net(x)
+"""
+def __init__(self, module, ctx_list=None, sync=False):
+module.collect_params().reset_ctx(ctx=ctx_list)
+self.ctx_list = ctx_list
+self.module = module
+self.sync = sync
+
+def __call__(self, *inputs, **kwargs):
+if not self.ctx_list:
+re
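The synchronized push/pull pattern of the quoted `Barrier` class can be sketched with a plain-Python analogue (a hypothetical simplified version using only the standard library, not the MXNet class itself): every worker pushes one value, the last pusher applies the reduction op once, and all workers read the shared result.

```python
import threading

class MiniBarrier:
    """Collect one value from each of `n` workers, apply `op` once
    all have pushed, and let every worker read the shared result."""
    def __init__(self, n, op):
        self.n, self.op = n, op
        self.cond = threading.Condition()
        self.values, self.result = [], None

    def push_and_pull(self, x):
        with self.cond:
            self.values.append(x)
            if len(self.values) == self.n:
                # Last worker to arrive performs the AllReduce-style op
                self.result = self.op(self.values)
                self.cond.notify_all()
            else:
                while self.result is None:
                    self.cond.wait()
            return self.result

results = []
bar = MiniBarrier(3, sum)
threads = [threading.Thread(target=lambda v=v: results.append(bar.push_and_pull(v)))
           for v in (1, 2, 3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
assert results == [6, 6, 6]  # every worker sees the same reduced value
```

The real `Barrier` additionally supports resetting for reuse across iterations and waiting on `wait_to_read` for asynchronous NDArray results.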

[GitHub] piiswrong commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
piiswrong commented on a change in pull request #11001: [MXNET-374] handle 
row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190029910
 
 

 ##
 File path: python/mxnet/gluon/parameter.py
 ##
 @@ -271,12 +313,18 @@ def _init_grad(self):
 self._grad = [ndarray.zeros(shape=i.shape, dtype=i.dtype, 
ctx=i.context,
 stype=self._grad_stype) for i in 
self._data]
 
-autograd.mark_variables(self.list_data(), self.list_grad(), 
self.grad_req)
+autograd.mark_variables(self._check_and_get(self._data, list),
+self._grad, self.grad_req)
 
 def _reduce(self):
 """Reduce data from multiple context."""
-block = self.list_data()
-data = ndarray.add_n(*(w.copyto(context.cpu()) for w in block)) / 
len(block)
+if self._stype == 'default':
+block = self.list_data()
+data = ndarray.add_n(*(w.copyto(context.cpu()) for w in block)) / 
len(block)
+else:
+# fetch all rows for 'row_sparse' param
+all_row_ids = ndarray.arange(0, self.shape[0], dtype='int64', 
ctx=context.cpu())
+data = self.row_sparse_data(all_row_ids)
 
 Review comment:
   Is it possible to have row_sparse but update_on_kvstore=false?
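The dense branch of the quoted `_reduce` averages the per-context copies of a parameter; a small numpy sketch of that reduction (hypothetical values standing in for per-GPU parameter copies):

```python
import numpy as np

# Hypothetical per-device copies of one parameter (e.g. from 2 GPUs)
copies = [np.full((2, 2), 1.0), np.full((2, 2), 3.0)]

# `_reduce` sums the copies and divides by the number of devices
data = sum(copies) / len(copies)
assert (data == 2.0).all()
```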


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] piiswrong commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
piiswrong commented on a change in pull request #11001: [MXNET-374] handle 
row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190030607
 
 

 ##
 File path: python/mxnet/gluon/parameter.py
 ##
 @@ -396,11 +485,25 @@ def data(self, ctx=None):
 ---
 NDArray on ctx
 """
+if self._stype != 'default':
+raise ValueError("Cannot return a copy of Parameter '%s' on ctx %s via data() " \
 
 Review comment:
   These should be UserError?




[GitHub] piiswrong commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
piiswrong commented on a change in pull request #11001: [MXNET-374] handle 
row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190028573
 
 

 ##
 File path: python/mxnet/gluon/parameter.py
 ##
 @@ -194,8 +210,26 @@ def _check_and_get(self, arr_list, ctx):
"because the latter does not include Parameters of " \
 "nested child Blocks"%(self.name))
 
-def _load_init(self, data, ctx):
+def _get_row_sparse(self, arr_list, ctx, row_id):
+""" Get row_sparse data from row_sparse parameters based on row_id. """
+# get row sparse params based on row ids
+if not isinstance(row_id, ndarray.NDArray):
+raise TypeError("Cannot get 'row_sparse' Parameter %s with %s type. "
 
 Review comment:
   "row_id must have NDArray type, but %s is given"




[GitHub] piiswrong commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
piiswrong commented on a change in pull request #11001: [MXNET-374] handle 
row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190058383
 
 

 ##
 File path: python/mxnet/gluon/trainer.py
 ##
 @@ -109,38 +117,54 @@ def _init_optimizer(self, optimizer, optimizer_params):
 self._updaters = [opt.get_updater(self._optimizer) \
 for _ in self._contexts]
 
+def _init_params(self):
+""" Initialize parameters in the KVStore. Parameters whose
 
 Review comment:
   Wrong format




[GitHub] piiswrong commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
piiswrong commented on a change in pull request #11001: [MXNET-374] handle 
row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190029296
 
 

 ##
 File path: python/mxnet/gluon/parameter.py
 ##
 @@ -194,8 +210,26 @@ def _check_and_get(self, arr_list, ctx):
"because the latter does not include Parameters of " \
 "nested child Blocks"%(self.name))
 
-def _load_init(self, data, ctx):
+def _get_row_sparse(self, arr_list, ctx, row_id):
+""" Get row_sparse data from row_sparse parameters based on row_id. """
+# get row sparse params based on row ids
+if not isinstance(row_id, ndarray.NDArray):
+raise TypeError("Cannot get 'row_sparse' Parameter %s with %s type. "
+"NDArray type is expected." % (self.name, type(row_id)))
+if not self._trainer:
+raise RuntimeError("Cannot get row_sparse data for Parameter '%s' when no " \
+   "Trainer is created with it."%self.name)
+results = self._check_and_get(arr_list, ctx)
+
+# fetch row sparse params from the trainer
+self._trainer._row_sparse_pull(self, results, row_id)
+return results
+
+def _load_init(self, data, ctx, cast_stype=False):
 """(Re)initializes by loading from data."""
+if self._trainer and self._trainer._kv_initialized and self._trainer._update_on_kvstore:
+raise RuntimeError("Cannot (Re)initialize Parameter '%s' when its Trainer " \
+   "already initialized the parameter on KVStore."%(self.name))
 
 Review comment:
   The message is cryptic. The actual reason is multi-device training with update_on_kvstore=True;
   the error message should describe that reason and suggest a solution.




[GitHub] piiswrong commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
piiswrong commented on a change in pull request #11001: [MXNET-374] handle 
row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190058808
 
 

 ##
 File path: python/mxnet/gluon/trainer.py
 ##
 @@ -109,38 +117,54 @@ def _init_optimizer(self, optimizer, optimizer_params):
 self._updaters = [opt.get_updater(self._optimizer) \
 for _ in self._contexts]
 
+def _init_params(self):
+""" Initialize parameters in the KVStore. Parameters whose
+initialization is incomplete are ignored.
+"""
+assert self._kv_initialized, "Cannot initialize parameters in KVStore " \
+ "when KVStore is not initialized."
+params_to_init = []
+if self._kvstore:
+params = [param for param in self._params_to_init \
 
 Review comment:
   Better to use a for loop and if/else here.




[GitHub] piiswrong commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
piiswrong commented on a change in pull request #11001: [MXNET-374] handle 
row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190024546
 
 

 ##
 File path: python/mxnet/gluon/parameter.py
 ##
 @@ -162,6 +169,15 @@ def shape(self, new_shape):
 
 self._shape = new_shape
 
+def _set_trainer(self, trainer):
+""" Set the trainer this parameter is associated with. """
+if self._trainer and trainer and self._trainer is not trainer:
+raise RuntimeError(
+"Failed to set the trainer for Parameter '%s' to %s because it was set to %s. " \
+"More than one trainers for a single Parameter is not supported." %(
+self.name, str(trainer), str(self._trainer)))
 
 Review comment:
   what does str(trainer) show? It's likely not meaningful to users




[GitHub] piiswrong commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
piiswrong commented on a change in pull request #11001: [MXNET-374] handle 
row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190004050
 
 

 ##
 File path: python/mxnet/gluon/block.py
 ##
 @@ -443,8 +443,17 @@ class HybridBlock(Block):
 
 Refer `Hybrid tutorial `_ to 
see
 the end-to-end usage.
+
 """
 def __init__(self, prefix=None, params=None):
+# check if any parameter is row_sparse
+if isinstance(params, ParameterDict):
 
 Review comment:
   This check shouldn't be done here.
   Parameters are only added to the current block when self.params.get is 
called.




[GitHub] piiswrong commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
piiswrong commented on a change in pull request #11001: [MXNET-374] handle 
row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190028733
 
 

 ##
 File path: python/mxnet/gluon/parameter.py
 ##
 @@ -194,8 +210,26 @@ def _check_and_get(self, arr_list, ctx):
"because the latter does not include Parameters of " \
 "nested child Blocks"%(self.name))
 
-def _load_init(self, data, ctx):
+def _get_row_sparse(self, arr_list, ctx, row_id):
+""" Get row_sparse data from row_sparse parameters based on row_id. """
+# get row sparse params based on row ids
+if not isinstance(row_id, ndarray.NDArray):
+raise TypeError("Cannot get 'row_sparse' Parameter %s with %s type. "
+"NDArray type is expected." % (self.name, type(row_id)))
+if not self._trainer:
+raise RuntimeError("Cannot get row_sparse data for Parameter '%s' when no "
+   "Trainer is created with it."%self.name)
 
 Review comment:
   What if the user wants to train with a single device?




[GitHub] piiswrong commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
piiswrong commented on a change in pull request #11001: [MXNET-374] handle 
row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190029487
 
 

 ##
 File path: python/mxnet/gluon/parameter.py
 ##
 @@ -208,6 +242,14 @@ def _load_init(self, data, ctx):
 "Failed loading Parameter '%s' from saved params: " \
 "dtype incompatible expected %s vs saved %s"%(
 self.name, str(self.dtype), str(data.dtype))
+if self._stype != data.stype:
+if not cast_stype:
 
 Review comment:
   Why is cast_stype needed? Why not always cast? 




[GitHub] piiswrong commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
piiswrong commented on a change in pull request #11001: [MXNET-374] handle 
row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190059562
 
 

 ##
 File path: python/mxnet/gluon/trainer.py
 ##
 @@ -191,6 +224,8 @@ def step(self, batch_size, ignore_stale_grad=False):
 """
 if not self._kv_initialized:
 self._init_kvstore()
+if self._params_to_init:
 
 Review comment:
   I don't quite understand this. If there are uninitialized parameters, 
wouldn't step fail?




[GitHub] piiswrong commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
piiswrong commented on a change in pull request #11001: [MXNET-374] handle 
row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190024897
 
 

 ##
 File path: python/mxnet/gluon/parameter.py
 ##
 @@ -162,6 +169,15 @@ def shape(self, new_shape):
 
 self._shape = new_shape
 
+def _set_trainer(self, trainer):
+""" Set the trainer this parameter is associated with. """
+if self._trainer and trainer and self._trainer is not trainer:
+raise RuntimeError(
+"Failed to set the trainer for Parameter '%s' to %s because it was set to %s. " \
+"More than one trainers for a single Parameter is not supported." %(
+self.name, str(trainer), str(self._trainer)))
 
 Review comment:
   This is a breaking change.
   Suppose a user wants to train with SGD for 10 epochs and then switch to Adam; this would prevent that.
   




[GitHub] piiswrong commented on a change in pull request #11001: [MXNET-374] handle row_sparse weight in parameter and trainer

2018-05-22 Thread GitBox
piiswrong commented on a change in pull request #11001: [MXNET-374] handle 
row_sparse weight in parameter and trainer
URL: https://github.com/apache/incubator-mxnet/pull/11001#discussion_r190058027
 
 

 ##
 File path: python/mxnet/gluon/trainer.py
 ##
 @@ -68,12 +68,20 @@ def __init__(self, params, optimizer, optimizer_params=None, kvstore='device',
 "First argument must be a list or dict of Parameters, " \
 "got %s."%(type(params)))
 self._params = []
-for param in params:
+self._params_to_init = []
+self._contains_sparse = False
+self._param2idx = {}
+for i, param in enumerate(params):
 if not isinstance(param, Parameter):
 raise ValueError(
 "First argument must be a list or dict of Parameters, " \
 "got list of %s."%(type(param)))
+self._param2idx[param.name] = i
 self._params.append(param)
+self._params_to_init.append(param)
+param._set_trainer(self)
 
 Review comment:
   do we need to set_trainer when stype='default' and update_on_kvstore=False?




[GitHub] zhanghang1989 commented on a change in pull request #10536: [MXNET-317] Add Data Parallel

2018-05-22 Thread GitBox
zhanghang1989 commented on a change in pull request #10536: [MXNET-317] Add 
Data Parallel
URL: https://github.com/apache/incubator-mxnet/pull/10536#discussion_r190059705
 
 

 ##
 File path: tests/python/unittest/test_contrib_parallel.py
 ##
 @@ -0,0 +1,100 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import mxnet as mx
+from mxnet import nd, autograd, gluon
+from mxnet.gluon import nn, Block
+from mxnet.gluon.contrib.parallel import *
+from numpy.testing import assert_allclose, assert_array_equal
+
+def test_data_parallel():
+# test gluon.contrib.parallel.DataParallelModel
+net = nn.HybridSequential()
+with net.name_scope():
+net.add(nn.Conv2D(in_channels=1, channels=20, kernel_size=5))
+net.add(nn.Activation('relu'))
+net.add(nn.MaxPool2D(pool_size=2, strides=2))
+net.add(nn.Conv2D(in_channels=20, channels=50, kernel_size=5))
+net.add(nn.Activation('relu'))
+net.add(nn.MaxPool2D(pool_size=2, strides=2))
+# The Flatten layer collapses all axis, except the first one, into one 
axis.
+net.add(nn.Flatten())
+net.add(nn.Dense(512,in_units=800))
+net.add(nn.Activation('relu'))
+net.add(nn.Dense(10, in_units=512))
+
+net.collect_params().initialize()
+criterion = gluon.loss.SoftmaxCELoss(axis=1)
+
+def test_net_sync(net, criterion, sync, nDevices):
+ctx_list = [mx.cpu(0) for i in range(nDevices)]
+net = DataParallelModel(net, ctx_list, sync=sync)
+criterion = DataParallelCriterion(criterion, ctx_list, sync=sync)
+iters = 100
+# train mode
+for i in range(iters):
+x = mx.random.uniform(shape=(8, 1, 28, 28))
+t = nd.ones(shape=(8))
+with autograd.record():
+y = net(x)
+loss = criterion(y, t)
+autograd.backward(loss)
+# evaluation mode
+for i in range(iters):
+x = mx.random.uniform(shape=(8, 1, 28, 28))
+y = net(x)
+
+test_net_sync(net, criterion, True, 1)
+test_net_sync(net, criterion, True, 2)
+test_net_sync(net, criterion, False, 1)
+test_net_sync(net, criterion, False, 2)
+
+
+def test_parallel_barrier():
+def my_callable(*inputs):
+return inputs
+
+class MyLayer(Block):
+def __init__(self, nGPU):
+super(MyLayer, self).__init__()
+self.barrier = Barrier(nGPU, my_callable)
+
+def forward(self, x):
+idx = self.barrier.push(x)
+y = self.barrier.pull(idx)
+assert_allclose(y.asnumpy(), x.asnumpy(), rtol=1e-2, atol=1e-4)
+return y
+
+nDevices = 2
+ctx_list = [mx.cpu(0) for i in range(nDevices)]
+net = MyLayer(nDevices)
+net = DataParallelModel(net, ctx_list, sync=True)
+iters = 100
+# train mode
+for i in range(iters):
+x = mx.random.uniform(shape=(8, 1, 28, 28))
+with autograd.record():
+y = net(x)
+# evaluation mode
+for i in range(iters):
+x = mx.random.uniform(shape=(8, 1, 28, 28))
+y = net(x)
 
 Review comment:
   mainly check the behavior of Barrier:
   ``assert_allclose(y.asnumpy(), x.asnumpy(), rtol=1e-2, atol=1e-4)``




[GitHub] zhanghang1989 commented on a change in pull request #10536: [MXNET-317] Add Data Parallel

2018-05-22 Thread GitBox
zhanghang1989 commented on a change in pull request #10536: [MXNET-317] Add 
Data Parallel
URL: https://github.com/apache/incubator-mxnet/pull/10536#discussion_r190059425
 
 

 ##
 File path: python/mxnet/gluon/contrib/parallel.py
 ##
 @@ -0,0 +1,362 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+# pylint: disable=broad-except, redefined-builtin
+"""Synchronized DataParallel"""
+import threading
+from ... import autograd
+from ...ndarray import NDArray
+from ..utils import split_and_load
+
+__all__ = ['DataParallelModel', 'DataParallelCriterion', 'Barrier']
+
+
+class Barrier(object):
+"""Shared NDArray for cross device operation.
+
+A cross device operation that allows synchronized push and pull. It can be used in
+Cross-GPU Synchronized Batch Normalization and Sparse Blocks.
+
+Parameters
+--
+counter : int
+Number of devices.
+operation : callable
+The cross device operation to apply (e.g. AllReduce).
+"""
+def __init__(self, counter, operation):
+self._mutex = threading.Lock()
+self.all_tasks_done = threading.Condition(self._mutex)
+self.counter = counter
+self.op = operation
+self._clear()
+
+def push(self, x):
+"""Push a NDArray from one of the device.
+Input:
+x (NDArray)
+
+Output:
+idx (int), the output index
+"""
+with self._mutex:
+if self.push_tasks == 0:
+self._clear()
+self.list.append(x)
+idx = len(self.list) - 1
+self.push_tasks -= 1
+
+with self.all_tasks_done:
+if self.push_tasks == 0:
+self.all_tasks_done.notify_all()
+while self.push_tasks:
+self.all_tasks_done.wait()
+
+self._sync_op()
+return idx
+
+def pull(self, idx):
+"""Pull the output to each device
+Input:
+idx (int)
+
+Output:
+out (NDArray)
+"""
+return self.out[idx]
+
+def _sync_op(self):
+with self._mutex:
+if self.reduce_tasks == 1:
+assert(len(self.list) == self.counter)
+self.out = self.op(*self.list)
+if isinstance(self.out, (list, tuple)):
+for xi in self.out:
+xi.wait_to_read()
+else:
+self.out.wait_to_read()
+self.reduce_tasks -= 1
+else:
+self.reduce_tasks -= 1
+
+with self.all_tasks_done:
+if self.reduce_tasks == 0:
+self.all_tasks_done.notify_all()
+while self.reduce_tasks:
+self.all_tasks_done.wait()
+
+def _clear(self):
+self.list = []
+self.push_tasks = self.counter
+self.reduce_tasks = self.counter
+
+def __len__(self):
+return len(self.list)
+
+def __repr__(self):
+return 'Barrier'
+
+
+class DataParallelModel(object):
+"""Data Parallelism
+
+Hide the difference of single/multiple GPUs to the user.
+This container parallelizes the application of the given module by
+splitting the input across the specified devices by chunking in the
+batch dimension.
+In the forward pass, the module is replicated on each device,
+and each replica handles a portion of the input. During the backwards pass,
+gradients from each replica are summed into the original module.
+Note that the outputs are not gathered, please use compatible
+:class:`mxnet.gluon.contrib.DataParallelCriterion`.
+
+The batch size should be larger than the number of GPUs used. It should
+also be an integer multiple of the number of GPUs so that each chunk is
+the same size (so that each GPU processes the same number of samples).
+
+Parameters
+--
+module : object
+Network to be parallelized.
+ctx_list : list
+A list of contexts
+sync : bool
+enable synchronization (default: False).
+
+
+Inputs:
+- **inputs**: list of input (NDArrays)
+
+Outputs:
+- **outputs**: list of output (NDArrays)
+
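The push/pull flow of the quoted `Barrier` can be sketched with the standard library's `threading.Barrier`; this simplified version (with sum as the cross-device operation) is an illustration of the synchronization pattern, not the class under review:

```python
import threading

class ToyBarrier:
    """Simplified Barrier: every worker pushes a value, the reduction op runs
    once after all pushes, and every worker pulls the same result."""
    def __init__(self, counter, operation):
        self.op = operation
        self.values = [None] * counter
        self.out = None
        # the barrier action runs exactly once per round, after all waits
        self.barrier = threading.Barrier(counter, action=self._reduce)

    def _reduce(self):
        self.out = self.op(*self.values)

    def push(self, idx, x):
        self.values[idx] = x
        self.barrier.wait()        # block until every worker has pushed
        return idx

    def pull(self, idx):
        return self.out            # same reduced value on every "device"

results = [None, None]
bar = ToyBarrier(2, lambda *xs: sum(xs))

def worker(i, x):
    results[i] = bar.pull(bar.push(i, x))

threads = [threading.Thread(target=worker, args=(i, float(i + 1))) for i in range(2)]
for t in threads: t.start()
for t in threads: t.join()
# results -> [3.0, 3.0]: both workers see the summed value
```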

[GitHub] szha closed issue #7376: MXbox -- a simple and flexible vision toolbox for mxnet framework.

2018-05-22 Thread GitBox
szha closed issue #7376: MXbox -- a simple and flexible vision toolbox for 
mxnet framework.
URL: https://github.com/apache/incubator-mxnet/issues/7376
 
 
   




[GitHub] szha commented on issue #8365: error to use gluon interface to download pretrained mobilenet1_0

2018-05-22 Thread GitBox
szha commented on issue #8365: error to use gluon interface to download 
pretrained mobilenet1_0
URL: 
https://github.com/apache/incubator-mxnet/issues/8365#issuecomment-391139127
 
 
   pretrained models are provided now.




[GitHub] szha closed issue #8365: error to use gluon interface to download pretrained mobilenet1_0

2018-05-22 Thread GitBox
szha closed issue #8365: error to use gluon interface to download pretrained 
mobilenet1_0
URL: https://github.com/apache/incubator-mxnet/issues/8365
 
 
   




[GitHub] rahul003 commented on issue #10967: Is the doc for launch.py parameters outdated?

2018-05-22 Thread GitBox
rahul003 commented on issue #10967: Is the doc for launch.py parameters 
outdated?
URL: 
https://github.com/apache/incubator-mxnet/issues/10967#issuecomment-391134099
 
 
   Looks like --launcher is mapped to --cluster to keep it consistent with a dependency (dmlc-core).
   You will need to use --launcher.
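The --launcher/--cluster mapping described above can be expressed with argparse's `dest` parameter — a hypothetical sketch of the idea, not the actual tools/launch.py code (the launcher choices listed are assumptions):

```python
import argparse

# Expose --launcher on the command line but store it under args.cluster,
# matching the dmlc-core naming the comment mentions.
parser = argparse.ArgumentParser(description="toy launch.py argument mapping")
parser.add_argument('--launcher', dest='cluster', default='local',
                    choices=['local', 'ssh', 'mpi', 'sge', 'yarn'],
                    help='cluster type used to launch the job')

args = parser.parse_args(['--launcher', 'ssh'])
# args.cluster == 'ssh' even though the flag was spelled --launcher
```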




[GitHub] ZiyueHuang opened a new pull request #11024: concat of CSR NDArrays on first dimension

2018-05-22 Thread GitBox
ZiyueHuang opened a new pull request #11024: concat of CSR NDArrays on first 
dimension
URL: https://github.com/apache/incubator-mxnet/pull/11024
 
 
   ## Description ##
   concat of CSR NDArrays on first dimension
   
   cc @eric-haibin-lin 
   
   ## Checklist ##
   ### Essentials ###
   Please feel free to remove inapplicable items for your PR.
   - [ ] The PR title starts with [MXNET-$JIRA_ID], where $JIRA_ID refers to 
the relevant [JIRA issue](https://issues.apache.org/jira/projects/MXNET/issues) 
created (except PRs with tiny changes)
   - [ ] Changes are complete (i.e. I finished coding on this PR)
   - [ ] All changes have test coverage:
   - Unit tests are added for small changes to verify correctness (e.g. adding 
a new operator)
   - Nightly tests are added for complicated/long-running ones (e.g. changing 
distributed kvstore)
   - Build tests will be added for build configuration changes (e.g. adding a 
new build option with NCCL)
   - [ ] Code is well-documented: 
   - For user-facing API changes, API doc string has been updated. 
   - For new C++ functions in header files, their functionalities and arguments 
are documented. 
   - For new examples, README.md is added to explain the what the example does, 
the source of the dataset, expected performance on test set and reference to 
the original paper if applicable
   - Check the API doc at 
http://mxnet-ci-doc.s3-accelerate.dualstack.amazonaws.com/PR-$PR_ID/$BUILD_ID/index.html
   - [ ] To the my best knowledge, examples are either not affected by this 
change, or have been fixed to be compatible with this change
   
   ### Changes ###
   - [ ] Feature1, tests, (and when applicable, API doc)
   - [ ] Feature2, tests, (and when applicable, API doc)
   
   ## Comments ##
   - If this change is a backward incompatible change, why must this change be 
made.
   - Interesting edge cases to note here
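Concatenating CSR matrices along the first dimension, as this PR implements, only requires stacking `data`/`indices` and offsetting each `indptr`; a small pure-Python sketch of the idea (not the MXNet kernel):

```python
def concat_csr_dim0(mats):
    """Concatenate CSR triples (data, indices, indptr) along axis 0.
    Rows are stacked, so data/indices are appended in order and each
    matrix's indptr is shifted by the nonzeros already emitted."""
    data, indices, indptr = [], [], [0]
    for d, i, p in mats:
        offset = len(data)
        data.extend(d)
        indices.extend(i)
        indptr.extend(offset + x for x in p[1:])
    return data, indices, indptr

# two 2-row matrices with 3 columns each
a = ([1, 2], [0, 2], [0, 1, 2])   # rows: [1,0,0], [0,0,2]
b = ([3],    [1],    [0, 0, 1])   # rows: [0,0,0], [0,3,0]
merged = concat_csr_dim0([a, b])
# -> ([1, 2, 3], [0, 2, 1], [0, 1, 2, 2, 3]), a 4x3 CSR matrix
```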
   




[GitHub] zhanghang1989 commented on issue #11021: [MXNET-380] [WIP] count_include_pad argument for Avg Pooling

2018-05-22 Thread GitBox
zhanghang1989 commented on issue #11021: [MXNET-380] [WIP] count_include_pad 
argument for Avg Pooling
URL: https://github.com/apache/incubator-mxnet/pull/11021#issuecomment-391103000
 
 
   FYI @hetong007 




[GitHub] lanking520 opened a new issue #11023: No Windows Support for Scala

2018-05-22 Thread GitBox
lanking520 opened a new issue #11023: No Windows Support for Scala
URL: https://github.com/apache/incubator-mxnet/issues/11023
 
 
   
   ## Description
   Currently I see the tutorial contains Scala installation page for Scala, 
please remove that or adding some signs to show we are not currently support 
Windows.
   @aaronmarkham 
   
   ## Environment info (Required)
   Windows
   
   
   Package used (Python/R/Scala/Julia):
   (I'm using Scala)
   




[GitHub] larroy commented on issue #9808: [MXNET-460] Ssd camera demo

2018-05-22 Thread GitBox
larroy commented on issue #9808: [MXNET-460] Ssd camera demo
URL: https://github.com/apache/incubator-mxnet/pull/9808#issuecomment-391100865
 
 
   Added  info to the README




[GitHub] lanking520 commented on a change in pull request #10991: [MXNET-386] API Docs Generation

2018-05-22 Thread GitBox
lanking520 commented on a change in pull request #10991: [MXNET-386] API Docs 
Generation
URL: https://github.com/apache/incubator-mxnet/pull/10991#discussion_r190006337
 
 

 ##
 File path: 
scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
 ##
 @@ -0,0 +1,190 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.mxnet
+
+import org.apache.mxnet.init.Base._
+
+import scala.collection.mutable.ListBuffer
+
+private[mxnet] object APIDocGenerator{
+  case class traitArg(argName : String, argType : String, argDesc : String, 
isOptional : Boolean)
+  case class traitFunction(name : String, desc : String,
+   listOfArgs: List[traitArg], returnType : String)
+
+
+  def main(args: Array[String]) : Unit = {
+val FILE_PATH = args(0)
+absClassGen(FILE_PATH, true)
+absClassGen(FILE_PATH, false)
+  }
+
+  def absClassGen(FILE_PATH : String, isSymbol : Boolean) : Unit = {
+// scalastyle:off
+val traitFunctions = initSymbolModule(isSymbol)
+val traitfuncs = 
traitFunctions.filterNot(_.name.startsWith("_")).map(traitfunction => {
+  val scalaDoc = ScalaDocGen(traitfunction)
+  val traitBody = defBodyGen(traitfunction, isSymbol)
+  s"$scalaDoc\n$traitBody"
+})
+val packageName = if (isSymbol) "SymbolAPIBase" else "NDArrayAPIBase"
+val apacheLicence = "/*\n* Licensed to the Apache Software Foundation 
(ASF) under one or more\n* contributor license agreements.  See the NOTICE file 
distributed with\n* this work for additional information regarding copyright 
ownership.\n* The ASF licenses this file to You under the Apache License, 
Version 2.0\n* (the \"License\"); you may not use this file except in 
compliance with\n* the License.  You may obtain a copy of the License at\n*\n*  
  http://www.apache.org/licenses/LICENSE-2.0\n*\n* Unless required by 
applicable law or agreed to in writing, software\n* distributed under the 
License is distributed on an \"AS IS\" BASIS,\n* WITHOUT WARRANTIES OR 
CONDITIONS OF ANY KIND, either express or implied.\n* See the License for the 
specific language governing permissions and\n* limitations under the 
License.\n*/\n"
+val scalaStyle = "// scalastyle:off"
+val packageDef = "package org.apache.mxnet"
+val absClassDef = s"abstract class $packageName"
+val finalStr = s"$apacheLicence\n$scalaStyle\n$packageDef\n$absClassDef 
{\n${traitfuncs.mkString("\n")}\n}"
+import java.io._
+val pw = new PrintWriter(new File(FILE_PATH + s"$packageName.scala"))
+pw.write(finalStr)
+pw.close()
+  }
+
+  // Generate ScalaDoc type
+  def ScalaDocGen(traitFunc : traitFunction) : String = {
+val desc = traitFunc.desc.split("\n").map({ currStr =>
+  s"  * $currStr"
+})
+val params = traitFunc.listOfArgs.map({ traitarg =>
+  val currArgName = traitarg.argName match {
+case "var" => "vari"
+case "type" => "typeOf"
+case _ => traitarg.argName
+  }
+  s"  * @param $currArgName\t\t${traitarg.argDesc}"
+})
+val returnType = s"  * @return ${traitFunc.returnType}"
+s"  /**\n${desc.mkString("\n")}\n${params.mkString("\n")}\n$returnType\n  
*/"
+  }
+
+  def defBodyGen(traitFunc : traitFunction, isSymbol : Boolean) : String = {
+var argDef = ListBuffer[String]()
+traitFunc.listOfArgs.foreach(traitarg => {
+  val currArgName = traitarg.argName match {
+case "var" => "vari"
+case "type" => "typeOf"
+case _ => traitarg.argName
+  }
+  if (traitarg.isOptional) {
+argDef += s"$currArgName : Option[${traitarg.argType}] = None"
+  }
+  else {
+argDef += s"$currArgName : ${traitarg.argType}"
+  }
+})
+if (isSymbol) {
+  argDef += "name : String = null"
+  argDef += "attr : Map[String, String] = null"
+}
+s"def ${traitFunc.name} (${argDef.mkString(", ")}) : 
${traitFunc.returnType}"
+  }
+
+
+  // Convert C++ Types to Scala Types
+  def typeConversion(in : String, argType : String = "", returnType : String) 
: String = {
 
 Review comment:
   Will place in the same class
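For context, the `typeConversion` method under review maps backend (C++) type strings to Scala type names via a pattern match. A minimal runnable sketch of that approach (the `Shape` cases mirror the diff quoted later in this thread; the other mappings are illustrative assumptions, not the generator's full table):

```scala
object TypeConversionSketch {
  // Maps a backend (C++) type string to a Scala type name.
  // The Shape cases come from the PR diff; the remaining cases
  // are illustrative assumptions only.
  def typeConversion(in: String): String = in match {
    case "Shape(tuple)" | "ShapeorNone" => "org.apache.mxnet.Shape"
    case "float" | "real_t"             => "Float"
    case other                          => other // pass unknown types through unchanged
  }
}
```

The pass-through default keeps the sketch total; the real generator distinguishes argument types from return types as well.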

--

[GitHub] lanking520 commented on a change in pull request #10991: [MXNET-386] API Docs Generation

2018-05-22 Thread GitBox
lanking520 commented on a change in pull request #10991: [MXNET-386] API Docs 
Generation
URL: https://github.com/apache/incubator-mxnet/pull/10991#discussion_r190005922
 
 

 ##
 File path: 
scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
 ##
 @@ -0,0 +1,190 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.mxnet
+
+import org.apache.mxnet.init.Base._
+
+import scala.collection.mutable.ListBuffer
+
+private[mxnet] object APIDocGenerator{
+  case class traitArg(argName : String, argType : String, argDesc : String, 
isOptional : Boolean)
+  case class traitFunction(name : String, desc : String,
+   listOfArgs: List[traitArg], returnType : String)
+
+
+  def main(args: Array[String]) : Unit = {
+val FILE_PATH = args(0)
+absClassGen(FILE_PATH, true)
+absClassGen(FILE_PATH, false)
+  }
+
+  def absClassGen(FILE_PATH : String, isSymbol : Boolean) : Unit = {
+// scalastyle:off
+val traitFunctions = initSymbolModule(isSymbol)
+val traitfuncs = 
traitFunctions.filterNot(_.name.startsWith("_")).map(traitfunction => {
 
 Review comment:
   Let's put them all together in one class; I will create a class for that.




[GitHub] lanking520 commented on a change in pull request #10991: [MXNET-386] API Docs Generation

2018-05-22 Thread GitBox
lanking520 commented on a change in pull request #10991: [MXNET-386] API Docs 
Generation
URL: https://github.com/apache/incubator-mxnet/pull/10991#discussion_r190004546
 
 

 ##
 File path: 
scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
 ##
 @@ -0,0 +1,190 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.mxnet
+
+import org.apache.mxnet.init.Base._
+
+import scala.collection.mutable.ListBuffer
+
+private[mxnet] object APIDocGenerator{
+  case class traitArg(argName : String, argType : String, argDesc : String, 
isOptional : Boolean)
 
 Review comment:
   Done, made the change.




[GitHub] lanking520 commented on a change in pull request #10991: [MXNET-386] API Docs Generation

2018-05-22 Thread GitBox
lanking520 commented on a change in pull request #10991: [MXNET-386] API Docs 
Generation
URL: https://github.com/apache/incubator-mxnet/pull/10991#discussion_r190004477
 
 

 ##
 File path: 
scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
 ##
 @@ -0,0 +1,185 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.mxnet
+
+import org.apache.mxnet.init.Base._
+
+import scala.collection.mutable.ListBuffer
+
+private[mxnet] object APIDocGenerator{
 
 Review comment:
   Does it seem too long?




[GitHub] nswamy commented on a change in pull request #10991: [MXNET-386] API Docs Generation

2018-05-22 Thread GitBox
nswamy commented on a change in pull request #10991: [MXNET-386] API Docs 
Generation
URL: https://github.com/apache/incubator-mxnet/pull/10991#discussion_r19469
 
 

 ##
 File path: 
scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
 ##
 @@ -0,0 +1,190 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.mxnet
+
+import org.apache.mxnet.init.Base._
+
+import scala.collection.mutable.ListBuffer
+
+private[mxnet] object APIDocGenerator{
+  case class traitArg(argName : String, argType : String, argDesc : String, 
isOptional : Boolean)
+  case class traitFunction(name : String, desc : String,
+   listOfArgs: List[traitArg], returnType : String)
+
+
+  def main(args: Array[String]) : Unit = {
+val FILE_PATH = args(0)
+absClassGen(FILE_PATH, true)
+absClassGen(FILE_PATH, false)
+  }
+
+  def absClassGen(FILE_PATH : String, isSymbol : Boolean) : Unit = {
+// scalastyle:off
+val traitFunctions = initSymbolModule(isSymbol)
+val traitfuncs = 
traitFunctions.filterNot(_.name.startsWith("_")).map(traitfunction => {
+  val scalaDoc = ScalaDocGen(traitfunction)
+  val traitBody = defBodyGen(traitfunction, isSymbol)
+  s"$scalaDoc\n$traitBody"
+})
+val packageName = if (isSymbol) "SymbolAPIBase" else "NDArrayAPIBase"
+val apacheLicence = "/*\n* Licensed to the Apache Software Foundation 
(ASF) under one or more\n* contributor license agreements.  See the NOTICE file 
distributed with\n* this work for additional information regarding copyright 
ownership.\n* The ASF licenses this file to You under the Apache License, 
Version 2.0\n* (the \"License\"); you may not use this file except in 
compliance with\n* the License.  You may obtain a copy of the License at\n*\n*  
  http://www.apache.org/licenses/LICENSE-2.0\n*\n* Unless required by 
applicable law or agreed to in writing, software\n* distributed under the 
License is distributed on an \"AS IS\" BASIS,\n* WITHOUT WARRANTIES OR 
CONDITIONS OF ANY KIND, either express or implied.\n* See the License for the 
specific language governing permissions and\n* limitations under the 
License.\n*/\n"
+val scalaStyle = "// scalastyle:off"
+val packageDef = "package org.apache.mxnet"
+val absClassDef = s"abstract class $packageName"
+val finalStr = s"$apacheLicence\n$scalaStyle\n$packageDef\n$absClassDef 
{\n${traitfuncs.mkString("\n")}\n}"
+import java.io._
+val pw = new PrintWriter(new File(FILE_PATH + s"$packageName.scala"))
+pw.write(finalStr)
+pw.close()
+  }
+
+  // Generate ScalaDoc type
+  def ScalaDocGen(traitFunc : traitFunction) : String = {
 
 Review comment:
   I would give it a different name; this sounds like a built-in Scala method. Maybe generateAPIDocFromBackend?
   Also, we follow camelCase for method names.
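A hedged sketch of the suggested camelCase rename (`generateAPIDocFromBackend` is the reviewer's proposed name; the simplified body follows the ScalaDoc emission shown in the diff above, with a pared-down argument model):

```scala
object DocGenSketch {
  // Simplified stand-in for the PR's traitArg case class.
  case class Arg(argName: String, argDesc: String)

  // Hypothetical camelCase rename of ScalaDocGen, per the review comment.
  // Emits a ScalaDoc block: description lines, then one @param per argument.
  def generateAPIDocFromBackend(desc: String, args: List[Arg]): String = {
    val descLines = desc.split("\n").map(line => s"  * $line")
    val params = args.map(a => s"  * @param ${a.argName}\t\t${a.argDesc}")
    s"  /**\n${descLines.mkString("\n")}\n${params.mkString("\n")}\n  */"
  }
}
```

The rename alone changes no behavior; it only aligns the method with the codebase's camelCase convention.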




[GitHub] nswamy commented on a change in pull request #10991: [MXNET-386] API Docs Generation

2018-05-22 Thread GitBox
nswamy commented on a change in pull request #10991: [MXNET-386] API Docs 
Generation
URL: https://github.com/apache/incubator-mxnet/pull/10991#discussion_r19872
 
 

 ##
 File path: 
scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
 ##
 @@ -0,0 +1,190 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.mxnet
+
+import org.apache.mxnet.init.Base._
+
+import scala.collection.mutable.ListBuffer
+
+private[mxnet] object APIDocGenerator{
+  case class traitArg(argName : String, argType : String, argDesc : String, 
isOptional : Boolean)
+  case class traitFunction(name : String, desc : String,
+   listOfArgs: List[traitArg], returnType : String)
+
+
+  def main(args: Array[String]) : Unit = {
+val FILE_PATH = args(0)
+absClassGen(FILE_PATH, true)
+absClassGen(FILE_PATH, false)
+  }
+
+  def absClassGen(FILE_PATH : String, isSymbol : Boolean) : Unit = {
+// scalastyle:off
+val traitFunctions = initSymbolModule(isSymbol)
+val traitfuncs = 
traitFunctions.filterNot(_.name.startsWith("_")).map(traitfunction => {
+  val scalaDoc = ScalaDocGen(traitfunction)
+  val traitBody = defBodyGen(traitfunction, isSymbol)
+  s"$scalaDoc\n$traitBody"
+})
+val packageName = if (isSymbol) "SymbolAPIBase" else "NDArrayAPIBase"
+val apacheLicence = "/*\n* Licensed to the Apache Software Foundation 
(ASF) under one or more\n* contributor license agreements.  See the NOTICE file 
distributed with\n* this work for additional information regarding copyright 
ownership.\n* The ASF licenses this file to You under the Apache License, 
Version 2.0\n* (the \"License\"); you may not use this file except in 
compliance with\n* the License.  You may obtain a copy of the License at\n*\n*  
  http://www.apache.org/licenses/LICENSE-2.0\n*\n* Unless required by 
applicable law or agreed to in writing, software\n* distributed under the 
License is distributed on an \"AS IS\" BASIS,\n* WITHOUT WARRANTIES OR 
CONDITIONS OF ANY KIND, either express or implied.\n* See the License for the 
specific language governing permissions and\n* limitations under the 
License.\n*/\n"
+val scalaStyle = "// scalastyle:off"
+val packageDef = "package org.apache.mxnet"
+val absClassDef = s"abstract class $packageName"
+val finalStr = s"$apacheLicence\n$scalaStyle\n$packageDef\n$absClassDef 
{\n${traitfuncs.mkString("\n")}\n}"
+import java.io._
+val pw = new PrintWriter(new File(FILE_PATH + s"$packageName.scala"))
+pw.write(finalStr)
+pw.close()
+  }
+
+  // Generate ScalaDoc type
+  def ScalaDocGen(traitFunc : traitFunction) : String = {
+val desc = traitFunc.desc.split("\n").map({ currStr =>
+  s"  * $currStr"
+})
+val params = traitFunc.listOfArgs.map({ traitarg =>
+  val currArgName = traitarg.argName match {
+case "var" => "vari"
+case "type" => "typeOf"
+case _ => traitarg.argName
+  }
+  s"  * @param $currArgName\t\t${traitarg.argDesc}"
+})
+val returnType = s"  * @return ${traitFunc.returnType}"
+s"  /**\n${desc.mkString("\n")}\n${params.mkString("\n")}\n$returnType\n  
*/"
+  }
+
+  def defBodyGen(traitFunc : traitFunction, isSymbol : Boolean) : String = {
+var argDef = ListBuffer[String]()
+traitFunc.listOfArgs.foreach(traitarg => {
+  val currArgName = traitarg.argName match {
+case "var" => "vari"
+case "type" => "typeOf"
+case _ => traitarg.argName
+  }
+  if (traitarg.isOptional) {
+argDef += s"$currArgName : Option[${traitarg.argType}] = None"
+  }
+  else {
+argDef += s"$currArgName : ${traitarg.argType}"
+  }
+})
+if (isSymbol) {
+  argDef += "name : String = null"
+  argDef += "attr : Map[String, String] = null"
+}
+s"def ${traitFunc.name} (${argDef.mkString(", ")}) : 
${traitFunc.returnType}"
+  }
+
+
+  // Convert C++ Types to Scala Types
+  def typeConversion(in : String, argType : String = "", returnType : String) 
: String = {
+in match {
+  case "Shape(tuple)" | "ShapeorNone" => "org.apache.mxnet.Shape"
+  case "Symbol" | "NDArray

[GitHub] nswamy commented on a change in pull request #10991: [MXNET-386] API Docs Generation

2018-05-22 Thread GitBox
nswamy commented on a change in pull request #10991: [MXNET-386] API Docs 
Generation
URL: https://github.com/apache/incubator-mxnet/pull/10991#discussion_r189363020
 
 

 ##
 File path: 
scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
 ##
 @@ -0,0 +1,185 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.mxnet
+
+import org.apache.mxnet.init.Base._
+
+import scala.collection.mutable.ListBuffer
+
+private[mxnet] object APIDocGenerator{
 
 Review comment:
   What do you think of NDSymAPIDocGenerator?




[GitHub] nswamy commented on a change in pull request #10991: [MXNET-386] API Docs Generation

2018-05-22 Thread GitBox
nswamy commented on a change in pull request #10991: [MXNET-386] API Docs 
Generation
URL: https://github.com/apache/incubator-mxnet/pull/10991#discussion_r18954
 
 

 ##
 File path: 
scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
 ##
 @@ -0,0 +1,190 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.mxnet
+
+import org.apache.mxnet.init.Base._
+
+import scala.collection.mutable.ListBuffer
+
+private[mxnet] object APIDocGenerator{
+  case class traitArg(argName : String, argType : String, argDesc : String, 
isOptional : Boolean)
+  case class traitFunction(name : String, desc : String,
+   listOfArgs: List[traitArg], returnType : String)
+
+
+  def main(args: Array[String]) : Unit = {
+val FILE_PATH = args(0)
+absClassGen(FILE_PATH, true)
+absClassGen(FILE_PATH, false)
+  }
+
+  def absClassGen(FILE_PATH : String, isSymbol : Boolean) : Unit = {
+// scalastyle:off
+val traitFunctions = initSymbolModule(isSymbol)
+val traitfuncs = 
traitFunctions.filterNot(_.name.startsWith("_")).map(traitfunction => {
 
 Review comment:
   Can we have this filtering condition in one place, maybe declared as a constant string somewhere? It's in too many places across files and easy to miss if we refactor.
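The duplicated filter in question is `_.name.startsWith("_")`, which hides internal backend operators from the generated API. One possible way to centralize it, as the reviewer asks (the constant and predicate names here are hypothetical, not from the PR):

```scala
object ApiFilterSketch {
  // Single source of truth for the "internal operator" convention,
  // instead of repeating the string literal across files.
  val InternalApiPrefix: String = "_"

  def isPublicApi(name: String): Boolean = !name.startsWith(InternalApiPrefix)
}
```

Call sites would then read `classMethods.filter(m => ApiFilterSketch.isPublicApi(m.name))`, so a change to the convention happens in exactly one place.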




[GitHub] nswamy commented on a change in pull request #10991: [MXNET-386] API Docs Generation

2018-05-22 Thread GitBox
nswamy commented on a change in pull request #10991: [MXNET-386] API Docs 
Generation
URL: https://github.com/apache/incubator-mxnet/pull/10991#discussion_r189994831
 
 

 ##
 File path: 
scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
 ##
 @@ -0,0 +1,190 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.mxnet
+
+import org.apache.mxnet.init.Base._
+
+import scala.collection.mutable.ListBuffer
+
+private[mxnet] object APIDocGenerator{
+  case class traitArg(argName : String, argType : String, argDesc : String, 
isOptional : Boolean)
+  case class traitFunction(name : String, desc : String,
 
 Review comment:
   traitFunction -> classMethod




[GitHub] nswamy commented on a change in pull request #10991: [MXNET-386] API Docs Generation

2018-05-22 Thread GitBox
nswamy commented on a change in pull request #10991: [MXNET-386] API Docs 
Generation
URL: https://github.com/apache/incubator-mxnet/pull/10991#discussion_r189994715
 
 

 ##
 File path: 
scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
 ##
 @@ -0,0 +1,190 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.mxnet
+
+import org.apache.mxnet.init.Base._
+
+import scala.collection.mutable.ListBuffer
+
+private[mxnet] object APIDocGenerator{
+  case class traitArg(argName : String, argType : String, argDesc : String, 
isOptional : Boolean)
 
 Review comment:
   traitArg -> classArg?




[GitHub] nswamy commented on a change in pull request #10991: [MXNET-386] API Docs Generation

2018-05-22 Thread GitBox
nswamy commented on a change in pull request #10991: [MXNET-386] API Docs 
Generation
URL: https://github.com/apache/incubator-mxnet/pull/10991#discussion_r190002528
 
 

 ##
 File path: 
scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
 ##
 @@ -0,0 +1,190 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.mxnet
+
+import org.apache.mxnet.init.Base._
+
+import scala.collection.mutable.ListBuffer
+
+private[mxnet] object APIDocGenerator{
+  case class traitArg(argName : String, argType : String, argDesc : String, 
isOptional : Boolean)
+  case class traitFunction(name : String, desc : String,
+   listOfArgs: List[traitArg], returnType : String)
+
+
+  def main(args: Array[String]) : Unit = {
+val FILE_PATH = args(0)
+absClassGen(FILE_PATH, true)
+absClassGen(FILE_PATH, false)
+  }
+
+  def absClassGen(FILE_PATH : String, isSymbol : Boolean) : Unit = {
+// scalastyle:off
+val traitFunctions = initSymbolModule(isSymbol)
+val traitfuncs = 
traitFunctions.filterNot(_.name.startsWith("_")).map(traitfunction => {
+  val scalaDoc = ScalaDocGen(traitfunction)
+  val traitBody = defBodyGen(traitfunction, isSymbol)
+  s"$scalaDoc\n$traitBody"
+})
+val packageName = if (isSymbol) "SymbolAPIBase" else "NDArrayAPIBase"
+val apacheLicence = "/*\n* Licensed to the Apache Software Foundation 
(ASF) under one or more\n* contributor license agreements.  See the NOTICE file 
distributed with\n* this work for additional information regarding copyright 
ownership.\n* The ASF licenses this file to You under the Apache License, 
Version 2.0\n* (the \"License\"); you may not use this file except in 
compliance with\n* the License.  You may obtain a copy of the License at\n*\n*  
  http://www.apache.org/licenses/LICENSE-2.0\n*\n* Unless required by 
applicable law or agreed to in writing, software\n* distributed under the 
License is distributed on an \"AS IS\" BASIS,\n* WITHOUT WARRANTIES OR 
CONDITIONS OF ANY KIND, either express or implied.\n* See the License for the 
specific language governing permissions and\n* limitations under the 
License.\n*/\n"
+val scalaStyle = "// scalastyle:off"
+val packageDef = "package org.apache.mxnet"
+val absClassDef = s"abstract class $packageName"
+val finalStr = s"$apacheLicence\n$scalaStyle\n$packageDef\n$absClassDef 
{\n${traitfuncs.mkString("\n")}\n}"
+import java.io._
+val pw = new PrintWriter(new File(FILE_PATH + s"$packageName.scala"))
+pw.write(finalStr)
+pw.close()
+  }
+
+  // Generate ScalaDoc type
+  def ScalaDocGen(traitFunc : traitFunction) : String = {
+val desc = traitFunc.desc.split("\n").map({ currStr =>
+  s"  * $currStr"
+})
+val params = traitFunc.listOfArgs.map({ traitarg =>
+  val currArgName = traitarg.argName match {
+case "var" => "vari"
+case "type" => "typeOf"
+case _ => traitarg.argName
+  }
+  s"  * @param $currArgName\t\t${traitarg.argDesc}"
+})
+val returnType = s"  * @return ${traitFunc.returnType}"
+s"  /**\n${desc.mkString("\n")}\n${params.mkString("\n")}\n$returnType\n  
*/"
+  }
+
+  def defBodyGen(traitFunc : traitFunction, isSymbol : Boolean) : String = {
 
 Review comment:
   defBodyGen -> generateAPISignature
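The method being renamed builds each generated `def` signature, wrapping optional backend arguments in `Option[...] = None` as the diff does. A minimal runnable sketch under the suggested name (the `Arg` model is simplified from the PR's case classes):

```scala
object SignatureSketch {
  // Simplified stand-in for the PR's argument case class.
  case class Arg(argName: String, argType: String, isOptional: Boolean)

  // Suggested rename of defBodyGen: emits a Scala method signature string,
  // with optional arguments wrapped in Option[...] defaulting to None.
  def generateAPISignature(name: String, args: List[Arg], returnType: String): String = {
    val argDefs = args.map { a =>
      if (a.isOptional) s"${a.argName} : Option[${a.argType}] = None"
      else s"${a.argName} : ${a.argType}"
    }
    s"def $name (${argDefs.mkString(", ")}) : $returnType"
  }
}
```

For a hypothetical operator with one required and one optional argument, this produces a signature like `def softmax (data : org.apache.mxnet.Symbol, axis : Option[Int] = None) : org.apache.mxnet.Symbol`.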




[GitHub] nswamy commented on a change in pull request #10991: [MXNET-386] API Docs Generation

2018-05-22 Thread GitBox
nswamy commented on a change in pull request #10991: [MXNET-386] API Docs 
Generation
URL: https://github.com/apache/incubator-mxnet/pull/10991#discussion_r189362728
 
 

 ##
 File path: 
scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
 ##
 @@ -0,0 +1,185 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.mxnet
+
+import org.apache.mxnet.init.Base._
+
+import scala.collection.mutable.ListBuffer
+
 
 Review comment:
   Add comments on what this program does




[GitHub] nswamy commented on a change in pull request #10991: [MXNET-386] API Docs Generation

2018-05-22 Thread GitBox
nswamy commented on a change in pull request #10991: [MXNET-386] API Docs 
Generation
URL: https://github.com/apache/incubator-mxnet/pull/10991#discussion_r18607
 
 

 ##
 File path: 
scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
 ##
 @@ -0,0 +1,190 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.mxnet
+
+import org.apache.mxnet.init.Base._
+
+import scala.collection.mutable.ListBuffer
+
+private[mxnet] object APIDocGenerator{
+  case class traitArg(argName : String, argType : String, argDesc : String, isOptional : Boolean)
+  case class traitFunction(name : String, desc : String,
+                           listOfArgs: List[traitArg], returnType : String)
+
+
+  def main(args: Array[String]) : Unit = {
+    val FILE_PATH = args(0)
+    absClassGen(FILE_PATH, true)
+    absClassGen(FILE_PATH, false)
+  }
+
+  def absClassGen(FILE_PATH : String, isSymbol : Boolean) : Unit = {
+    // scalastyle:off
+    val traitFunctions = initSymbolModule(isSymbol)
 
 Review comment:
   traitFunctions->classMethods
   same everywhere




[GitHub] nswamy commented on a change in pull request #10991: [MXNET-386] API Docs Generation

2018-05-22 Thread GitBox
nswamy commented on a change in pull request #10991: [MXNET-386] API Docs Generation
URL: https://github.com/apache/incubator-mxnet/pull/10991#discussion_r190003013
 
 

 ##
 File path: scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
 ##
 @@ -0,0 +1,190 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.mxnet
+
+import org.apache.mxnet.init.Base._
+
+import scala.collection.mutable.ListBuffer
+
+private[mxnet] object APIDocGenerator{
+  case class traitArg(argName : String, argType : String, argDesc : String, isOptional : Boolean)
+  case class traitFunction(name : String, desc : String,
+                           listOfArgs: List[traitArg], returnType : String)
+
+
+  def main(args: Array[String]) : Unit = {
+    val FILE_PATH = args(0)
+    absClassGen(FILE_PATH, true)
+    absClassGen(FILE_PATH, false)
+  }
+
+  def absClassGen(FILE_PATH : String, isSymbol : Boolean) : Unit = {
+    // scalastyle:off
+    val traitFunctions = initSymbolModule(isSymbol)
+    val traitfuncs = traitFunctions.filterNot(_.name.startsWith("_")).map(traitfunction => {
+      val scalaDoc = ScalaDocGen(traitfunction)
+      val traitBody = defBodyGen(traitfunction, isSymbol)
+      s"$scalaDoc\n$traitBody"
+    })
+    val packageName = if (isSymbol) "SymbolAPIBase" else "NDArrayAPIBase"
+    val apacheLicence = "/*\n* Licensed to the Apache Software Foundation (ASF) under one or more\n* contributor license agreements.  See the NOTICE file distributed with\n* this work for additional information regarding copyright ownership.\n* The ASF licenses this file to You under the Apache License, Version 2.0\n* (the \"License\"); you may not use this file except in compliance with\n* the License.  You may obtain a copy of the License at\n*\n*    http://www.apache.org/licenses/LICENSE-2.0\n*\n* Unless required by applicable law or agreed to in writing, software\n* distributed under the License is distributed on an \"AS IS\" BASIS,\n* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n* See the License for the specific language governing permissions and\n* limitations under the License.\n*/\n"
+    val scalaStyle = "// scalastyle:off"
+    val packageDef = "package org.apache.mxnet"
+    val absClassDef = s"abstract class $packageName"
+    val finalStr = s"$apacheLicence\n$scalaStyle\n$packageDef\n$absClassDef {\n${traitfuncs.mkString("\n")}\n}"
+    import java.io._
+    val pw = new PrintWriter(new File(FILE_PATH + s"$packageName.scala"))
+    pw.write(finalStr)
+    pw.close()
+  }
+
+  // Generate ScalaDoc type
+  def ScalaDocGen(traitFunc : traitFunction) : String = {
+    val desc = traitFunc.desc.split("\n").map({ currStr =>
+      s"  * $currStr"
+    })
+    val params = traitFunc.listOfArgs.map({ traitarg =>
+      val currArgName = traitarg.argName match {
+        case "var" => "vari"
+        case "type" => "typeOf"
+        case _ => traitarg.argName
+      }
+      s"  * @param $currArgName\t\t${traitarg.argDesc}"
+    })
+    val returnType = s"  * @return ${traitFunc.returnType}"
+    s"  /**\n${desc.mkString("\n")}\n${params.mkString("\n")}\n$returnType\n  */"
+  }
+
+  def defBodyGen(traitFunc : traitFunction, isSymbol : Boolean) : String = {
+    var argDef = ListBuffer[String]()
+    traitFunc.listOfArgs.foreach(traitarg => {
+      val currArgName = traitarg.argName match {
+        case "var" => "vari"
+        case "type" => "typeOf"
+        case _ => traitarg.argName
+      }
+      if (traitarg.isOptional) {
+        argDef += s"$currArgName : Option[${traitarg.argType}] = None"
+      }
+      else {
+        argDef += s"$currArgName : ${traitarg.argType}"
+      }
+    })
+    if (isSymbol) {
+      argDef += "name : String = null"
+      argDef += "attr : Map[String, String] = null"
+    }
+    s"def ${traitFunc.name} (${argDef.mkString(", ")}) : ${traitFunc.returnType}"
+  }
+
+
+  // Convert C++ Types to Scala Types
+  def typeConversion(in : String, argType : String = "", returnType : String) : String = {
 
 Review comment:
   please use this in the NDArrayAPI/SymbolAPI class and remove the redundant code
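As an aside for readers following the review: the signature-generation step in `defBodyGen` above is language-independent string templating. A minimal Python sketch of the same idea — turning argument metadata into a typed Scala signature string — is below; the `Arg`/`def_body_gen` names and the sample operator are illustrative, not part of the actual APIDocGenerator.

```python
# Illustrative re-implementation of the defBodyGen idea from the review:
# render a typed method-signature string from argument metadata.
from dataclasses import dataclass
from typing import List

@dataclass
class Arg:
    name: str
    type: str
    is_optional: bool

def def_body_gen(func_name: str, args: List[Arg], return_type: str) -> str:
    rendered = []
    for a in args:
        # Optional arguments become Option[...] = None, as in the Scala generator.
        if a.is_optional:
            rendered.append(f"{a.name} : Option[{a.type}] = None")
        else:
            rendered.append(f"{a.name} : {a.type}")
    return f"def {func_name} ({', '.join(rendered)}) : {return_type}"

sig = def_body_gen("activation",
                   [Arg("data", "org.apache.mxnet.Symbol", False),
                    Arg("act_type", "String", True)],
                   "org.apache.mxnet.Symbol")
print(sig)
# def activation (data : org.apache.mxnet.Symbol, act_type : Option[String] = None) : org.apache.mxnet.Symbol
```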

--

[incubator-mxnet] branch master updated: add gluon model summary (#10989)

2018-05-22 Thread jxie
This is an automated email from the ASF dual-hosted git repository.

jxie pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 022f238  add gluon model summary (#10989)
022f238 is described below

commit 022f23885bcd90b69448f25edf507bd89cd46caf
Author: Sheng Zha 
AuthorDate: Tue May 22 10:57:41 2018 -0700

add gluon model summary (#10989)

* add hook api

* add block.summary

* remove count
---
 python/mxnet/gluon/block.py | 172 +++-
 python/mxnet/gluon/rnn/rnn_layer.py |   4 +
 python/mxnet/gluon/utils.py |  37 
 tests/python/unittest/test_gluon.py |  77 
 4 files changed, 287 insertions(+), 3 deletions(-)

diff --git a/python/mxnet/gluon/block.py b/python/mxnet/gluon/block.py
index 4779484..dbe3c5e 100644
--- a/python/mxnet/gluon/block.py
+++ b/python/mxnet/gluon/block.py
@@ -31,7 +31,7 @@ from ..symbol import Symbol
 from ..ndarray import NDArray
 from .. import name as _name
 from .parameter import Parameter, ParameterDict, DeferredInitializationError
-from .utils import _indent, _brief_print_list
+from .utils import _indent, _brief_print_list, HookHandle
 
 
 class _BlockScope(object):
@@ -173,6 +173,8 @@ class Block(object):
         self._scope = _BlockScope(self)
         self._children = OrderedDict()
         self._reg_params = {}
+        self._forward_hooks = OrderedDict()
+        self._forward_pre_hooks = OrderedDict()
 
     def __repr__(self):
         s = '{name}(\n{modstr}\n)'
@@ -355,7 +357,6 @@ class Block(object):
                 name, filename, _brief_print_list(self._params.keys(
             params[name]._load_init(loaded[name], ctx)
 
-
     def register_child(self, block, name=None):
         """Registers block as a child of self. :py:class:`Block` s assigned to self as
         attributes will be registered automatically."""
@@ -363,6 +364,61 @@ class Block(object):
             name = str(len(self._children))
         self._children[name] = block
 
+    def register_forward_pre_hook(self, hook):
+        r"""Registers a forward pre-hook on the block.
+
+        The hook function is called immediately before :func:`forward`.
+        It should not modify the input or output.
+
+        Parameters
+        ----------
+        hook : callable
+            The forward hook function of form `hook(block, input) -> None`.
+
+        Returns
+        -------
+        :class:`mxnet.gluon.utils.HookHandle`
+        """
+        handle = HookHandle()
+        handle.attach(self._forward_pre_hooks, hook)
+        return handle
+
+    def register_forward_hook(self, hook):
+        r"""Registers a forward hook on the block.
+
+        The hook function is called immediately after :func:`forward`.
+        It should not modify the input or output.
+
+        Parameters
+        ----------
+        hook : callable
+            The forward hook function of form `hook(block, input, output) -> None`.
+
+        Returns
+        -------
+        :class:`mxnet.gluon.utils.HookHandle`
+        """
+        handle = HookHandle()
+        handle.attach(self._forward_hooks, hook)
+        return handle
+
+    def apply(self, fn):
+        r"""Applies ``fn`` recursively to every child block as well as self.
+
+        Parameters
+        ----------
+        fn : callable
+            Function to be applied to each submodule, of form `fn(block)`.
+
+        Returns
+        -------
+        this block
+        """
+        for cld in self._children.values():
+            cld.apply(fn)
+        fn(self)
+        return self
+
     def initialize(self, init=initializer.Uniform(), ctx=None, verbose=False,
                    force_reinit=False):
         """Initializes :py:class:`Parameter` s of this :py:class:`Block` and its children.
@@ -411,7 +467,15 @@ class Block(object):
 
     def __call__(self, *args):
         """Calls forward. Only accepts positional arguments."""
-        return self.forward(*args)
+        for hook in self._forward_pre_hooks.values():
+            hook(self, args)
+
+        out = self.forward(*args)
+
+        for hook in self._forward_hooks.values():
+            hook(self, args, out)
+
+        return out
 
     def forward(self, *args):
         """Overrides to implement forward computation using :py:class:`NDArray`. Only
@@ -425,6 +489,105 @@ class Block(object):
         # pylint: disable= invalid-name
         raise NotImplementedError
 
+    def summary(self, *inputs):
+        """Print the summary of the model's output and parameters.
+
+        The network must have been initialized, and must not have been hybridized.
+
+        Parameters
+        ----------
+        inputs : object
+            Any input that the model supports. For any tensor in the input, only
+            :class:`mxnet.ndarray.NDArray` is supported.

[GitHub] piiswrong closed pull request #10989: add gluon model summary

2018-05-22 Thread GitBox
piiswrong closed pull request #10989: add gluon model summary
URL: https://github.com/apache/incubator-mxnet/pull/10989
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/python/mxnet/gluon/block.py b/python/mxnet/gluon/block.py
index 4779484ec3e..dbe3c5e032b 100644
--- a/python/mxnet/gluon/block.py
+++ b/python/mxnet/gluon/block.py
@@ -31,7 +31,7 @@
 from ..ndarray import NDArray
 from .. import name as _name
 from .parameter import Parameter, ParameterDict, DeferredInitializationError
-from .utils import _indent, _brief_print_list
+from .utils import _indent, _brief_print_list, HookHandle
 
 
 class _BlockScope(object):
@@ -173,6 +173,8 @@ def __init__(self, prefix=None, params=None):
         self._scope = _BlockScope(self)
         self._children = OrderedDict()
         self._reg_params = {}
+        self._forward_hooks = OrderedDict()
+        self._forward_pre_hooks = OrderedDict()
 
     def __repr__(self):
         s = '{name}(\n{modstr}\n)'
@@ -355,7 +357,6 @@ def load_params(self, filename, ctx=None, allow_missing=False,
                 name, filename, _brief_print_list(self._params.keys(
             params[name]._load_init(loaded[name], ctx)
 
-
     def register_child(self, block, name=None):
         """Registers block as a child of self. :py:class:`Block` s assigned to self as
         attributes will be registered automatically."""
@@ -363,6 +364,61 @@ def register_child(self, block, name=None):
             name = str(len(self._children))
         self._children[name] = block
 
+    def register_forward_pre_hook(self, hook):
+        r"""Registers a forward pre-hook on the block.
+
+        The hook function is called immediately before :func:`forward`.
+        It should not modify the input or output.
+
+        Parameters
+        ----------
+        hook : callable
+            The forward hook function of form `hook(block, input) -> None`.
+
+        Returns
+        -------
+        :class:`mxnet.gluon.utils.HookHandle`
+        """
+        handle = HookHandle()
+        handle.attach(self._forward_pre_hooks, hook)
+        return handle
+
+    def register_forward_hook(self, hook):
+        r"""Registers a forward hook on the block.
+
+        The hook function is called immediately after :func:`forward`.
+        It should not modify the input or output.
+
+        Parameters
+        ----------
+        hook : callable
+            The forward hook function of form `hook(block, input, output) -> None`.
+
+        Returns
+        -------
+        :class:`mxnet.gluon.utils.HookHandle`
+        """
+        handle = HookHandle()
+        handle.attach(self._forward_hooks, hook)
+        return handle
+
+    def apply(self, fn):
+        r"""Applies ``fn`` recursively to every child block as well as self.
+
+        Parameters
+        ----------
+        fn : callable
+            Function to be applied to each submodule, of form `fn(block)`.
+
+        Returns
+        -------
+        this block
+        """
+        for cld in self._children.values():
+            cld.apply(fn)
+        fn(self)
+        return self
+
     def initialize(self, init=initializer.Uniform(), ctx=None, verbose=False,
                    force_reinit=False):
         """Initializes :py:class:`Parameter` s of this :py:class:`Block` and its children.
@@ -411,7 +467,15 @@ def cast(self, dtype):
 
     def __call__(self, *args):
         """Calls forward. Only accepts positional arguments."""
-        return self.forward(*args)
+        for hook in self._forward_pre_hooks.values():
+            hook(self, args)
+
+        out = self.forward(*args)
+
+        for hook in self._forward_hooks.values():
+            hook(self, args, out)
+
+        return out
 
     def forward(self, *args):
         """Overrides to implement forward computation using :py:class:`NDArray`. Only
@@ -425,6 +489,105 @@ def forward(self, *args):
         # pylint: disable= invalid-name
         raise NotImplementedError
 
+    def summary(self, *inputs):
+        """Print the summary of the model's output and parameters.
+
+        The network must have been initialized, and must not have been hybridized.
+
+        Parameters
+        ----------
+        inputs : object
+            Any input that the model supports. For any tensor in the input, only
+            :class:`mxnet.ndarray.NDArray` is supported.
+        """
+        summary = OrderedDict()
+        hooks = []
+
+        def _get_shape_str(args):
+            def flatten(args):
+                if not isinstance(args, (list, tuple)):
+                    return [args], int(0)
+                flat = []
+                fmts = []
+                for i in args:
+                    arg, fmt = flatten(i)
+
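The diff is cut off above inside `flatten`, but the recursion it starts is a standard nested-sequence flatten. A standalone sketch is below; the `fmt` bookkeeping shown here (leaf markers mirroring the nesting) is a guess at the truncated part and is purely illustrative.

```python
# Recursively flatten nested lists/tuples, as in the truncated
# _get_shape_str helper: return the leaves plus a format tree
# that records the original nesting (0 marks a leaf).
def flatten(args):
    if not isinstance(args, (list, tuple)):
        return [args], 0
    flat, fmts = [], []
    for i in args:
        arg, fmt = flatten(i)
        flat.extend(arg)
        fmts.append(fmt)
    return flat, fmts

leaves, fmt = flatten([1, (2, [3, 4]), 5])
print(leaves)  # [1, 2, 3, 4, 5]
print(fmt)     # [0, [0, [0, 0]], 0]
```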

[GitHub] haojin2 commented on a change in pull request #11021: [MXNET-380] [WIP] count_include_pad argument for Avg Pooling

2018-05-22 Thread GitBox
haojin2 commented on a change in pull request #11021: [MXNET-380] [WIP] count_include_pad argument for Avg Pooling
URL: https://github.com/apache/incubator-mxnet/pull/11021#discussion_r189991965
 
 

 ##
 File path: src/operator/nn/pooling-inl.h
 ##
 @@ -50,6 +50,7 @@ struct PoolingParam : public dmlc::Parameter<PoolingParam> {
   bool global_pool;
   bool cudnn_off;
   dmlc::optional<int> p_value;
+  dmlc::optional<bool> count_include_pad;
 
 Review comment:
   We want to ensure forward compatibility here: if this is not an optional field, the JSON file generated for the symbol will contain an extra field, which we think may confuse users of earlier versions.




[GitHub] haojin2 commented on a change in pull request #11021: [MXNET-380] [WIP] count_include_pad argument for Avg Pooling

2018-05-22 Thread GitBox
haojin2 commented on a change in pull request #11021: [MXNET-380] [WIP] count_include_pad argument for Avg Pooling
URL: https://github.com/apache/incubator-mxnet/pull/11021#discussion_r189985671
 
 

 ##
 File path: src/operator/nn/pooling-inl.h
 ##
 @@ -81,7 +82,10 @@ struct PoolingParam : public dmlc::Parameter<PoolingParam> {
     .describe("Pad for pooling: (y, x) or (d, y, x). Defaults to no padding.");
 
     DMLC_DECLARE_FIELD(p_value).set_default(dmlc::optional<int>())
-    .describe("Value of p for Lp pooling, can be 1 or 2, required for Lp Pooling");
+    .describe("Value of p for Lp pooling, can be 1 or 2, required for Lp Pooling.");
+
+    DMLC_DECLARE_FIELD(count_include_pad).set_default(dmlc::optional<bool>())
 
 Review comment:
   We're using the same name as PyTorch: https://pytorch.org/docs/master/nn.html?highlight=pool2d#torch.nn.AvgPool2d
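For context on what the flag changes numerically, here is a hand-rolled 1-D average-pooling sketch following the `count_include_pad` semantics of the PyTorch parameter linked above; this is an illustration, not the MXNet operator.

```python
# 1-D average pooling with zero padding, stride == kernel, demonstrating
# count_include_pad: whether padded zeros count toward the averaging denominator.
def avg_pool1d(data, kernel, pad, count_include_pad):
    padded = [0.0] * pad + list(data) + [0.0] * pad
    out = []
    for start in range(0, len(padded) - kernel + 1, kernel):
        window = padded[start:start + kernel]
        if count_include_pad:
            denom = kernel  # padded zeros count toward the average
        else:
            # only positions that overlap the real input count
            denom = sum(1 for j in range(start, start + kernel)
                        if pad <= j < pad + len(data))
        out.append(sum(window) / denom)
    return out

data = [2.0, 4.0, 6.0]
print(avg_pool1d(data, kernel=2, pad=1, count_include_pad=True))   # [1.0, 5.0]
print(avg_pool1d(data, kernel=2, pad=1, count_include_pad=False))  # [2.0, 5.0]
```

The first window is [0.0, 2.0]: with padding counted the average is 2.0/2 = 1.0, without it the single real element gives 2.0.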



