[GitHub] [incubator-mxnet] sandeep-krishnamurthy commented on issue #15298: Fix Cached_op with static_shape=true

2019-06-26 Thread GitBox
sandeep-krishnamurthy commented on issue #15298: Fix Cached_op with 
static_shape=true
URL: https://github.com/apache/incubator-mxnet/pull/15298#issuecomment-506198977
 
 
   > @ZhennanQin Please verify the performance of this PR with our internal 
tests and NLP tests.
   > If everything is OK, I will merge this soon.
   
   @pengzhao-intel - I will be very interested to learn more about the internal 
tests and benchmark setup you have. The main motivation is to see whether some 
of these tests should be brought into the Nightly CI.




[GitHub] [incubator-mxnet] roywei opened a new pull request #15380: [backport 1.5.x]Fix Cached_op with static_shape=true (#15298)

2019-06-26 Thread GitBox
roywei opened a new pull request #15380: [backport 1.5.x]Fix Cached_op with 
static_shape=true (#15298)
URL: https://github.com/apache/incubator-mxnet/pull/15380
 
 
   Cherry-pick of
   
https://github.com/apache/incubator-mxnet/pull/15298#pullrequestreview-254718992
   
   as the master nightly CI is failing.




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, transpose, stack, split, log2, rint and radians

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, 
transpose, stack, split, log2, rint and radians
URL: https://github.com/apache/incubator-mxnet/pull/15370#discussion_r298007410
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -173,3 +173,112 @@ def _np_cumsum(a, axis=None, dtype=None, out=None):
 `axis` is not None or `a` is a 1-d array.
 """
 pass
+
+
+def _np_mean(a, axis=None, dtype=None, out=None, keepdims=None):
+"""
+Compute the arithmetic mean along the specified axis.
+Returns the average of the array elements.
+The average is taken over the flattened array by default, otherwise over 
the specified axis.
+
+Parameters 
+--
+a : `NDArray`
 
 Review comment:
   Change all `NDArray` to `ndarray`.




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, transpose, stack, split, log2, rint and radians

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, 
transpose, stack, split, log2, rint and radians
URL: https://github.com/apache/incubator-mxnet/pull/15370#discussion_r298007619
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -173,3 +173,112 @@ def _np_cumsum(a, axis=None, dtype=None, out=None):
 `axis` is not None or `a` is a 1-d array.
 """
 pass
+
+
+def _np_mean(a, axis=None, dtype=None, out=None, keepdims=None):
+"""
+Compute the arithmetic mean along the specified axis.
+Returns the average of the array elements.
+The average is taken over the flattened array by default, otherwise over 
the specified axis.
+
+Parameters 
+--
+a : `NDArray`
+NDArray containing numbers whose mean is desired.
+axis : None or int or tuple of ints, optional
+Axis or axes along which the means are computed. The default is to 
compute the mean of the flattened array.
+If this is a tuple of ints, a mean is performed over multiple axes, 
+instead of a single axis or all the axes as before.
+dtype : data-type, optional
+Type to use in computing the mean. For integer inputs, the default is 
float32;
+for floating point inputs, it is the same as the input dtype.
+keepdims : bool, optional
+If this is set to True, the axes which are reduced are left in the 
result
+as dimensions with size one. With this option, the result will 
broadcast correctly
+against the input array.
+If the default value is passed, then keepdims will not be passed 
through to the mean
+method of sub-classes of ndarray, however any non-default value will 
be. If the sub-class
+method does not implement keepdims any exceptions will be raised.
+initial : scalar, deprecated
+The initial value to start with. Default to None.
+out : ndarray, optional
+Alternate output array in which to place the result. The default is 
None; if provided,
+it must have the same shape as the expected output, but the type will 
be cast if necessary. 
+name : string, optional
+Name of the resulting symbol
+
+Returns
+---
+m : ndarray, see dtype parameter above
+If out=None, returns a new array containing the mean values,
+otherwise a reference to the output array is returned.
+
+Notes
+-
+This function differs from the original `numpy.mean
+`_ in
+the following way(s):
+
+- only NDArray is accepted as valid input, python iterables are not 
supported
+- default data type for integer input is float32
+
+Examples
+
+>>> a = np.array([[1, 2], [3, 4]])
+>>> np.mean(a)
+array(2.5, dtype=float32)
+>>> a = np.zeros((2, 512*512), dtype=np.float32)
+>>> a[0,:] = 1.0
+>>> a[1,:] = 0.1
+>>> np.mean(a)
+array(0.55, dtype=float32)
+>>> np.mean(a, dtype=np.float64)
+array(0.55)
+"""
+pass
+
+
+def _np_transpose(a, axes=None):
+"""
+Permute the dimensions of an array.
+
+Parameters
+--
+a : NDArray
+Input array.
+axes : list of ints, optional
+By default, reverse the dimensions,
+otherwise permute the axes according to the values given.
+out : ndarray, optional
+Alternate output array in which to place the result. The default is 
None; if provided,
+it must have the same shape as the expected output, but the type will 
be cast if necessary. 
+name : string, optional
+Name of the resulting symbol
+
+Returns
+---
+p : ndarray
+a with its axes permuted. A view is returned whenever possible.
 
 Review comment:
   view is not supported.
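   For illustration, a minimal sketch of the copy semantics this comment implies
   (assuming `from mxnet import numpy as np` and that basic indexing assignment
   works on the numpy branch):
   
    ```python
    from mxnet import numpy as np
    
    a = np.array([[1., 2.], [3., 4.]])
    t = np.transpose(a)   # a copy in mxnet.numpy, not a view
    a[0, 0] = 100.        # mutate the original
    print(t[0, 0])        # the transposed copy is unchanged: 1.0
    ```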




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, transpose, stack, split, log2, rint and radians

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, 
transpose, stack, split, log2, rint and radians
URL: https://github.com/apache/incubator-mxnet/pull/15370#discussion_r298007848
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -173,3 +173,112 @@ def _np_cumsum(a, axis=None, dtype=None, out=None):
 `axis` is not None or `a` is a 1-d array.
 """
 pass
+
+
+def _np_mean(a, axis=None, dtype=None, out=None, keepdims=None):
+"""
+Compute the arithmetic mean along the specified axis.
+Returns the average of the array elements.
+The average is taken over the flattened array by default, otherwise over 
the specified axis.
+
+Parameters 
+--
+a : `NDArray`
+NDArray containing numbers whose mean is desired.
+axis : None or int or tuple of ints, optional
+Axis or axes along which the means are computed. The default is to 
compute the mean of the flattened array.
+If this is a tuple of ints, a mean is performed over multiple axes, 
+instead of a single axis or all the axes as before.
+dtype : data-type, optional
+Type to use in computing the mean. For integer inputs, the default is 
float32;
+for floating point inputs, it is the same as the input dtype.
+keepdims : bool, optional
+If this is set to True, the axes which are reduced are left in the 
result
+as dimensions with size one. With this option, the result will 
broadcast correctly
+against the input array.
+If the default value is passed, then keepdims will not be passed 
through to the mean
+method of sub-classes of ndarray, however any non-default value will 
be. If the sub-class
+method does not implement keepdims any exceptions will be raised.
+initial : scalar, deprecated
+The initial value to start with. Default to None.
+out : ndarray, optional
+Alternate output array in which to place the result. The default is 
None; if provided,
+it must have the same shape as the expected output, but the type will 
be cast if necessary. 
+name : string, optional
+Name of the resulting symbol
+
+Returns
+---
+m : ndarray, see dtype parameter above
+If out=None, returns a new array containing the mean values,
+otherwise a reference to the output array is returned.
+
+Notes
+-
+This function differs from the original `numpy.mean
+`_ in
+the following way(s):
+
+- only NDArray is accepted as valid input, python iterables are not 
supported
+- default data type for integer input is float32
+
+Examples
+
+>>> a = np.array([[1, 2], [3, 4]])
+>>> np.mean(a)
+array(2.5, dtype=float32)
+>>> a = np.zeros((2, 512*512), dtype=np.float32)
+>>> a[0,:] = 1.0
+>>> a[1,:] = 0.1
+>>> np.mean(a)
+array(0.55, dtype=float32)
+>>> np.mean(a, dtype=np.float64)
+array(0.55)
+"""
+pass
+
+
+def _np_transpose(a, axes=None):
+"""
+Permute the dimensions of an array.
 
 Review comment:
   Same as above, add the real signature here.




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, transpose, stack, split, log2, rint and radians

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, 
transpose, stack, split, log2, rint and radians
URL: https://github.com/apache/incubator-mxnet/pull/15370#discussion_r298007318
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -173,3 +173,112 @@ def _np_cumsum(a, axis=None, dtype=None, out=None):
 `axis` is not None or `a` is a 1-d array.
 """
 pass
+
+
+def _np_mean(a, axis=None, dtype=None, out=None, keepdims=None):
+"""
 
 Review comment:
   Add the real signature here: `mean(a, axis=None, dtype=None, out=None, 
keepdims=None)` and an empty line after it.
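   A minimal sketch of the requested header, following the numpydoc style already
   used in this file (the rest of the docstring is elided):
   
    ```python
    def _np_mean(a, axis=None, dtype=None, out=None, keepdims=None):
        """
        mean(a, axis=None, dtype=None, out=None, keepdims=None)
    
        Compute the arithmetic mean along the specified axis.
        ...
        """
        pass
    ```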




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, transpose, stack, split, log2, rint and radians

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, 
transpose, stack, split, log2, rint and radians
URL: https://github.com/apache/incubator-mxnet/pull/15370#discussion_r298007708
 
 

 ##
 File path: python/mxnet/ndarray/numpy/_op.py
 ##
 @@ -186,7 +186,7 @@ def stack(arrays, axis=0, out=None):
 
 Parameters
 --
-arrays : sequence of array_like
+arrays : sequence of ndarray
 
 Review comment:
   ndarray -> ndarrays




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, transpose, stack, split, log2, rint and radians

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, 
transpose, stack, split, log2, rint and radians
URL: https://github.com/apache/incubator-mxnet/pull/15370#discussion_r298008117
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -173,3 +173,112 @@ def _np_cumsum(a, axis=None, dtype=None, out=None):
 `axis` is not None or `a` is a 1-d array.
 """
 pass
+
+
+def _np_mean(a, axis=None, dtype=None, out=None, keepdims=None):
+"""
+Compute the arithmetic mean along the specified axis.
+Returns the average of the array elements.
+The average is taken over the flattened array by default, otherwise over 
the specified axis.
+
+Parameters 
+--
+a : `NDArray`
+NDArray containing numbers whose mean is desired.
+axis : None or int or tuple of ints, optional
+Axis or axes along which the means are computed. The default is to 
compute the mean of the flattened array.
+If this is a tuple of ints, a mean is performed over multiple axes, 
+instead of a single axis or all the axes as before.
+dtype : data-type, optional
+Type to use in computing the mean. For integer inputs, the default is 
float32;
+for floating point inputs, it is the same as the input dtype.
+keepdims : bool, optional
+If this is set to True, the axes which are reduced are left in the 
result
+as dimensions with size one. With this option, the result will 
broadcast correctly
+against the input array.
+If the default value is passed, then keepdims will not be passed 
through to the mean
+method of sub-classes of ndarray, however any non-default value will 
be. If the sub-class
+method does not implement keepdims any exceptions will be raised.
+initial : scalar, deprecated
+The initial value to start with. Default to None.
+out : ndarray, optional
+Alternate output array in which to place the result. The default is 
None; if provided,
+it must have the same shape as the expected output, but the type will 
be cast if necessary. 
+name : string, optional
+Name of the resulting symbol
+
+Returns
+---
+m : ndarray, see dtype parameter above
+If out=None, returns a new array containing the mean values,
+otherwise a reference to the output array is returned.
+
+Notes
+-
+This function differs from the original `numpy.mean
+`_ in
+the following way(s):
+
+- only NDArray is accepted as valid input, python iterables are not 
supported
+- default data type for integer input is float32
+
+Examples
+
+>>> a = np.array([[1, 2], [3, 4]])
+>>> np.mean(a)
+array(2.5, dtype=float32)
+>>> a = np.zeros((2, 512*512), dtype=np.float32)
+>>> a[0,:] = 1.0
+>>> a[1,:] = 0.1
+>>> np.mean(a)
+array(0.55, dtype=float32)
+>>> np.mean(a, dtype=np.float64)
+array(0.55)
+"""
+pass
+
+
+def _np_transpose(a, axes=None):
+"""
+Permute the dimensions of an array.
+
+Parameters
+--
+a : NDArray
+Input array.
+axes : list of ints, optional
+By default, reverse the dimensions,
+otherwise permute the axes according to the values given.
+out : ndarray, optional
 
 Review comment:
   `out` is not in the parameter list.




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, transpose, stack, split, log2, rint and radians

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, 
transpose, stack, split, log2, rint and radians
URL: https://github.com/apache/incubator-mxnet/pull/15370#discussion_r298008419
 
 

 ##
 File path: python/mxnet/ndarray/numpy/_op.py
 ##
 @@ -882,3 +927,112 @@ def sqrt(x, out=None, **kwargs):
 This function only supports input type of float.
 """
 return _unary_func_helper(x, _npi.sqrt, _np.sqrt, out=out, **kwargs)
+
+
+# pylint: disable=line-too-long
+@set_module('mxnet.ndarray.numpy')
+def rint(x, out=None, **kwargs):
+"""
+Round elements of the array to the nearest integer.
+
+Parameters
+--
+x : ndarray or scalar
+Input array.
+out : ndarray or None
+A location into which the result is stored. If provided, it must have 
a shape that the inputs broadcast to.
+If not provided or None, a freshly-allocated array is returned.
+
+Returns
+---
+out : ndarray or scalar
+Output array is same shape and type as x. This is a scalar if x is a 
scalar.
+
+Notes
+-
+This function differs from the original `numpy.rint
 
 Review comment:
   \`numpy.rint\`




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, transpose, stack, split, log2, rint and radians

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15370: [numpy][doc-fix] mean, 
transpose, stack, split, log2, rint and radians
URL: https://github.com/apache/incubator-mxnet/pull/15370#discussion_r298008378
 
 

 ##
 File path: python/mxnet/ndarray/numpy/_op.py
 ##
 @@ -882,3 +927,112 @@ def sqrt(x, out=None, **kwargs):
 This function only supports input type of float.
 """
 return _unary_func_helper(x, _npi.sqrt, _np.sqrt, out=out, **kwargs)
+
+
+# pylint: disable=line-too-long
 
 Review comment:
   Do not add this. Make the line shorter if necessary.




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15377: [numpy][doc-fix] 
zeros_like, linspace, reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377#discussion_r298006635
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -173,3 +217,255 @@ def _np_cumsum(a, axis=None, dtype=None, out=None):
 `axis` is not None or `a` is a 1-d array.
 """
 pass
+
+
+def _np_reciprocal(x, out=None, **kwargs):
 
 Review comment:
   It seems you have changed `_np_reciprocal` to `_npi_reciprocal` in C++ and 
wrapped it in the frontend, so there is no need to add this here.
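   For readers unfamiliar with the pattern, "wrapped it in the frontend" means a
   thin Python function dispatching to the internal `_npi` operator, analogous to
   the `sqrt` wrapper quoted elsewhere in this thread. A sketch, assuming the
   module-level names (`set_module`, `_unary_func_helper`, `_npi`, `_np`) already
   imported in `python/mxnet/ndarray/numpy/_op.py`:
   
    ```python
    @set_module('mxnet.ndarray.numpy')
    def reciprocal(x, out=None, **kwargs):
        """Return the reciprocal of the argument, element-wise (1/x)."""
        # Mirrors the existing sqrt wrapper shown earlier in this thread.
        return _unary_func_helper(x, _npi.reciprocal, _np.reciprocal, out=out, **kwargs)
    ```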




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15377: [numpy][doc-fix] 
zeros_like, linspace, reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377#discussion_r298001932
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -71,19 +71,63 @@ def _np_ones_like(a):
 pass
 
 
-def _np_zeros_like(a):
-"""Return an array of zeros with the same shape and type as a given array.
+def _np_zeros_like(a, dtype=None, **kwargs):
+"""
+zeros_like(a, dtype=None, order='C', subok=True)
+
+Return an array of zeros with the same shape and type as a given array.
 
 Parameters
 --
 a : ndarray
-The shape and data-type of `a` define these same attributes of
+The shape of `a` define these same attributes of
 the returned array.
-
+dtype : data-type, optional
 
 Review comment:
   Delete dtype and order.




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15377: [numpy][doc-fix] 
zeros_like, linspace, reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377#discussion_r298004544
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -71,19 +71,63 @@ def _np_ones_like(a):
 pass
 
 
-def _np_zeros_like(a):
-"""Return an array of zeros with the same shape and type as a given array.
+def _np_zeros_like(a, dtype=None, **kwargs):
+"""
+zeros_like(a, dtype=None, order='C', subok=True)
+
+Return an array of zeros with the same shape and type as a given array.
 
 Parameters
 --
 a : ndarray
-The shape and data-type of `a` define these same attributes of
+The shape of `a` define these same attributes of
 the returned array.
-
+dtype : data-type, optional
+Overrides the data type of the result.
+.. versionadded:: 1.6.0
+order : {'C'}, optional, default: C
+Store multi-dimensional data in row-major
+(C-style) in memory. Note that the column-major is not supported yet.
+
 Returns
 ---
 out : ndarray
 Array of zeros with the same shape and type as `a`.
+
+
+See Also
+
+empty_like : Return an empty array with shape and type of input.
+ones_like : Return an array of ones with shape and type of input.
+full_like : Return a new array with shape of input filled with value.
+zeros : Return a new array setting values to zero.
+
+Examples
+
+>>> x = np.arange(6)
+>>> x = x.reshape((2, 3))
+>>> x
+array([[0, 1, 2],
+   [3, 4, 5]])
+>>> np.zeros_like(x)
+array([[0, 0, 0],
+   [0, 0, 0]])
+>>> y = np.arange(3, dtype=float)
+>>> y
+array([0., 1., 2.])
+>>> np.zeros_like(y)
+array([0.,  0.,  0.])
+
+Notes
+-
+`ctx` argument is not supported now.
 
 Review comment:
   Change this to "the output `ndarray` has the same `ctx` as the input 
`ndarray`."
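   A quick check of that behavior might look like this (a sketch; the `context`
   attribute name is assumed from the base NDArray API):
   
    ```python
    from mxnet import numpy as np
    
    x = np.zeros((2, 3))          # created on the default context, e.g. cpu(0)
    z = np.zeros_like(x)
    print(x.context, z.context)   # expected to be the same context
    ```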




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15377: [numpy][doc-fix] 
zeros_like, linspace, reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377#discussion_r298006831
 
 

 ##
 File path: python/mxnet/symbol/numpy/_symbol.py
 ##
 @@ -1555,4 +1599,157 @@ def sqrt(x, out=None, **kwargs):
 return _unary_func_helper(x, _npi.sqrt, _np.sqrt, out=out, **kwargs)
 
 
+@set_module('mxnet.symbol.numpy')
+def reciprocal(x, out=None, **kwargs):
 
 Review comment:
   Please follow this example to add the description. The input is either a 
`_Symbol` or a scalar value, not an `ndarray`.
   
https://github.com/apache/incubator-mxnet/pull/15377/files#diff-e9b526dba36aa6f2e7dbb6c4d49e9822R1584




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15377: [numpy][doc-fix] 
zeros_like, linspace, reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377#discussion_r298004573
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -71,19 +71,63 @@ def _np_ones_like(a):
 pass
 
 
-def _np_zeros_like(a):
-"""Return an array of zeros with the same shape and type as a given array.
+def _np_zeros_like(a, dtype=None, **kwargs):
+"""
+zeros_like(a, dtype=None, order='C', subok=True)
+
+Return an array of zeros with the same shape and type as a given array.
 
 Parameters
 --
 a : ndarray
-The shape and data-type of `a` define these same attributes of
+The shape of `a` define these same attributes of
 the returned array.
-
+dtype : data-type, optional
+Overrides the data type of the result.
+.. versionadded:: 1.6.0
+order : {'C'}, optional, default: C
+Store multi-dimensional data in row-major
+(C-style) in memory. Note that the column-major is not supported yet.
+
 Returns
 ---
 out : ndarray
 Array of zeros with the same shape and type as `a`.
+
+
+See Also
+
+empty_like : Return an empty array with shape and type of input.
+ones_like : Return an array of ones with shape and type of input.
+full_like : Return a new array with shape of input filled with value.
+zeros : Return a new array setting values to zero.
+
+Examples
+
+>>> x = np.arange(6)
+>>> x = x.reshape((2, 3))
+>>> x
+array([[0, 1, 2],
+   [3, 4, 5]])
+>>> np.zeros_like(x)
+array([[0, 0, 0],
+   [0, 0, 0]])
+>>> y = np.arange(3, dtype=float)
+>>> y
+array([0., 1., 2.])
+>>> np.zeros_like(y)
+array([0.,  0.,  0.])
+
+Notes
+-
+`ctx` argument is not supported now.
+
+This function differs to the original `numpy.zeros_like
 
 Review comment:
   differs to -> differs from




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15377: [numpy][doc-fix] 
zeros_like, linspace, reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377#discussion_r298001629
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -71,19 +71,63 @@ def _np_ones_like(a):
 pass
 
 
-def _np_zeros_like(a):
-"""Return an array of zeros with the same shape and type as a given array.
+def _np_zeros_like(a, dtype=None, **kwargs):
 
 Review comment:
   We do not support parameters other than `a`. Please keep the original 
signature.
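   In other words, keep the original form (sketch, body elided):
   
    ```python
    def _np_zeros_like(a):
        """Return an array of zeros with the same shape and type as a given array.
        ...
        """
        pass
    ```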




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15377: [numpy][doc-fix] 
zeros_like, linspace, reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377#discussion_r298004769
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -173,3 +217,255 @@ def _np_cumsum(a, axis=None, dtype=None, out=None):
 `axis` is not None or `a` is a 1-d array.
 """
 pass
+
+
+def _np_reciprocal(x, out=None, **kwargs):
+"""
+reciprocal(x, out=None, dtype=None)
+
+Return the reciprocal of the argument, element-wise.
+Calculates ``1/x``.
+
+Parameters
+--
+x : ndarray
+out : ndarray, None, or tuple of ndarray and None, optional
+A location into which the result is stored. If provided, it must have 
+a shape that the inputs broadcast to. If not provided or None, 
+a freshly-allocated array is returned. A tuple 
+(possible only as a keyword argument) must have length equal to 
+the number of outputs.
+
+Returns
+---
+y : ndarray
+Return array.
+
+
+Examples
+
+>>> np.reciprocal(2.)
+0.5
+>>> np.reciprocal([1, 2., 3.33])
+array([ 1.   ,  0.5  ,  0.3003003])
+
+Notes
+-
+
+.. note::
+This function is not designed to work with integers.
+For integer arguments with absolute value larger than 1 the result is
+always zero because of the way Python handles integer division.  For
+integer zero the result is an overflow.
+
+`ctx` argument is not supported now.
+
+This function differs to the original `numpy.reciprocal
 
 Review comment:
   differs from.




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15377: [numpy][doc-fix] 
zeros_like, linspace, reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377#discussion_r298006430
 
 

 ##
 File path: python/mxnet/ndarray/numpy/_op.py
 ##
 @@ -882,3 +925,168 @@ def sqrt(x, out=None, **kwargs):
 This function only supports input type of float.
 """
 return _unary_func_helper(x, _npi.sqrt, _np.sqrt, out=out, **kwargs)
+
+
+@set_module('mxnet.ndarray.numpy')
+def reciprocal(x, out=None, **kwargs):
+"""
+reciprocal(x, out=None, dtype=None)
+
+Return the reciprocal of the argument, element-wise.
+Calculates ``1/x``.
+
+Parameters
+--
+x : ndarray
 
 Review comment:
   Please follow this to add a description for scalar inputs.
   
https://github.com/apache/incubator-mxnet/pull/15377/files#diff-8496f2110e0984377dcc2002ad00f1f6R910




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15377: [numpy][doc-fix] 
zeros_like, linspace, reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377#discussion_r298003048
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -71,19 +71,63 @@ def _np_ones_like(a):
 pass
 
 
-def _np_zeros_like(a):
-"""Return an array of zeros with the same shape and type as a given array.
+def _np_zeros_like(a, dtype=None, **kwargs):
+"""
+zeros_like(a, dtype=None, order='C', subok=True)
+
+Return an array of zeros with the same shape and type as a given array.
 
 Parameters
 --
 a : ndarray
-The shape and data-type of `a` define these same attributes of
+The shape of `a` define these same attributes of
 the returned array.
-
+dtype : data-type, optional
+Overrides the data type of the result.
+.. versionadded:: 1.6.0
+order : {'C'}, optional, default: C
+Store multi-dimensional data in row-major
+(C-style) in memory. Note that the column-major is not supported yet.
+
 Returns
 ---
 out : ndarray
 Array of zeros with the same shape and type as `a`.
+
+
+See Also
+
+empty_like : Return an empty array with shape and type of input.
+ones_like : Return an array of ones with shape and type of input.
+full_like : Return a new array with shape of input filled with value.
+zeros : Return a new array setting values to zero.
+
+Examples
+
+>>> x = np.arange(6)
+>>> x = x.reshape((2, 3))
+>>> x
+array([[0, 1, 2],
+   [3, 4, 5]])
+>>> np.zeros_like(x)
+array([[0, 0, 0],
+   [0, 0, 0]])
+>>> y = np.arange(3, dtype=float)
 
 Review comment:
   No need to use `dtype=float`. Please update the output with the latest code.
   ```python
   >>> y = np.arange(3)
   >>> y
   array([0., 1., 2.])
   ```




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15377: [numpy][doc-fix] 
zeros_like, linspace, reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377#discussion_r298004754
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -173,3 +217,255 @@ def _np_cumsum(a, axis=None, dtype=None, out=None):
 `axis` is not None or `a` is a 1-d array.
 """
 pass
+
+
+def _np_reciprocal(x, out=None, **kwargs):
+"""
+reciprocal(x, out=None, dtype=None)
+
+Return the reciprocal of the argument, element-wise.
+Calculates ``1/x``.
+
+Parameters
+--
+x : ndarray
+out : ndarray, None, or tuple of ndarray and None, optional
+A location into which the result is stored. If provided, it must have 
+a shape that the inputs broadcast to. If not provided or None, 
+a freshly-allocated array is returned. A tuple 
+(possible only as a keyword argument) must have length equal to 
+the number of outputs.
+
+Returns
+---
+y : ndarray
+Return array.
+
+
+Examples
+
+>>> np.reciprocal(2.)
+0.5
+>>> np.reciprocal([1, 2., 3.33])
+array([ 1.   ,  0.5  ,  0.3003003])
+
+Notes
+-
+
+.. note::
+This function is not designed to work with integers.
+For integer arguments with absolute value larger than 1 the result is
+always zero because of the way Python handles integer division.  For
+integer zero the result is an overflow.
+
+`ctx` argument is not supported now.
 
 Review comment:
   No need to add this.




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15377: [numpy][doc-fix] 
zeros_like, linspace, reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377#discussion_r298004193
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -71,19 +71,63 @@ def _np_ones_like(a):
 pass
 
 
-def _np_zeros_like(a):
-"""Return an array of zeros with the same shape and type as a given array.
+def _np_zeros_like(a, dtype=None, **kwargs):
+"""
+zeros_like(a, dtype=None, order='C', subok=True)
+
+Return an array of zeros with the same shape and type as a given array.
 
 Parameters
 --
 a : ndarray
-The shape and data-type of `a` define these same attributes of
+The shape of `a` define these same attributes of
 the returned array.
-
+dtype : data-type, optional
+Overrides the data type of the result.
+.. versionadded:: 1.6.0
+order : {'C'}, optional, default: C
 
 Review comment:
   Delete.




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15377: [numpy][doc-fix] 
zeros_like, linspace, reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377#discussion_r298006239
 
 

 ##
 File path: python/mxnet/ndarray/numpy/_op.py
 ##
 @@ -882,3 +925,168 @@ def sqrt(x, out=None, **kwargs):
 This function only supports input type of float.
 """
 return _unary_func_helper(x, _npi.sqrt, _np.sqrt, out=out, **kwargs)
+
+
+@set_module('mxnet.ndarray.numpy')
+def reciprocal(x, out=None, **kwargs):
+"""
+reciprocal(x, out=None, dtype=None)
+
+Return the reciprocal of the argument, element-wise.
+Calculates ``1/x``.
+
+Parameters
+--
+x : ndarray
+out : ndarray, None, or tuple of ndarray and None, optional
+A location into which the result is stored. If provided, it must have 
+a shape that the inputs broadcast to. If not provided or None, 
+a freshly-allocated array is returned. A tuple 
+(possible only as a keyword argument) must have length equal to 
+the number of outputs.
+
+Returns
+---
+y : ndarray
+Return array.
+
+
+Examples
+
+>>> np.reciprocal(2.)
+0.5
+>>> np.reciprocal([1, 2., 3.33])
+array([ 1.   ,  0.5  ,  0.3003003])
+
+Notes
+-
+
+.. note::
+This function is not designed to work with integers.
+For integer arguments with absolute value larger than 1 the result is
+always zero because of the way Python handles integer division.  For
+integer zero the result is an overflow.
+
+`ctx` argument is not supported now.
 
 Review comment:
   No need to add this line.




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15377: [numpy][doc-fix] 
zeros_like, linspace, reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377#discussion_r298001733
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -71,19 +71,63 @@ def _np_ones_like(a):
 pass
 
 
-def _np_zeros_like(a):
-"""Return an array of zeros with the same shape and type as a given array.
+def _np_zeros_like(a, dtype=None, **kwargs):
+"""
+zeros_like(a, dtype=None, order='C', subok=True)
+
+Return an array of zeros with the same shape and type as a given array.
 
 Parameters
 --
 a : ndarray
-The shape and data-type of `a` define these same attributes of
 
 Review comment:
   Keep this.




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15377: [numpy][doc-fix] 
zeros_like, linspace, reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377#discussion_r298005904
 
 

 ##
 File path: python/mxnet/ndarray/numpy/_op.py
 ##
 @@ -635,16 +636,19 @@ def tile(A, reps):
 
 @set_module('mxnet.ndarray.numpy')
 def linspace(start, stop, num=50, endpoint=True, retstep=False, dtype=None, 
axis=0, **kwargs):  # pylint: disable=too-many-arguments
-"""Return evenly spaced numbers over a specified interval.
+"""
+linspace(start, stop, num=50, endpoint=True, retstep=False, dtype=None, 
axis=0, ctx=None)
 
 Review comment:
   No need to add this line.




[GitHub] [incubator-mxnet] reminisce commented on a change in pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
reminisce commented on a change in pull request #15377: [numpy][doc-fix] 
zeros_like, linspace, reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377#discussion_r298004840
 
 

 ##
 File path: python/mxnet/_numpy_op_doc.py
 ##
 @@ -173,3 +217,255 @@ def _np_cumsum(a, axis=None, dtype=None, out=None):
 `axis` is not None or `a` is a 1-d array.
 """
 pass
+
+
+def _np_reciprocal(x, out=None, **kwargs):
+"""
+reciprocal(x, out=None, dtype=None)
+
+Return the reciprocal of the argument, element-wise.
+Calculates ``1/x``.
+
+Parameters
+--
+x : ndarray
+out : ndarray, None, or tuple of ndarray and None, optional
+A location into which the result is stored. If provided, it must have 
+a shape that the inputs broadcast to. If not provided or None, 
+a freshly-allocated array is returned. A tuple 
+(possible only as a keyword argument) must have length equal to 
+the number of outputs.
+
+Returns
+---
+y : ndarray
+Return array.
+
+
+Examples
+
+>>> np.reciprocal(2.)
+0.5
+>>> np.reciprocal([1, 2., 3.33])
+array([ 1.   ,  0.5  ,  0.3003003])
+
+Notes
+-
+
+.. note::
+This function is not designed to work with integers.
+For integer arguments with absolute value larger than 1 the result is
+always zero because of the way Python handles integer division.  For
+integer zero the result is an overflow.
+
+`ctx` argument is not supported now.
+
+This function differs to the original `numpy.reciprocal
+
`_ 
in
+the following aspects:
+
+- Only support ndarray now.
+- `where` argument is not supported.
+"""
+pass
+
+
+def _np_square(x, out=None, **kwargs):
+"""
+square(x, out=None, **kwargs)
+
+Return the element-wise square of the input.
+
+Parameters
+--
+x : ndarray
+out : ndarray, None, or tuple of ndarray and None, optional
+A location into which the result is stored. If provided, it must have
+a shape that the inputs broadcast to. If not provided or `None`,
+a freshly-allocated array is returned. A tuple (possible only as a
+keyword argument) must have length equal to the number of outputs.
+
+Returns
+---
+y : ndarray
+Return array.
+
+
+Examples
+
+>>> np.square([-1j, 1])
 
 Review comment:
   Complex inputs are not supported. Please use `mxnet.numpy` to run the examples.
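   A real-valued replacement run with `mxnet.numpy` might look like this (a
   sketch, assuming the frontend added in this PR):
   
    ```python
    from mxnet import numpy as np
    
    x = np.array([-2., 1.])
    print(np.square(x))   # expected: array([4., 1.])
    ```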




[GitHub] [incubator-mxnet] pengzhao-intel commented on issue #15351: fix fp32 flatten issue

2019-06-26 Thread GitBox
pengzhao-intel commented on issue #15351: fix fp32 flatten issue
URL: https://github.com/apache/incubator-mxnet/pull/15351#issuecomment-506181099
 
 
   @TaoLv @ciyongch please help review.




[GitHub] [incubator-mxnet] pengzhao-intel commented on issue #15373: CPP inference example behaves differently from the Python one

2019-06-26 Thread GitBox
pengzhao-intel commented on issue #15373: CPP inference example behaves 
differently from the Python one
URL: 
https://github.com/apache/incubator-mxnet/issues/15373#issuecomment-506176773
 
 
   
https://github.com/apache/incubator-mxnet/tree/master/cpp-package/example/inference




[GitHub] [incubator-mxnet] pengzhao-intel commented on issue #15373: CPP inference example behaves differently from the Python one

2019-06-26 Thread GitBox
pengzhao-intel commented on issue #15373: CPP inference example behaves 
differently from the Python one
URL: 
https://github.com/apache/incubator-mxnet/issues/15373#issuecomment-506175179
 
 
   Just updated the cpp inference example for image processing. 
   Could you try again?




[GitHub] [incubator-mxnet] ZhennanQin opened a new pull request #15379: Use omp threads for cpu data loader

2019-06-26 Thread GitBox
ZhennanQin opened a new pull request #15379: Use omp threads for cpu data loader
URL: https://github.com/apache/incubator-mxnet/pull/15379
 
 
   ## Description ##
   The CPU data loader works as a normal op, so it is natural to use the OMP 
thread count. This change can help improve data loader performance for 
non-core-binding usage.
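   (Illustration only: the OMP thread count referred to above is commonly capped
   by the standard `OMP_NUM_THREADS` environment variable; the snippet below is a
   hypothetical sketch, not MXNet code.)
   
    ```python
    import os
    
    # Typical upper bound on OpenMP worker threads for a CPU-bound operator.
    omp_threads = int(os.environ.get("OMP_NUM_THREADS", os.cpu_count() or 1))
    print("data loader would use up to", omp_threads, "OMP threads")
    ```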
   
   @pengzhao-intel @anirudh2290 
   
   ## Checklist ##
   ### Essentials ###
   Please feel free to remove inapplicable items for your PR.
   - [ ] The PR title starts with [MXNET-$JIRA_ID], where $JIRA_ID refers to 
the relevant [JIRA issue](https://issues.apache.org/jira/projects/MXNET/issues) 
created (except PRs with tiny changes)
   - [ ] Changes are complete (i.e. I finished coding on this PR)
   - [ ] All changes have test coverage:
   - Unit tests are added for small changes to verify correctness (e.g. adding 
a new operator)
   - Nightly tests are added for complicated/long-running ones (e.g. changing 
distributed kvstore)
   - Build tests will be added for build configuration changes (e.g. adding a 
new build option with NCCL)
   - [ ] Code is well-documented: 
   - For user-facing API changes, API doc string has been updated. 
   - For new C++ functions in header files, their functionalities and arguments 
are documented. 
   - For new examples, README.md is added to explain what the example does, 
the source of the dataset, the expected performance on the test set, and a 
reference to the original paper if applicable
   - Check the API doc at 
http://mxnet-ci-doc.s3-accelerate.dualstack.amazonaws.com/PR-$PR_ID/$BUILD_ID/index.html
   - [ ] To my best knowledge, examples are either not affected by this 
change, or have been fixed to be compatible with this change
   
   ### Changes ###
   - [ ] Feature1, tests, (and when applicable, API doc)
   - [ ] Feature2, tests, (and when applicable, API doc)
   
   ## Comments ##
   - If this change is a backward incompatible change, why must this change be 
made.
   - Interesting edge cases to note here
   




[GitHub] [incubator-mxnet] lanking520 opened a new pull request #15378: Add Sparse NDArray support for Scala

2019-06-26 Thread GitBox
lanking520 opened a new pull request #15378: Add Sparse NDArray support for 
Scala
URL: https://github.com/apache/incubator-mxnet/pull/15378
 
 
   ## Description ##
   This is the initial PR to start supporting Sparse `NDArray` for the Scala 
package. 
   
   `SparseNDArray` is a child class of MXNet NDArray. It supports the following 
types:
   - row sparse
   - CSR
   
   Currently, users can call the `toSparse` method to convert an `NDArray` from 
dense to sparse, or cast an `NDArray` to `SparseNDArray`. Users can also call the 
following two methods to create a sparse NDArray from scratch (a Python analog is 
sketched after this list):
   - `SparseNDArray.csrMatrix`
   - `SparseNDArray.rowSparseArray`
   
   @gigasquid @zachgk @yzhliu @nswamy @eric-haibin-lin @frankfliu 
   ## Checklist ##
   ### Essentials ###
   Please feel free to remove inapplicable items for your PR.
   - [ ] The PR title starts with [MXNET-$JIRA_ID], where $JIRA_ID refers to 
the relevant [JIRA issue](https://issues.apache.org/jira/projects/MXNET/issues) 
created (except PRs with tiny changes)
   - [ ] Changes are complete (i.e. I finished coding on this PR)
   - [ ] All changes have test coverage:
   - Unit tests are added for small changes to verify correctness (e.g. adding 
a new operator)
   - Nightly tests are added for complicated/long-running ones (e.g. changing 
distributed kvstore)
   - Build tests will be added for build configuration changes (e.g. adding a 
new build option with NCCL)
   - [ ] Code is well-documented: 
   - For user-facing API changes, API doc string has been updated. 
   - For new C++ functions in header files, their functionalities and arguments 
are documented. 
   - For new examples, README.md is added to explain what the example does, 
the source of the dataset, the expected performance on the test set, and a 
reference to the original paper if applicable
   - Check the API doc at 
http://mxnet-ci-doc.s3-accelerate.dualstack.amazonaws.com/PR-$PR_ID/$BUILD_ID/index.html
   - [ ] To my best knowledge, examples are either not affected by this change, 
or have been fixed to be compatible with this change
   
   ### Changes ###
   - [ ] Feature1, tests, (and when applicable, API doc)
   - [ ] Feature2, tests, (and when applicable, API doc)
   
   ## Comments ##
   - If this change is a backward incompatible change, why must this change be 
made.
   - Interesting edge cases to note here
   




[GitHub] [incubator-mxnet] hgt312 opened a new pull request #15377: [numpy][doc-fix] zeros_like, linspace, reciprocal, square, and arcsin

2019-06-26 Thread GitBox
hgt312 opened a new pull request #15377: [numpy][doc-fix] zeros_like, linspace, 
reciprocal, square, and arcsin
URL: https://github.com/apache/incubator-mxnet/pull/15377
 
 
   ## Description ##
   Add docs for the numpy operators listed in the title.
   The website builds successfully, but **`linspace` cannot be found**.
   
   ## Checklist ##
   ### Essentials ###
   Please feel free to remove inapplicable items for your PR.
   - [ ] The PR title starts with [MXNET-$JIRA_ID], where $JIRA_ID refers to 
the relevant [JIRA issue](https://issues.apache.org/jira/projects/MXNET/issues) 
created (except PRs with tiny changes)
   - [ ] Changes are complete (i.e. I finished coding on this PR)
   - [ ] All changes have test coverage:
   - Unit tests are added for small changes to verify correctness (e.g. adding 
a new operator)
   - Nightly tests are added for complicated/long-running ones (e.g. changing 
distributed kvstore)
   - Build tests will be added for build configuration changes (e.g. adding a 
new build option with NCCL)
   - [ ] Code is well-documented: 
   - For user-facing API changes, API doc string has been updated. 
   - For new C++ functions in header files, their functionalities and arguments 
are documented. 
   - For new examples, README.md is added to explain what the example does, 
the source of the dataset, the expected performance on the test set, and a 
reference to the original paper if applicable
   - Check the API doc at 
http://mxnet-ci-doc.s3-accelerate.dualstack.amazonaws.com/PR-$PR_ID/$BUILD_ID/index.html
   - [ ] To my best knowledge, examples are either not affected by this 
change, or have been fixed to be compatible with this change
   
   ### Changes ###
   - Add docs and verify the operators
   - Three unary operators are registered to the `npi` namespace and wrapped in 
the corresponding frontend files
   
   ## Comments ##
   Thanks to @haojin2 and others for reviewing
   




[GitHub] [incubator-mxnet] reminisce merged pull request #15368: [numpy] Change d2l chapters cv and gan to use numpy

2019-06-26 Thread GitBox
reminisce merged pull request #15368: [numpy] Change d2l chapters cv and gan to 
use numpy
URL: https://github.com/apache/incubator-mxnet/pull/15368
 
 
   




[incubator-mxnet] branch numpy updated: [numpy] Change d2l chapters cv and gan to use numpy (#15368)

2019-06-26 Thread reminisce
This is an automated email from the ASF dual-hosted git repository.

reminisce pushed a commit to branch numpy
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/numpy by this push:
 new 6b7525c  [numpy] Change d2l chapters cv and gan to use numpy (#15368)
6b7525c is described below

commit 6b7525cc911b691647c8cad2faad1bad8c196988
Author: reminisce 
AuthorDate: Wed Jun 26 20:35:06 2019 -0700

[numpy] Change d2l chapters cv and gan to use numpy (#15368)

* Change op name style to lower case underscore

* Add ops under image to npx

* Add image submodule to npx

* Fix split_and_load use np

* Fix fine tuning

* Fix bbox and anchor

* Fix odd

* Fix ssd and rcnn

* Remove restriction on binary element-wise scalar

* Fix gan

* Fix sanity

* Try to fix website build failure

* Add npx.random.seed

* Fix doc
---
 python/mxnet/_numpy_op_doc.py  |  5 +-
 python/mxnet/base.py   |  3 +-
 python/mxnet/gluon/block.py| 24 ++-
 python/mxnet/gluon/data/vision/datasets.py |  5 +-
 python/mxnet/gluon/data/vision/transforms.py   | 28 +++-
 python/mxnet/gluon/loss.py | 39 
 python/mxnet/gluon/model_zoo/vision/resnet.py  | 19 --
 python/mxnet/gluon/nn/activations.py   |  8 +--
 python/mxnet/gluon/nn/basic_layers.py  | 26 
 python/mxnet/gluon/nn/conv_layers.py   | 47 ++
 python/mxnet/gluon/rnn/rnn_layer.py|  2 +-
 python/mxnet/gluon/utils.py| 25 
 python/mxnet/image/detection.py| 17 +++--
 python/mxnet/image/image.py| 44 +
 python/mxnet/ndarray/numpy_extension/__init__.py   |  1 +
 .../numpy_extension/image.py}  |  8 +--
 python/mxnet/numpy/__init__.py |  1 +
 python/mxnet/numpy/arrayprint.py   | 62 ++
 python/mxnet/numpy/multiarray.py   | 53 ++--
 python/mxnet/numpy_extension/__init__.py   |  2 +
 .../__init__.py => numpy_extension/image.py}   |  8 +--
 python/mxnet/numpy_extension/random.py | 74 ++
 python/mxnet/symbol/numpy_extension/__init__.py|  1 +
 .../numpy_extension/{__init__.py => image.py}  |  8 +--
 src/io/image_io.cc |  3 +
 src/ndarray/ndarray.cc |  2 +-
 src/operator/contrib/multibox_detection.cc |  4 ++
 src/operator/contrib/multibox_prior.cc |  3 +
 src/operator/contrib/multibox_target.cc|  4 ++
 src/operator/image/crop.cc |  1 +
 src/operator/image/image_random.cc | 13 
 src/operator/image/resize.cc   |  1 +
 src/operator/leaky_relu.cc |  1 +
 src/operator/nn/activation.cc  |  2 +-
 src/operator/nn/batch_norm.cc  |  2 +-
 src/operator/nn/convolution.cc |  2 +-
 src/operator/nn/deconvolution.cc   |  1 +
 src/operator/nn/dropout.cc |  2 +-
 src/operator/nn/fully_connected.cc |  2 +-
 src/operator/nn/layer_norm.cc  |  2 +-
 src/operator/nn/pooling.cc |  2 +-
 src/operator/numpy/np_elemwise_broadcast_op.cc | 11 +---
 src/operator/rnn.cc|  2 +-
 src/operator/roi_pooling.cc|  4 ++
 src/operator/sequence_mask.cc  |  2 +-
 .../tensor/elemwise_binary_scalar_op_extended.cc   |  3 +-
 src/operator/tensor/elemwise_unary_op_basic.cc |  1 +
 src/operator/tensor/indexing_op.cc |  2 +-
 48 files changed, 452 insertions(+), 130 deletions(-)

diff --git a/python/mxnet/_numpy_op_doc.py b/python/mxnet/_numpy_op_doc.py
index 995a65c..ca8636c 100644
--- a/python/mxnet/_numpy_op_doc.py
+++ b/python/mxnet/_numpy_op_doc.py
@@ -21,7 +21,10 @@
 
 
 def _np_reshape(a, newshape, order='C'):
-"""Gives a new shape to an array without changing its data.
+"""
+reshape(a, newshape, order='C')
+
+Gives a new shape to an array without changing its data.
 
 Parameters
 --
diff --git a/python/mxnet/base.py b/python/mxnet/base.py
index a4f75c6..545c2ea 100644
--- a/python/mxnet/base.py
+++ b/python/mxnet/base.py
@@ -744,6 +744,7 @@ _NP_OP_PREFIX = '_np_'
 _NP_OP_SUBMODULE_LIST = ['_random_', '_linalg_']
 
 _NP_EXT_OP_PREFIX = '_npx_'
+_NP_EXT_OP_SUBMODULE_LIST = ['_image_']
 
 _NP_INTERNAL_OP_PREFIX = '_npi_'
 
@@ -784,7 +785,7 @@ def _init_np_op_module(root_module_name, np_module_name, 
mx_module_name, make_op
 submodule_name_list = 

[GitHub] [incubator-mxnet] larroy commented on issue #11516: CCache does not work for NVCC or make-based builds

2019-06-26 Thread GitBox
larroy commented on issue #11516: CCache does not work for NVCC or make-based 
builds
URL: 
https://github.com/apache/incubator-mxnet/issues/11516#issuecomment-506140691
 
 
   Still doesn't work with nvcc. I think we should leave this open.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] pengzhao-intel commented on issue #15298: Fix Cached_op with static_shape=true

2019-06-26 Thread GitBox
pengzhao-intel commented on issue #15298: Fix Cached_op with static_shape=true
URL: https://github.com/apache/incubator-mxnet/pull/15298#issuecomment-506138573
 
 
   Please pick up this fix into the r1.5 branch.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[incubator-mxnet] branch master updated: Fix Cached_op with static_shape=true (#15298)

2019-06-26 Thread patriczhao
This is an automated email from the ASF dual-hosted git repository.

patriczhao pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 582489c  Fix Cached_op with static_shape=true (#15298)
582489c is described below

commit 582489cebc16a8af21738281dcaee5eee54e478d
Author: Zhennan Qin 
AuthorDate: Thu Jun 27 11:08:26 2019 +0800

Fix Cached_op with static_shape=true (#15298)

* Fix

* run ci
---
 src/imperative/cached_op.cc |  7 +--
 src/nnvm/legacy_op_util.cc  | 47 ++---
 2 files changed, 24 insertions(+), 30 deletions(-)

diff --git a/src/imperative/cached_op.cc b/src/imperative/cached_op.cc
index d7e1543..efe3801 100644
--- a/src/imperative/cached_op.cc
+++ b/src/imperative/cached_op.cc
@@ -81,6 +81,7 @@ struct CachedOp::CachedOpState {
 
   std::vector<NDArray> buff;
   std::vector<NDArray*> arrays;
+  std::vector<NDArray*> arrays_with_in_out;
   std::vector<OpReqType> array_reqs;
 
   std::vector<OpStatePtr> op_states;
@@ -762,7 +763,8 @@ OpStatePtr CachedOp::StaticForward(
   // We are going to add input and output arrays to the array list.
   // The input and output arrays should only be valid for this run,
   // so we shouldn't modify the state's array list.
-  auto arrays = state.arrays;
+  state.arrays_with_in_out = state.arrays;
+  auto& arrays = state.arrays_with_in_out;
   if (config_.static_shape) {
 for (auto i : config_.param_indices) {
   auto nid = idx.input_nodes()[i];
@@ -1063,7 +1065,8 @@ void CachedOp::StaticBackward(
   // We are going to add input and output arrays to the array list.
   // The input and output arrays should only be valid for this run,
   // so we shouldn't modify the state's array list.
-  auto arrays = state.arrays;
+  state.arrays_with_in_out = state.arrays;
+  auto& arrays = state.arrays_with_in_out;
   for (size_t i = 0; i < state.info.bwd_input_eid.size(); ++i) {
 auto eid = state.info.bwd_input_eid[i];
 if (eid == kEidNotExist) {
diff --git a/src/nnvm/legacy_op_util.cc b/src/nnvm/legacy_op_util.cc
index 698666f..3e03b6b 100644
--- a/src/nnvm/legacy_op_util.cc
+++ b/src/nnvm/legacy_op_util.cc
@@ -79,7 +79,6 @@ class OperatorState {
  public:
   OperatorState(Operator *opr, const OperatorProperty *prop) {
 opr_ = opr;
-fwd_init_ = bwd_init_ = false;
 
 in_data_fwd_.resize(prop->ListArguments().size());
 in_data_bwd_.resize(prop->ListArguments().size());
@@ -110,19 +109,16 @@ class OperatorState {
const std::vector<TBlob>& inputs,
const std::vector<OpReqType>& req,
const std::vector<TBlob>& outputs) {
-if (!fwd_init_) {
-  CHECK_EQ(inputs.size(), in_data_fwd_.size() + aux_data_.size());
-  CHECK_EQ(outputs.size(), out_data_.size());
-  // in_data_bwd_ has the same tblobs as the ones in in_data_fwd_, except 
that the ones
-  // referred by arg_data_ptr_ will be overriden
-  for (size_t i = 0; i < in_data_fwd_.size(); ++i) in_data_fwd_[i] = 
inputs[i];
-  for (size_t i = 0; i < in_data_fwd_.size(); ++i) in_data_bwd_[i] = 
inputs[i];
-  for (size_t i = 0; i < aux_data_.size(); ++i) {
-aux_data_[i] = inputs[i + in_data_fwd_.size()];
-  }
-  for (size_t i = 0; i < out_data_.size(); ++i) out_data_[i] = outputs[i];
-  fwd_init_ = true;
+CHECK_EQ(inputs.size(), in_data_fwd_.size() + aux_data_.size());
+CHECK_EQ(outputs.size(), out_data_.size());
+// in_data_bwd_ has the same tblobs as the ones in in_data_fwd_, except 
that the ones
+// referred by arg_data_ptr_ will be overriden
+for (size_t i = 0; i < in_data_fwd_.size(); ++i) in_data_fwd_[i] = 
inputs[i];
+for (size_t i = 0; i < in_data_fwd_.size(); ++i) in_data_bwd_[i] = 
inputs[i];
+for (size_t i = 0; i < aux_data_.size(); ++i) {
+  aux_data_[i] = inputs[i + in_data_fwd_.size()];
 }
+for (size_t i = 0; i < out_data_.size(); ++i) out_data_[i] = outputs[i];
 opr_->Forward(ctx, in_data_fwd_, req, out_data_, aux_data_);
   }
 
@@ -130,27 +126,22 @@ class OperatorState {
const std::vector<TBlob>& inputs,
const std::vector<OpReqType>& req,
const std::vector<TBlob>& outputs) {
-if (!bwd_init_) {
-  CHECK(fwd_init_);
-  CHECK_EQ(arg_data_ptr_.size() + aux_data_.size(), inputs.size());
-  // override tblobs pointed by arg_data_ptr_ since they might not contain
-  // initialized data during forward pass.
-  for (size_t i = 0; i < arg_data_ptr_.size(); ++i) {
-*arg_data_ptr_[i] = inputs[i];
-  }
-  for (size_t i = 0; i < aux_data_.size(); ++i) {
-aux_data_[i] = inputs[inputs.size() - aux_data_.size() + i];
-  }
-  CHECK_EQ(outputs.size(), in_grad_.size());
-  for (size_t i = 0; i < outputs.size(); ++i) in_grad_[i] = outputs[i];
-  bwd_init_ = true;
+CHECK_EQ(arg_data_ptr_.size() + aux_data_.size(), inputs.size());
+// override tblobs pointed by 
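The motivation for `arrays_with_in_out` above: the canonical `state.arrays` list must stay untouched across runs, while the per-run copy can be extended with input/output entries and still be referenced safely for the whole run. A toy, self-contained sketch of that pattern (plain `std::vector<int>`, not MXNet's actual types):

```cpp
#include <cassert>
#include <vector>

// Toy stand-in for the cached state: 'arrays' is the canonical list shared
// across runs; 'arrays_with_in_out' is the per-run copy extended with I/O entries.
struct ToyState {
  std::vector<int> arrays;
  std::vector<int> arrays_with_in_out;
};

// Copy the canonical list, extend the copy, and hand back a reference that stays
// valid after this function returns (it lives in the state, not on the stack).
std::vector<int>& PrepareRun(ToyState* state, const std::vector<int>& io_entries) {
  state->arrays_with_in_out = state->arrays;
  auto& arrays = state->arrays_with_in_out;
  arrays.insert(arrays.end(), io_entries.begin(), io_entries.end());
  return arrays;
}

int main() {
  ToyState state;
  state.arrays = {1, 2, 3};
  auto& run_arrays = PrepareRun(&state, {7, 8});
  assert(run_arrays.size() == 5);    // per-run view sees the added I/O entries
  assert(state.arrays.size() == 3);  // canonical list is unchanged for the next run
  return 0;
}
```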

[GitHub] [incubator-mxnet] pengzhao-intel commented on issue #15298: Fix Cached_op with static_shape=true

2019-06-26 Thread GitBox
pengzhao-intel commented on issue #15298: Fix Cached_op with static_shape=true
URL: https://github.com/apache/incubator-mxnet/pull/15298#issuecomment-506138483
 
 
   Thanks, merging now.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] pengzhao-intel merged pull request #15298: Fix Cached_op with static_shape=true

2019-06-26 Thread GitBox
pengzhao-intel merged pull request #15298: Fix Cached_op with static_shape=true
URL: https://github.com/apache/incubator-mxnet/pull/15298
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] ZhennanQin edited a comment on issue #15298: Fix Cached_op with static_shape=true

2019-06-26 Thread GitBox
ZhennanQin edited a comment on issue #15298: Fix Cached_op with 
static_shape=true
URL: https://github.com/apache/incubator-mxnet/pull/15298#issuecomment-506134064
 
 
   @pengzhao-intel  Tested symbolic & Gluon inference speed and BERT; everything 
seems to work fine.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] ZhennanQin commented on issue #15298: Fix Cached_op with static_shape=true

2019-06-26 Thread GitBox
ZhennanQin commented on issue #15298: Fix Cached_op with static_shape=true
URL: https://github.com/apache/incubator-mxnet/pull/15298#issuecomment-506134064
 
 
   @PatricZhao Tested symbolic & Gluon inference speed and BERT; everything 
seems to work fine.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] sheep94lion commented on issue #12159: C++ api executor->Forward(false) is much slower than MXPredForward?

2019-06-26 Thread GitBox
sheep94lion commented on issue #12159: C++ api executor->Forward(false) is much 
slower than MXPredForward?
URL: 
https://github.com/apache/incubator-mxnet/issues/12159#issuecomment-506118494
 
 
   > Hi @xiaojingxie
   > 
   > The Executor->Forward() method performs more tasks as compared to 
MXPredForward(). It invokes following 2 C APIs internally
   > MXExecutorForward()
   > and
   > MXExecutorOutputs()
   > where as MXPredForward() only runs the forward pass and does not retrieve 
the outputs.
   > In addition, the outputs of the forward pass are copied to the 'output' 
array in the Executor object, for faster retrieval later.
   > 
   > Therefore, we can not compare MXPredForward() with Executor->Forward(). 
For the correct comparison, we should find out the time required to invoke 
MXPredGetOutputShape() and MXPredGetOutput() after invoking the MXPredForward()
   > 
   > I hope this answers the question.
   > @mxnet-label-bot add [Pending Requester Info]
   
   The running time of C API MXPredForward is much shorter than the running 
time of MXPredGetOutput:
   ```
   auto start = std::chrono::high_resolution_clock::now();
   MXPredForward(pred_hnd);
   auto stop = std::chrono::high_resolution_clock::now();
   auto duration = std::chrono::duration_cast<std::chrono::microseconds>(stop - start);
   LOGI("MXPredForward: %d microseconds.", duration.count());
   std::vector<float> data(size);
   start = std::chrono::high_resolution_clock::now();
   MXPredGetOutput(pred_hnd, output_index, &(data[0]), static_cast<mx_uint>(size));
   stop = std::chrono::high_resolution_clock::now();
   duration = std::chrono::duration_cast<std::chrono::microseconds>(stop - start);
   LOGI("MXPredGetOutput: %d microseconds.", duration.count());
   ```
   The result is:
   ```
   I/MXNET: MXPredForward: 106 microseconds.
   I/MXNET: MXPredGetOutput: 3748967 microseconds.
   ```
   Why? Is it something related to lazy evaluation?
   The code runs on Pixel3 with Snapdragon 835.
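   If the asynchronous-engine explanation applies, MXPredForward only enqueues the 
computation and MXPredGetOutput blocks until the result is ready, so the two numbers 
above are not directly comparable. A sketch (reusing `pred_hnd`, `output_index`, `size` 
and the `LOGI` macro from the snippet above; the loop and variable names here are 
illustrative assumptions) that times the pair end to end and lets you discard the 
first, warm-up iteration:
   ```
   std::vector<float> data(size);
   for (int iter = 0; iter < 5; ++iter) {
     auto t0 = std::chrono::high_resolution_clock::now();
     MXPredForward(pred_hnd);                        // enqueues the forward pass
     MXPredGetOutput(pred_hnd, output_index, &(data[0]),
                     static_cast<mx_uint>(size));    // waits for and copies the result
     auto t1 = std::chrono::high_resolution_clock::now();
     auto us = std::chrono::duration_cast<std::chrono::microseconds>(t1 - t0).count();
     LOGI("iteration %d end-to-end: %lld microseconds.", iter, static_cast<long long>(us));
   }
   ```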


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] sandeep-krishnamurthy commented on issue #14421: Updating mxnet from 1.0.0, networks give different outputs

2019-06-26 Thread GitBox
sandeep-krishnamurthy commented on issue #14421: Updating mxnet from 1.0.0, 
networks give different outputs
URL: 
https://github.com/apache/incubator-mxnet/issues/14421#issuecomment-506105958
 
 
   Closing this issue as discussed in the PR #15026


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] sandeep-krishnamurthy closed issue #14421: Updating mxnet from 1.0.0, networks give different outputs

2019-06-26 Thread GitBox
sandeep-krishnamurthy closed issue #14421: Updating mxnet from 1.0.0, networks 
give different outputs
URL: https://github.com/apache/incubator-mxnet/issues/14421
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] sandeep-krishnamurthy commented on issue #15026: [MXNET-14421] Make global pooling backwards compatible

2019-06-26 Thread GitBox
sandeep-krishnamurthy commented on issue #15026: [MXNET-14421] Make global 
pooling backwards compatible
URL: https://github.com/apache/incubator-mxnet/pull/15026#issuecomment-506105427
 
 
   > I agree it’s probably not worth merging since it’s an edge case, but hopefully if 
someone comes across this issue they’ll find this and see the PR as a workaround. 
We’ve started the process of retraining or deprecating networks affected by this 
bug (a long process...) so we’ll eventually be able to upgrade.
   
   Thank you. Closing the PR and the issue.
   
   Sorry, it is unfortunate that you have to retrain a lot of models due to this 
bug. Keep us updated, and let us know if you find any other issues.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] sandeep-krishnamurthy closed pull request #15026: [MXNET-14421] Make global pooling backwards compatible

2019-06-26 Thread GitBox
sandeep-krishnamurthy closed pull request #15026: [MXNET-14421] Make global 
pooling backwards compatible
URL: https://github.com/apache/incubator-mxnet/pull/15026
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] anirudh2290 commented on issue #15118: Conversion from FP32 model to Mixed Precision model

2019-06-26 Thread GitBox
anirudh2290 commented on issue #15118: Conversion from FP32 model to Mixed 
Precision model
URL: https://github.com/apache/incubator-mxnet/pull/15118#issuecomment-506105272
 
 
   From an offline review done by sudipta@, feedback was provided that it is 
important for users to be able to obtain models with params cast wherever 
possible. After additional discussion with @ptrendx , we decided to add an 
additional graph pass, which goes through all inputs of amp_cast and 
amp_multicast and infers the dtypes of the input nodes wherever possible. I have 
added support for this in the recent commits.
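   As a toy illustration of that idea (made-up types, not MXNet's nnvm API, and the 
real pass also covers amp_multicast): walk the graph once and, whenever a cast node 
has a known target dtype, record that dtype for any still-untyped parameter input so 
it can be stored already cast:
   ```
   #include <cstdio>
   #include <map>
   #include <string>
   #include <vector>

   struct Node {
     std::string name;
     std::string op;            // e.g. "amp_cast" or "param"
     std::string target_dtype;  // only meaningful for cast-like ops
     std::vector<int> inputs;   // indices of input nodes
   };

   // One pass over the nodes: infer dtypes of params that feed cast-like ops.
   std::map<int, std::string> InferParamDtypes(const std::vector<Node>& nodes) {
     std::map<int, std::string> inferred;
     for (const Node& n : nodes) {
       if (n.op != "amp_cast") continue;  // amp_multicast handled analogously in the real pass
       for (int in : n.inputs) {
         if (nodes[in].op == "param" && !inferred.count(in)) {
           inferred[in] = n.target_dtype;  // store the param already cast to this dtype
         }
       }
     }
     return inferred;
   }

   int main() {
     std::vector<Node> nodes = {
       {"weight", "param", "", {}},
       {"cast0", "amp_cast", "float16", {0}},
     };
     for (const auto& kv : InferParamDtypes(nodes))
       std::printf("%s -> %s\n", nodes[kv.first].name.c_str(), kv.second.c_str());
     return 0;
   }
   ```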


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] pengzhao-intel commented on a change in pull request #15167: [WIP] Pointwise fusion for GPU

2019-06-26 Thread GitBox
pengzhao-intel commented on a change in pull request #15167: [WIP] Pointwise 
fusion for GPU
URL: https://github.com/apache/incubator-mxnet/pull/15167#discussion_r297929053
 
 

 ##
 File path: docs/faq/env_var.md
 ##
 @@ -309,6 +309,17 @@ If ctypes is used, it must be 
`mxnet._ctypes.ndarray.NDArrayBase`.
 with float32.
   - Model accuracies do not necessarily improve with this environment variable 
turned on.
 
+* MXNET_USE_FUSION
 
 Review comment:
   As I suggested in dev@, could we align the variable with MXNET_SUBGRAPH_BACKEND 
to make usage easier for the user? Currently, this env var is used for CPU 
operator fusion.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] pengzhao-intel edited a comment on issue #15298: Fix Cached_op with static_shape=true

2019-06-26 Thread GitBox
pengzhao-intel edited a comment on issue #15298: Fix Cached_op with 
static_shape=true
URL: https://github.com/apache/incubator-mxnet/pull/15298#issuecomment-506076794
 
 
   @ZhennanQin Please verify the performance of this PR with our internal tests 
and NLP tests.
   If everything is OK, I will merge this soon.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] sheep94lion edited a comment on issue #15333: Compile error for custom operators in C

2019-06-26 Thread GitBox
sheep94lion edited a comment on issue #15333: Compile error for custom 
operators in C
URL: 
https://github.com/apache/incubator-mxnet/issues/15333#issuecomment-506102246
 
 
   The problem is with the type of the parameters: it seems you cannot use float*.
   After reading some source code of other operators, I found I can use
   ```
   mxnet::Tuple<float> bbox_mean, bbox_std;
   ```
   What is confusing is that the problem is actually in the .h file, while the error 
report refers to the .cc file.
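   For reference, a minimal sketch of declaring tuple-valued parameters with dmlc 
(the struct name, defaults and descriptions here are hypothetical; the registration 
macro normally lives in the .cc file):
   ```
   #include <dmlc/parameter.h>
   #include <mxnet/tuple.h>

   struct MyBBoxParam : public dmlc::Parameter<MyBBoxParam> {
     mxnet::Tuple<float> bbox_mean;
     mxnet::Tuple<float> bbox_std;
     DMLC_DECLARE_PARAMETER(MyBBoxParam) {
       DMLC_DECLARE_FIELD(bbox_mean)
         .set_default(mxnet::Tuple<float>({0.0f, 0.0f, 0.0f, 0.0f}))
         .describe("Mean used to denormalize bounding-box regression targets.");
       DMLC_DECLARE_FIELD(bbox_std)
         .set_default(mxnet::Tuple<float>({0.1f, 0.1f, 0.2f, 0.2f}))
         .describe("Std used to denormalize bounding-box regression targets.");
     }
   };

   DMLC_REGISTER_PARAMETER(MyBBoxParam);  // in the .cc file
   ```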


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] sheep94lion closed issue #15333: Compile error for custom operators in C

2019-06-26 Thread GitBox
sheep94lion closed issue #15333: Compile error for custom operators in C
URL: https://github.com/apache/incubator-mxnet/issues/15333
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] sheep94lion commented on issue #15333: Compile error for custom operators in C

2019-06-26 Thread GitBox
sheep94lion commented on issue #15333: Compile error for custom operators in C
URL: 
https://github.com/apache/incubator-mxnet/issues/15333#issuecomment-506102246
 
 
   The problem is with the type of the parameters: it seems you cannot use float*.
   After reading some source code of other operators, I found I can use 
'mxnet::Tuple<float> bbox_mean, bbox_std;'


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[incubator-mxnet-site] branch asf-site updated: Bump the publish timestamp.

2019-06-26 Thread marcoabreu
This is an automated email from the ASF dual-hosted git repository.

marcoabreu pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet-site.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new 26cbae7  Bump the publish timestamp.
26cbae7 is described below

commit 26cbae7fb4174f7fc301abee4fa35d657f9ce936
Author: mxnet-ci 
AuthorDate: Thu Jun 27 01:25:45 2019 +

Bump the publish timestamp.
---
 date.txt | 1 +
 1 file changed, 1 insertion(+)

diff --git a/date.txt b/date.txt
new file mode 100644
index 000..cb3fd05
--- /dev/null
+++ b/date.txt
@@ -0,0 +1 @@
+Thu Jun 27 01:25:45 UTC 2019



[GitHub] [incubator-mxnet] jmerkow commented on issue #15026: [MXNET-14421] Make global pooling backwards compatible

2019-06-26 Thread GitBox
jmerkow commented on issue #15026: [MXNET-14421] Make global pooling backwards 
compatible
URL: https://github.com/apache/incubator-mxnet/pull/15026#issuecomment-506097470
 
 
   I agree it’s probably not worth merging since it’s an edge case, but hopefully if 
someone comes across this issue they’ll find this and see the PR as a workaround. 
We’ve started the process of retraining or deprecating networks affected by this 
bug (a long process...) so we’ll eventually be able to upgrade. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] larroy commented on a change in pull request #15285: [WIP] Graph dumper

2019-06-26 Thread GitBox
larroy commented on a change in pull request #15285: [WIP] Graph dumper
URL: https://github.com/apache/incubator-mxnet/pull/15285#discussion_r297912617
 
 

 ##
 File path: src/common/directed_graph.h
 ##
 @@ -0,0 +1,201 @@
+  /*
 
 Review comment:
   Actually I implemented this before I knew the json dumping code, I 
understand there's a bit of functionality duplication, on the other hand we can 
customize without touching tvm and it's a pure C++ implementation.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] lanking520 commented on issue #15361: [WIP] Exclude external dependencies from MXNet JAR.

2019-06-26 Thread GitBox
lanking520 commented on issue #15361: [WIP] Exclude external dependencies from 
MXNet JAR.
URL: https://github.com/apache/incubator-mxnet/pull/15361#issuecomment-506097308
 
 
   To make it clear, all you need to do is add the removed dependencies to 
`scala-package/deploy/src/main/deploy/deploy.xml`, and that should solve the 
Clojure problem.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] aaronmarkham closed issue #15160: Missing videos in autograd docs page

2019-06-26 Thread GitBox
aaronmarkham closed issue #15160: Missing videos in autograd docs page
URL: https://github.com/apache/incubator-mxnet/issues/15160
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] aaronmarkham opened a new issue #15376: beta website doesn't render html tags for images

2019-06-26 Thread GitBox
aaronmarkham opened a new issue #15376: beta website doesn't render html tags 
for images
URL: https://github.com/apache/incubator-mxnet/issues/15376
 
 
   ## Description
   
   When trying to fix https://github.com/mli/new-docs/pull/123
   I found that HTML code that should render fine in a markdown file doesn't 
get converted when building the site. However, video HTML code worked fine.
   There's no warning or error; it just skips those lines of code.
   
   Seems like a bug in the new site's build.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] mxnet-label-bot commented on issue #15376: beta website doesn't render html tags for images

2019-06-26 Thread GitBox
mxnet-label-bot commented on issue #15376: beta website doesn't render html 
tags for images
URL: 
https://github.com/apache/incubator-mxnet/issues/15376#issuecomment-506096772
 
 
   Hey, this is the MXNet Label Bot. 
Thank you for submitting the issue! I will try and suggest some labels so 
that the appropriate MXNet community members can help resolve it. 
Here are my recommended labels: Doc


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] larroy commented on issue #15285: [WIP] Graph dumper

2019-06-26 Thread GitBox
larroy commented on issue #15285: [WIP] Graph dumper
URL: https://github.com/apache/incubator-mxnet/pull/15285#issuecomment-506096295
 
 
   First gradient, and second gradient:
   
   ```
   Forward graph: 
   digraph G {
 "var0" -> "log10 node_0"
   }
   Backward graph: 
   digraph G {
 "head grad #0" -> "_backward_log10 node_0_backward"
 "var0" -> "log10 node_0"
 "var0" -> "_backward_log10 node_0_backward"
   }
   Forward graph: 
   digraph G {
 "head grad #0" -> "_backward_log10 node_1"
 "var0" -> "_backward_log10 node_1"
   }
   Backward graph: 
   digraph G {
 "head grad #0" -> "elemwise_mul node_1_backward_grad_grad_inp"
 "head grad #0" -> "_backward_log10 node_1"
 "var0" -> "_backward_log10 node_1"
 "var0" -> "reciprocal node_1_dlogx"
 "_backward_log10 node_1" -> "elemwise_mul node_1_d2ydx2_mid"
 "reciprocal node_1_dlogx" -> "elemwise_mul node_1_d2ydx2_mid"
 "elemwise_mul node_1_d2ydx2_mid" -> "negative node_1_d2ydx2"
 "negative node_1_d2ydx2" -> "elemwise_mul node_1_backward_grad_grad_inp"
   }
   ```
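   For anyone tracing the second graph above, the derivatives it encodes are (a 
worked-out check, with head denoting the incoming output gradient):
   ```latex
   \frac{d}{dx}\log_{10} x = \frac{1}{x \ln 10},
   \qquad
   \frac{d^{2}}{dx^{2}}\log_{10} x = -\frac{1}{x^{2} \ln 10}
   ```
   so `_backward_log10` yields head/(x·ln 10); multiplying by `reciprocal(x)` and 
negating gives -head/(x²·ln 10), i.e. head times the second derivative, which is 
what the `elemwise_mul`/`negative` chain computes before being multiplied by the 
incoming head gradient.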


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] larroy commented on issue #15288: [MXNET-978] Higher order gradient for sigmoid

2019-06-26 Thread GitBox
larroy commented on issue #15288: [MXNET-978] Higher order gradient for sigmoid
URL: https://github.com/apache/incubator-mxnet/pull/15288#issuecomment-506095186
 
 
   Can we merge this?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] larroy commented on issue #15285: [WIP] Graph dumper

2019-06-26 Thread GitBox
larroy commented on issue #15285: [WIP] Graph dumper
URL: https://github.com/apache/incubator-mxnet/pull/15285#issuecomment-506093596
 
 
   Named the head grads
   ![graphviz 
(3)](https://user-images.githubusercontent.com/928489/60224819-f5be7100-9838-11e9-89dd-bb762b3e64f0.png)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] larroy commented on a change in pull request #15285: [WIP] Graph dumper

2019-06-26 Thread GitBox
larroy commented on a change in pull request #15285: [WIP] Graph dumper
URL: https://github.com/apache/incubator-mxnet/pull/15285#discussion_r297912617
 
 

 ##
 File path: src/common/directed_graph.h
 ##
 @@ -0,0 +1,201 @@
+  /*
 
 Review comment:
   Actually I implemented this before I knew the json dumping code, I 
understand there's a bit of functionality duplication, on the other hand we can 
customize without touching tvm.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] larroy edited a comment on issue #15285: [WIP] Graph dumper

2019-06-26 Thread GitBox
larroy edited a comment on issue #15285: [WIP] Graph dumper
URL: https://github.com/apache/incubator-mxnet/pull/15285#issuecomment-506086935
 
 
   @apeforest @kshitij12345 sounds familiar?
   
   http://bit.ly/2X6QFYS
   ![graphviz 
(2)](https://user-images.githubusercontent.com/928489/60224502-898f3d80-9837-11e9-898a-97261f24f705.png)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] larroy edited a comment on issue #15285: [WIP] Graph dumper

2019-06-26 Thread GitBox
larroy edited a comment on issue #15285: [WIP] Graph dumper
URL: https://github.com/apache/incubator-mxnet/pull/15285#issuecomment-506086935
 
 
   @apeforest @kshitij12345 sounds familiar?
   
   ![graphviz 
(2)](https://user-images.githubusercontent.com/928489/60224502-898f3d80-9837-11e9-898a-97261f24f705.png)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[incubator-mxnet] branch master updated: [DOC] Clarify that global pooling is going to reset padding (#15269)

2019-06-26 Thread aaronmarkham
This is an automated email from the ASF dual-hosted git repository.

aaronmarkham pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new ba30644  [DOC] Clarify that global pooling is going to reset padding 
(#15269)
ba30644 is described below

commit ba30644612357930fd4543f01800d89be7963f8e
Author: Pedro Larroy 
AuthorDate: Wed Jun 26 17:13:53 2019 -0700

[DOC] Clarify that global pooling is going to reset padding (#15269)

This behaviour changed from older MXNet versions in which global pooling
would consider padding. This clarifies the user documentation.

See also #14421
---
 src/operator/nn/pooling.cc | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/src/operator/nn/pooling.cc b/src/operator/nn/pooling.cc
index 8705577..41a486e 100644
--- a/src/operator/nn/pooling.cc
+++ b/src/operator/nn/pooling.cc
@@ -389,8 +389,8 @@ The definition of *f* depends on ``pooling_convention``, 
which has two options:
 
 f(x, k, p, s) = ceil((x+2*p-k)/s)+1
 
-But ``global_pool`` is set to be true, then do a global pooling, namely reset
-``kernel=(height, width)``.
+When ``global_pool`` is set to be true, then global pooling is performed. It 
will reset
+``kernel=(height, width)`` and set the appropiate padding to 0.
 
 Three pooling options are supported by ``pool_type``:
 



[GitHub] [incubator-mxnet] aaronmarkham commented on a change in pull request #15082: Update CUDA 10.1 doc

2019-06-26 Thread GitBox
aaronmarkham commented on a change in pull request #15082: Update CUDA 10.1 doc
URL: https://github.com/apache/incubator-mxnet/pull/15082#discussion_r297913456
 
 

 ##
 File path: tools/pip/doc/CU101_ADDITIONAL.md
 ##
 @@ -0,0 +1,46 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Prerequisites
+-
+This package supports Linux and Windows platforms. You may also want to check:
+- [mxnet-cu101mkl](https://pypi.python.org/pypi/mxnet-cu101mkl/) with 
CUDA-10.1 support and MKLDNN support.
+- [mxnet-cu100](https://pypi.python.org/pypi/mxnet-cu100/) with CUDA-10.0 
support.
+- [mxnet-cu100mkl](https://pypi.python.org/pypi/mxnet-cu100mkl/) with 
CUDA-10.0 support and MKLDNN support.
+- [mxnet-cu92](https://pypi.python.org/pypi/mxnet-cu92/) with CUDA-9.2 support.
+- [mxnet-cu92mkl](https://pypi.python.org/pypi/mxnet-cu92mkl/) with CUDA-9.2 
support and MKLDNN support.
+- [mxnet-cu91](https://pypi.python.org/pypi/mxnet-cu91/) with CUDA-9.1 support.
+- [mxnet-cu91mkl](https://pypi.python.org/pypi/mxnet-cu91mkl/) with CUDA-9.1 
support and MKLDNN support.
+- [mxnet-cu90](https://pypi.python.org/pypi/mxnet-cu90/) with CUDA-9.0 support.
+- [mxnet-cu90mkl](https://pypi.python.org/pypi/mxnet-cu90mkl/) with CUDA-9.0 
support and MKLDNN support.
+- [mxnet-cu80](https://pypi.python.org/pypi/mxnet-cu80/) with CUDA-8.0 support.
+- [mxnet-cu80mkl](https://pypi.python.org/pypi/mxnet-cu80mkl/) with CUDA-8.0 
support and MKLDNN support.
+- [mxnet-cu75](https://pypi.python.org/pypi/mxnet-cu75/) with CUDA-7.5 
support. Note that we doesn't maintain CUDA 7.5 anymore.
+- [mxnet-cu75mkl](https://pypi.python.org/pypi/mxnet-cu75mkl/) with CUDA-7.5 
support and MKLDNN support. Note that we doesn't maintain CUDA 7.5 anymore.
 
 Review comment:
   ```suggestion
   - [mxnet-cu75mkl](https://pypi.python.org/pypi/mxnet-cu75mkl/) with CUDA-7.5 
support and MKLDNN support. Note that CUDA 7.5 is no longer maintained.
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] aaronmarkham commented on a change in pull request #15082: Update CUDA 10.1 doc

2019-06-26 Thread GitBox
aaronmarkham commented on a change in pull request #15082: Update CUDA 10.1 doc
URL: https://github.com/apache/incubator-mxnet/pull/15082#discussion_r297913424
 
 

 ##
 File path: tools/pip/doc/CU101_ADDITIONAL.md
 ##
 @@ -0,0 +1,46 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Prerequisites
+-
+This package supports Linux and Windows platforms. You may also want to check:
+- [mxnet-cu101mkl](https://pypi.python.org/pypi/mxnet-cu101mkl/) with 
CUDA-10.1 support and MKLDNN support.
+- [mxnet-cu100](https://pypi.python.org/pypi/mxnet-cu100/) with CUDA-10.0 
support.
+- [mxnet-cu100mkl](https://pypi.python.org/pypi/mxnet-cu100mkl/) with 
CUDA-10.0 support and MKLDNN support.
+- [mxnet-cu92](https://pypi.python.org/pypi/mxnet-cu92/) with CUDA-9.2 support.
+- [mxnet-cu92mkl](https://pypi.python.org/pypi/mxnet-cu92mkl/) with CUDA-9.2 
support and MKLDNN support.
+- [mxnet-cu91](https://pypi.python.org/pypi/mxnet-cu91/) with CUDA-9.1 support.
+- [mxnet-cu91mkl](https://pypi.python.org/pypi/mxnet-cu91mkl/) with CUDA-9.1 
support and MKLDNN support.
+- [mxnet-cu90](https://pypi.python.org/pypi/mxnet-cu90/) with CUDA-9.0 support.
+- [mxnet-cu90mkl](https://pypi.python.org/pypi/mxnet-cu90mkl/) with CUDA-9.0 
support and MKLDNN support.
+- [mxnet-cu80](https://pypi.python.org/pypi/mxnet-cu80/) with CUDA-8.0 support.
+- [mxnet-cu80mkl](https://pypi.python.org/pypi/mxnet-cu80mkl/) with CUDA-8.0 
support and MKLDNN support.
+- [mxnet-cu75](https://pypi.python.org/pypi/mxnet-cu75/) with CUDA-7.5 
support. Note that we doesn't maintain CUDA 7.5 anymore.
 
 Review comment:
   ```suggestion
   - [mxnet-cu75](https://pypi.python.org/pypi/mxnet-cu75/) with CUDA-7.5 
support. Note that CUDA 7.5 is no longer maintained.
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] aaronmarkham merged pull request #15269: [DOC] Clarify that global pooling is going to reset padding

2019-06-26 Thread GitBox
aaronmarkham merged pull request #15269: [DOC] Clarify that global pooling is 
going to reset padding
URL: https://github.com/apache/incubator-mxnet/pull/15269
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] aaronmarkham commented on a change in pull request #15082: Update CUDA 10.1 doc

2019-06-26 Thread GitBox
aaronmarkham commented on a change in pull request #15082: Update CUDA 10.1 doc
URL: https://github.com/apache/incubator-mxnet/pull/15082#discussion_r297913351
 
 

 ##
 File path: tools/pip/doc/CU101MKL_ADDITIONAL.md 
 ##
 @@ -0,0 +1,46 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Prerequisites
+-
+This package supports Linux and Windows platforms. You may also want to check:
+- [mxnet-cu101](https://pypi.python.org/pypi/mxnet-cu101/) with CUDA-10.1 
support.
+- [mxnet-cu100](https://pypi.python.org/pypi/mxnet-cu100/) with CUDA-10.0 
support.
+- [mxnet-cu100mkl](https://pypi.python.org/pypi/mxnet-cu100mkl/) with 
CUDA-10.0 support and MKLDNN support.
+- [mxnet-cu92](https://pypi.python.org/pypi/mxnet-cu92/) with CUDA-9.2 support.
+- [mxnet-cu92mkl](https://pypi.python.org/pypi/mxnet-cu92mkl/) with CUDA-9.2 
support and MKLDNN support.
+- [mxnet-cu91](https://pypi.python.org/pypi/mxnet-cu91/) with CUDA-9.1 support.
+- [mxnet-cu91mkl](https://pypi.python.org/pypi/mxnet-cu91mkl/) with CUDA-9.1 
support and MKLDNN support.
+- [mxnet-cu90](https://pypi.python.org/pypi/mxnet-cu90/) with CUDA-9.0 support.
+- [mxnet-cu90mkl](https://pypi.python.org/pypi/mxnet-cu90mkl/) with CUDA-9.0 
support and MKLDNN support.
+- [mxnet-cu80](https://pypi.python.org/pypi/mxnet-cu80/) with CUDA-8.0 support.
+- [mxnet-cu80mkl](https://pypi.python.org/pypi/mxnet-cu80mkl/) with CUDA-8.0 
support and MKLDNN support.
+- [mxnet-cu75](https://pypi.python.org/pypi/mxnet-cu75/) with CUDA-7.5 
support. Note that we doesn't maintain CUDA 7.5 anymore.
+- [mxnet-cu75mkl](https://pypi.python.org/pypi/mxnet-cu75mkl/) with CUDA-7.5 
support and MKLDNN support. Note that we doesn't maintain CUDA 7.5 anymore.
 
 Review comment:
   ```suggestion
   - [mxnet-cu75mkl](https://pypi.python.org/pypi/mxnet-cu75mkl/) with CUDA-7.5 
support and MKLDNN support. Note that CUDA 7.5 is no longer maintained.
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] aaronmarkham commented on a change in pull request #15082: Update CUDA 10.1 doc

2019-06-26 Thread GitBox
aaronmarkham commented on a change in pull request #15082: Update CUDA 10.1 doc
URL: https://github.com/apache/incubator-mxnet/pull/15082#discussion_r297913646
 
 

 ##
 File path: tools/pip/doc/CU101_ADDITIONAL.md
 ##
 @@ -0,0 +1,46 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Prerequisites
+-
+This package supports Linux and Windows platforms. You may also want to check:
+- [mxnet-cu101mkl](https://pypi.python.org/pypi/mxnet-cu101mkl/) with 
CUDA-10.1 support and MKLDNN support.
+- [mxnet-cu100](https://pypi.python.org/pypi/mxnet-cu100/) with CUDA-10.0 
support.
+- [mxnet-cu100mkl](https://pypi.python.org/pypi/mxnet-cu100mkl/) with 
CUDA-10.0 support and MKLDNN support.
+- [mxnet-cu92](https://pypi.python.org/pypi/mxnet-cu92/) with CUDA-9.2 support.
+- [mxnet-cu92mkl](https://pypi.python.org/pypi/mxnet-cu92mkl/) with CUDA-9.2 
support and MKLDNN support.
+- [mxnet-cu91](https://pypi.python.org/pypi/mxnet-cu91/) with CUDA-9.1 support.
+- [mxnet-cu91mkl](https://pypi.python.org/pypi/mxnet-cu91mkl/) with CUDA-9.1 
support and MKLDNN support.
+- [mxnet-cu90](https://pypi.python.org/pypi/mxnet-cu90/) with CUDA-9.0 support.
+- [mxnet-cu90mkl](https://pypi.python.org/pypi/mxnet-cu90mkl/) with CUDA-9.0 
support and MKLDNN support.
+- [mxnet-cu80](https://pypi.python.org/pypi/mxnet-cu80/) with CUDA-8.0 support.
+- [mxnet-cu80mkl](https://pypi.python.org/pypi/mxnet-cu80mkl/) with CUDA-8.0 
support and MKLDNN support.
+- [mxnet-cu75](https://pypi.python.org/pypi/mxnet-cu75/) with CUDA-7.5 
support. Note that we doesn't maintain CUDA 7.5 anymore.
+- [mxnet-cu75mkl](https://pypi.python.org/pypi/mxnet-cu75mkl/) with CUDA-7.5 
support and MKLDNN support. Note that we doesn't maintain CUDA 7.5 anymore.
+- [mxnet-mkl](https://pypi.python.org/pypi/mxnet-mkl/) with MKLDNN support.
+- [mxnet](https://pypi.python.org/pypi/mxnet/) without MKLDNN and CUDA support.
+
+To download CUDA, check [CUDA 
download](https://developer.nvidia.com/cuda-downloads). For more instructions, 
check [CUDA Toolkit online 
documentation](http://docs.nvidia.com/cuda/index.html).
+
+To install for other platforms (e.g. Windows, Raspberry Pi/ARM) or other 
versions of CUDA, check [Installing 
MXNet](https://mxnet.incubator.apache.org/versions/master/install/index.html) 
for instructions on building from source.
 
 Review comment:
   ```suggestion
   To install for other platforms (e.g. Windows, Raspberry Pi/ARM) or other 
versions of CUDA, check [Installing 
MXNet](https://mxnet.incubator.apache.org/versions/master/install/index.html?platform=Devices=Python=CPU)
 for instructions on building from source.
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] aaronmarkham commented on a change in pull request #15082: Update CUDA 10.1 doc

2019-06-26 Thread GitBox
aaronmarkham commented on a change in pull request #15082: Update CUDA 10.1 doc
URL: https://github.com/apache/incubator-mxnet/pull/15082#discussion_r297913254
 
 

 ##
 File path: tools/pip/doc/CU101MKL_ADDITIONAL.md 
 ##
 @@ -0,0 +1,46 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Prerequisites
+-
+This package supports Linux and Windows platforms. You may also want to check:
+- [mxnet-cu101](https://pypi.python.org/pypi/mxnet-cu101/) with CUDA-10.1 
support.
+- [mxnet-cu100](https://pypi.python.org/pypi/mxnet-cu100/) with CUDA-10.0 
support.
+- [mxnet-cu100mkl](https://pypi.python.org/pypi/mxnet-cu100mkl/) with 
CUDA-10.0 support and MKLDNN support.
+- [mxnet-cu92](https://pypi.python.org/pypi/mxnet-cu92/) with CUDA-9.2 support.
+- [mxnet-cu92mkl](https://pypi.python.org/pypi/mxnet-cu92mkl/) with CUDA-9.2 
support and MKLDNN support.
+- [mxnet-cu91](https://pypi.python.org/pypi/mxnet-cu91/) with CUDA-9.1 support.
+- [mxnet-cu91mkl](https://pypi.python.org/pypi/mxnet-cu91mkl/) with CUDA-9.1 
support and MKLDNN support.
+- [mxnet-cu90](https://pypi.python.org/pypi/mxnet-cu90/) with CUDA-9.0 support.
+- [mxnet-cu90mkl](https://pypi.python.org/pypi/mxnet-cu90mkl/) with CUDA-9.0 
support and MKLDNN support.
+- [mxnet-cu80](https://pypi.python.org/pypi/mxnet-cu80/) with CUDA-8.0 support.
+- [mxnet-cu80mkl](https://pypi.python.org/pypi/mxnet-cu80mkl/) with CUDA-8.0 
support and MKLDNN support.
+- [mxnet-cu75](https://pypi.python.org/pypi/mxnet-cu75/) with CUDA-7.5 
support. Note that we doesn't maintain CUDA 7.5 anymore.
 
 Review comment:
   ```suggestion
   - [mxnet-cu75](https://pypi.python.org/pypi/mxnet-cu75/) with CUDA-7.5 
support. Note that CUDA 7.5 is no longer maintained.
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] larroy commented on a change in pull request #15285: [WIP] Graph dumper

2019-06-26 Thread GitBox
larroy commented on a change in pull request #15285: [WIP] Graph dumper
URL: https://github.com/apache/incubator-mxnet/pull/15285#discussion_r297912617
 
 

 ##
 File path: src/common/directed_graph.h
 ##
 @@ -0,0 +1,201 @@
+  /*
 
 Review comment:
   I don't think this is practical, as it uses Symbol and needs additional 
Python. I think in this case my code does the job in pure C++ without any need 
for Python. After considering your suggestion, I think it's best to use this C++ 
approach. The graph is generic, reusable, and tested, so I don't see a big 
problem with this.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] larroy edited a comment on issue #15285: [WIP] Graph dumper

2019-06-26 Thread GitBox
larroy edited a comment on issue #15285: [WIP] Graph dumper
URL: https://github.com/apache/incubator-mxnet/pull/15285#issuecomment-506086935
 
 
   @apeforest @kshitij12345 sounds familiar?
   
   http://bit.ly/2X6QFYS
   
![graphviz](https://user-images.githubusercontent.com/928489/60222926-f9032e00-9833-11e9-913b-2a3e96b4c143.png)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] larroy edited a comment on issue #15285: [WIP] Graph dumper

2019-06-26 Thread GitBox
larroy edited a comment on issue #15285: [WIP] Graph dumper
URL: https://github.com/apache/incubator-mxnet/pull/15285#issuecomment-506086935
 
 
   @apeforest 
   
   http://bit.ly/2X6QFYS
   
![graphviz](https://user-images.githubusercontent.com/928489/60222926-f9032e00-9833-11e9-913b-2a3e96b4c143.png)
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] larroy commented on issue #15285: [WIP] Graph dumper

2019-06-26 Thread GitBox
larroy commented on issue #15285: [WIP] Graph dumper
URL: https://github.com/apache/incubator-mxnet/pull/15285#issuecomment-506086935
 
 
   @apeforest 
   
   http://bit.ly/2X6QFYS


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] sandeep-krishnamurthy commented on issue #15070: CustomOp does not show up in profiler output

2019-06-26 Thread GitBox
sandeep-krishnamurthy commented on issue #15070: CustomOp does not show up in 
profiler output
URL: 
https://github.com/apache/incubator-mxnet/issues/15070#issuecomment-506084752
 
 
   Resolving.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] sandeep-krishnamurthy closed issue #15070: CustomOp does not show up in profiler output

2019-06-26 Thread GitBox
sandeep-krishnamurthy closed issue #15070: CustomOp does not show up in 
profiler output
URL: https://github.com/apache/incubator-mxnet/issues/15070
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] sandeep-krishnamurthy commented on issue #15080: tensorrt.tensorrt_bind results NDArray with NaN

2019-06-26 Thread GitBox
sandeep-krishnamurthy commented on issue #15080: tensorrt.tensorrt_bind results 
NDArray with NaN
URL: 
https://github.com/apache/incubator-mxnet/issues/15080#issuecomment-506084563
 
 
   @KellenSunderland - Can you please help here?
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] sandeep-krishnamurthy closed issue #13135: [Python] CUDNN error from 3D deconvolution

2019-06-26 Thread GitBox
sandeep-krishnamurthy closed issue #13135: [Python] CUDNN error from 3D 
deconvolution
URL: https://github.com/apache/incubator-mxnet/issues/13135
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] sandeep-krishnamurthy commented on issue #13135: [Python] CUDNN error from 3D deconvolution

2019-06-26 Thread GitBox
sandeep-krishnamurthy commented on issue #13135: [Python] CUDNN error from 3D 
deconvolution
URL: 
https://github.com/apache/incubator-mxnet/issues/13135#issuecomment-506083366
 
 
   Resolving, as the issue is fixed in the latest master. Please reopen if you still 
find any issues.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] sandeep-krishnamurthy commented on issue #15026: [MXNET-14421] Make global pooling backwards compatible

2019-06-26 Thread GitBox
sandeep-krishnamurthy commented on issue #15026: [MXNET-14421] Make global 
pooling backwards compatible
URL: https://github.com/apache/incubator-mxnet/pull/15026#issuecomment-506080050
 
 
   Thank you @jmerkow for all the effort in debugging the root cause and 
putting up this PR. This PR and the issue are a great reference for people facing 
a similar issue. As you rightly mentioned, I think we cannot merge this PR 
because that would amount to keeping backward compatibility with a bug. We should 
probably suggest this workaround to users instead and encourage them to upgrade 
to the latest version of MXNet.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] access2rohit commented on issue #15374: NightlyTestsForBinaries tutorials test broken

2019-06-26 Thread GitBox
access2rohit commented on issue #15374: NightlyTestsForBinaries tutorials test 
broken
URL: 
https://github.com/apache/incubator-mxnet/issues/15374#issuecomment-506079591
 
 
   @lebeg I will merge my PR once the Unix tests pass. Currently they are failing 
on the Unix CI/CD build, which is independent of my code change.
   
http://jenkins.mxnet-ci.amazon-ml.com/blue/organizations/jenkins/mxnet-validation%2Funix-gpu/detail/PR-15360/3/pipeline


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] pengzhao-intel commented on issue #15298: Fix Cached_op with static_shape=true

2019-06-26 Thread GitBox
pengzhao-intel commented on issue #15298: Fix Cached_op with static_shape=true
URL: https://github.com/apache/incubator-mxnet/pull/15298#issuecomment-506076794
 
 
   @ZhennanQin Please verify the performance of this PR with our internal tests 
and NLP tests.
   If everything is OK, I will merge this soon.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] aaronmarkham commented on issue #15269: [DOC] Clarify that global pooling is going to reset padding

2019-06-26 Thread GitBox
aaronmarkham commented on issue #15269: [DOC] Clarify that global pooling is 
going to reset padding
URL: https://github.com/apache/incubator-mxnet/pull/15269#issuecomment-506076555
 
 
   @larroy Please ask another person to review... then I can merge it.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] larroy commented on issue #15295: Move gluon model zoo tests to nightly

2019-06-26 Thread GitBox
larroy commented on issue #15295: Move gluon model zoo tests to nightly
URL: 
https://github.com/apache/incubator-mxnet/issues/15295#issuecomment-506074002
 
 
   https://github.com/apache/incubator-mxnet/issues/14636


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-mxnet] larroy commented on issue #15269: [DOC] Clarify that global pooling is going to reset padding

2019-06-26 Thread GitBox
larroy commented on issue #15269: [DOC] Clarify that global pooling is going to 
reset padding
URL: https://github.com/apache/incubator-mxnet/pull/15269#issuecomment-506073825
 
 
   @mxnet-label-bot update [Doc, pr-awaiting-merge]


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[incubator-mxnet] branch master updated (51acd4d -> 009907a)

2019-06-26 Thread patriczhao
This is an automated email from the ASF dual-hosted git repository.

patriczhao pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 51acd4d  [MXNET-1086] added sub and mul to ONNX->TensorRT conversion 
(#15344)
 add 009907a  [C++] Improve inference script to support benchmark on 
Imagenet (#15164)

No new revisions were added by this update.

Summary of changes:
 cpp-package/example/inference/README.md| 108 +++-
 .../example/inference/imagenet_inference.cpp   | 610 +
 .../example/inference/inception_inference.cpp  | 444 ---
 .../inference/unit_test_imagenet_inference.sh  |  63 +++
 .../inference/unit_test_inception_inference.sh |  42 --
 cpp-package/example/mlp_csv.cpp|   1 +
 cpp-package/include/mxnet-cpp/initializer.h|  50 ++
 cpp-package/include/mxnet-cpp/io.h |   2 +
 cpp-package/tests/ci_test.sh   |   4 +-
 example/quantization/README.md |  18 +-
 10 files changed, 830 insertions(+), 512 deletions(-)
 create mode 100644 cpp-package/example/inference/imagenet_inference.cpp
 delete mode 100644 cpp-package/example/inference/inception_inference.cpp
 create mode 100755 
cpp-package/example/inference/unit_test_imagenet_inference.sh
 delete mode 100755 
cpp-package/example/inference/unit_test_inception_inference.sh



[GitHub] [incubator-mxnet] pengzhao-intel merged pull request #15164: [C++] Improve inference script to support benchmark on Imagenet

2019-06-26 Thread GitBox
pengzhao-intel merged pull request #15164: [C++] Improve inference script to 
support benchmark on Imagenet
URL: https://github.com/apache/incubator-mxnet/pull/15164
 
 
   




[GitHub] [incubator-mxnet] pengzhao-intel commented on issue #15164: [C++] Improve inference script to support benchmark on Imagenet

2019-06-26 Thread GitBox
pengzhao-intel commented on issue #15164: [C++] Improve inference script to 
support benchmark on Imagenet
URL: https://github.com/apache/incubator-mxnet/pull/15164#issuecomment-506073200
 
 
   Thanks for the nice PR; merging now.




[GitHub] [incubator-mxnet] larroy edited a comment on issue #15367: [RFC] Add public functions to get list of registered operators and arguments

2019-06-26 Thread GitBox
larroy edited a comment on issue #15367: [RFC] Add public functions to get list 
of registered operators and arguments
URL: 
https://github.com/apache/incubator-mxnet/issues/15367#issuecomment-506066973
 
 
   Added a PR that implements this: 
https://github.com/apache/incubator-mxnet/pull/15364




[GitHub] [incubator-mxnet] larroy commented on issue #15367: [RFC] Add public functions to get list of registered operators and arguments

2019-06-26 Thread GitBox
larroy commented on issue #15367: [RFC] Add public functions to get list of 
registered operators and arguments
URL: 
https://github.com/apache/incubator-mxnet/issues/15367#issuecomment-506066973
 
 
   https://github.com/apache/incubator-mxnet/pull/15364




[GitHub] [incubator-mxnet] larroy commented on issue #15364: Expose get_all_registered_operators and get_operator_arguments in the…

2019-06-26 Thread GitBox
larroy commented on issue #15364: Expose get_all_registered_operators and 
get_operator_arguments in the…
URL: https://github.com/apache/incubator-mxnet/pull/15364#issuecomment-506067107
 
 
   @marcoabreu addressed your comments.




[GitHub] [incubator-mxnet] larroy commented on a change in pull request #15364: Expose get_all_registered_operators and get_operator_arguments in the…

2019-06-26 Thread GitBox
larroy commented on a change in pull request #15364: Expose 
get_all_registered_operators and get_operator_arguments in the…
URL: https://github.com/apache/incubator-mxnet/pull/15364#discussion_r297887004
 
 

 ##
 File path: tests/python/unittest/test_operator.py
 ##
 @@ -8655,6 +8656,17 @@ def test_add_n():
 assert_almost_equal(rslt.asnumpy(), add_n_rslt.asnumpy(), atol=1e-5)
 
 
+def test_get_all_registered_operators():
+ops = get_all_registered_operators()
+ok_(isinstance(ops, list))
+ok_(len(ops) > 0)
 
 Review comment:
   OK, I thought that in the future we might build dynamically with only the 
operators needed, so I didn't want to make this assumption. For now we can 
assume that some basic op will be present; see the sketch below.
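
   A minimal sketch of such a check, assuming the get_all_registered_operators 
helper exposed by this PR (#15364); asserting 'Convolution' is an illustrative 
assumption about which "basic op" is present, not something the PR guarantees.

       # Verify the registry returns a non-empty list and that a commonly
       # built operator is listed. 'Convolution' is an assumed example.
       import mxnet as mx

       ops = mx.operator.get_all_registered_operators()
       assert isinstance(ops, list) and len(ops) > 0
       assert 'Convolution' in ops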




[GitHub] [incubator-mxnet] larroy commented on a change in pull request #15364: Expose get_all_registered_operators and get_operator_arguments in the…

2019-06-26 Thread GitBox
larroy commented on a change in pull request #15364: Expose 
get_all_registered_operators and get_operator_arguments in the…
URL: https://github.com/apache/incubator-mxnet/pull/15364#discussion_r297887040
 
 

 ##
 File path: tests/python/unittest/test_operator.py
 ##
 @@ -8655,6 +8656,17 @@ def test_add_n():
 assert_almost_equal(rslt.asnumpy(), add_n_rslt.asnumpy(), atol=1e-5)
 
 
+def test_get_all_registered_operators():
+ops = get_all_registered_operators()
+ok_(isinstance(ops, list))
+ok_(len(ops) > 0)
+
+
+def test_get_operator_arguments():
+operator_arguments = 
get_operator_arguments(mx.operator.get_all_registered_operators()[0])
+ok_(isinstance(operator_arguments, OperatorArguments))
 
 Review comment:
   Ok
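
   A rough usage sketch, assuming the get_operator_arguments helper and 
OperatorArguments result type proposed in this PR; the exact fields of the 
returned object are not assumed here.

       # Inspect the argument metadata of one registered operator using the
       # API under review in #15364.
       import mxnet as mx

       op_name = mx.operator.get_all_registered_operators()[0]
       args = mx.operator.get_operator_arguments(op_name)
       print(op_name, args)  # argument metadata as exposed by the new helper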




[GitHub] [incubator-mxnet] apeforest commented on issue #15374: NightlyTestsForBinaries tutorials test broken

2019-06-26 Thread GitBox
apeforest commented on issue #15374: NightlyTestsForBinaries tutorials test 
broken
URL: 
https://github.com/apache/incubator-mxnet/issues/15374#issuecomment-506063524
 
 
   https://github.com/apache/incubator-mxnet/pull/15360 will hopefully fix this.




[GitHub] [incubator-mxnet] larroy commented on issue #15167: [WIP] Pointwise fusion for GPU

2019-06-26 Thread GitBox
larroy commented on issue #15167: [WIP] Pointwise fusion for GPU
URL: https://github.com/apache/incubator-mxnet/pull/15167#issuecomment-506062730
 
 
   Thanks for addressing the comments.




[GitHub] [incubator-mxnet] larroy commented on issue #14779: Fully connected, higher order grad

2019-06-26 Thread GitBox
larroy commented on issue #14779: Fully connected, higher order grad
URL: https://github.com/apache/incubator-mxnet/pull/14779#issuecomment-506062378
 
 
   @apeforest @kshitij12345 @sxjscience thanks for the review. Is this good to 
go?




[GitHub] [incubator-mxnet] larroy commented on a change in pull request #14779: Fully connected, higher order grad

2019-06-26 Thread GitBox
larroy commented on a change in pull request #14779: Fully connected, higher 
order grad
URL: https://github.com/apache/incubator-mxnet/pull/14779#discussion_r297885337
 
 

 ##
 File path: tests/python/unittest/test_higher_order_grad.py
 ##
 @@ -129,6 +131,44 @@ def check_second_order_unary(x, op, grad_grad_op):
 # Validate the gradients.
 assert_almost_equal(expected_grad_grad, x.grad.asnumpy())
 
+class RandomShapes(object):
+def __init__(self, dim):
+self.dim = dim
+self.curdim = 1
+
+def __iter__(self):
+return self
+
+def next(self):
+return self.__next__()
+
+def __next__(self):
+if self.curdim > self.dim:
+raise StopIteration
+shape = rand_shape_nd(self.curdim)
+print(shape)
+x = nd.random.normal(shape=shape)
+self.curdim += 1
+return x
+
+
+@with_seed()
+def test_dense_backward():
 
 Review comment:
   Makes sense, but for convenience I prefer to test with Gluon; I don't think 
it's worth the complexity and extra work right now. Let me know if you 
disagree. Extending that argument, the test should be in C++, yet we test this 
way all the time. (A Gluon-based sketch follows below.)
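
   A rough Gluon-based sketch of what is meant above, following the 
autograd.grad(..., create_graph=True) pattern used by the existing tests in 
this file; whether the second backward pass succeeds for Dense depends on the 
FullyConnected higher-order gradient added in this PR, so treat it as 
illustrative only.

       # Second-order gradient of a Dense block w.r.t. its input. The shape
       # and the single output unit are arbitrary choices for illustration.
       import mxnet as mx
       from mxnet import nd, autograd, gluon

       net = gluon.nn.Dense(1)
       net.initialize()
       x = nd.random.normal(shape=(4, 3))
       x.attach_grad()
       with autograd.record():
           y = net(x)
           # First-order dy/dx, kept in the graph so it can be differentiated again.
           x_grad = autograd.grad(heads=y, variables=x,
                                  create_graph=True, retain_graph=True)[0]
       x_grad.backward()  # second-order gradient accumulates into x.grad
       print(x.grad)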




[GitHub] [incubator-mxnet] larroy commented on a change in pull request #14779: Fully connected, higher order grad

2019-06-26 Thread GitBox
larroy commented on a change in pull request #14779: Fully connected, higher 
order grad
URL: https://github.com/apache/incubator-mxnet/pull/14779#discussion_r297885337
 
 

 ##
 File path: tests/python/unittest/test_higher_order_grad.py
 ##
 @@ -129,6 +131,44 @@ def check_second_order_unary(x, op, grad_grad_op):
 # Validate the gradients.
 assert_almost_equal(expected_grad_grad, x.grad.asnumpy())
 
+class RandomShapes(object):
+def __init__(self, dim):
+self.dim = dim
+self.curdim = 1
+
+def __iter__(self):
+return self
+
+def next(self):
+return self.__next__()
+
+def __next__(self):
+if self.curdim > self.dim:
+raise StopIteration
+shape = rand_shape_nd(self.curdim)
+print(shape)
+x = nd.random.normal(shape=shape)
+self.curdim += 1
+return x
+
+
+@with_seed()
+def test_dense_backward():
 
 Review comment:
   Makes sense, but for convenience I prefer to test with Gluon; I don't think 
it's worth the complexity and extra work right now. Let me know if you 
disagree.




[GitHub] [incubator-mxnet] apeforest commented on issue #15360: Revert default return type for indices in argsort() and topk() to fp32

2019-06-26 Thread GitBox
apeforest commented on issue #15360: Revert default return type for indices in 
argsort() and topk() to fp32
URL: https://github.com/apache/incubator-mxnet/pull/15360#issuecomment-506057166
 
 
   @frankfliu Could you please point to the test that failed after 
@access2rohit's PR #15170?
   
   @access2rohit Please add this API change as a TODO for MXNet 2.0. Thanks. 
(A sketch of pinning the index dtype explicitly is included below.)
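
   For anyone depending on the index dtype downstream, a small sketch of 
requesting it explicitly instead of relying on the default this PR reverts; 
the dtype argument on argsort/topk is assumed to be the one discussed around 
#15170.

       # Pin the index dtype explicitly so behaviour does not change with the
       # default. The dtype argument is an assumption based on #15170/#15360.
       import mxnet as mx

       data = mx.nd.array([[0.3, 0.2, 0.4],
                           [0.1, 0.3, 0.2]])
       idx_f32 = mx.nd.argsort(data, axis=-1, dtype='float32')
       idx_i64 = mx.nd.topk(data, k=2, ret_typ='indices', dtype='int64')
       print(idx_f32, idx_i64)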



