[incubator-mxnet] branch v1.5.x updated: Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND` (#15751) (#15792)

2019-08-07 Thread wkcn
This is an automated email from the ASF dual-hosted git repository.

wkcn pushed a commit to branch v1.5.x
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/v1.5.x by this push:
 new 804403e  Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND` (#15751) (#15792)
804403e is described below

commit 804403e999d1567f371c5243f5565127ad7f2f93
Author: JackieWu 
AuthorDate: Thu Aug 8 13:55:35 2019 +0800

Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND` (#15751) (#15792)

* fix push sync nd api

* align code

* update test for syncnd

* fix bug in tests/cpp/engine/threaded_engine_test

* add more testcases for MXEnginePushSyncND and MXEnginePushAsyncND

* fix test

* fix

* fix

* lint

* ci

* retrigger CI
---
 include/mxnet/c_api.h|  22 +++---
 src/c_api/c_api.cc   |  40 +--
 tests/cpp/engine/threaded_engine_test.cc | 117 +++
 3 files changed, 105 insertions(+), 74 deletions(-)

diff --git a/include/mxnet/c_api.h b/include/mxnet/c_api.h
index a2da6db..c73b366 100644
--- a/include/mxnet/c_api.h
+++ b/include/mxnet/c_api.h
@@ -2863,12 +2863,12 @@ MXNET_DLL int MXEnginePushSync(EngineSyncFunc sync_func, void* func_param,
  * \param wait Whether this is a WaitForVar operation.
  */
 MXNET_DLL int MXEnginePushAsyncND(EngineAsyncFunc async_func, void* func_param,
-                                EngineFuncParamDeleter deleter, ContextHandle ctx_handle,
-                                NDArrayHandle const_nds_handle, int num_const_nds,
-                                NDArrayHandle mutable_nds_handle, int num_mutable_nds,
-                                EngineFnPropertyHandle prop_handle DEFAULT(NULL),
-                                int priority DEFAULT(0), const char* opr_name DEFAULT(NULL),
-                                bool wait DEFAULT(false));
+                                  EngineFuncParamDeleter deleter, ContextHandle ctx_handle,
+                                  NDArrayHandle* const_nds_handle, int num_const_nds,
+                                  NDArrayHandle* mutable_nds_handle, int num_mutable_nds,
+                                  EngineFnPropertyHandle prop_handle DEFAULT(NULL),
+                                  int priority DEFAULT(0), const char* opr_name DEFAULT(NULL),
+                                  bool wait DEFAULT(false));
 
 /*!
   * \brief Push a synchronous operation to the engine.
@@ -2886,11 +2886,11 @@ MXNET_DLL int MXEnginePushAsyncND(EngineAsyncFunc async_func, void* func_param,
  * \param opr_name The operation name.
  */
 MXNET_DLL int MXEnginePushSyncND(EngineSyncFunc sync_func, void* func_param,
-                                   EngineFuncParamDeleter deleter, ContextHandle ctx_handle,
-                                   NDArrayHandle const_nds_handle, int num_const_nds,
-                                   NDArrayHandle mutable_nds_handle, int num_mutable_nds,
-                                   EngineFnPropertyHandle prop_handle DEFAULT(NULL),
-                                   int priority DEFAULT(0), const char* opr_name DEFAULT(NULL));
+                                 EngineFuncParamDeleter deleter, ContextHandle ctx_handle,
+                                 NDArrayHandle* const_nds_handle, int num_const_nds,
+                                 NDArrayHandle* mutable_nds_handle, int num_mutable_nds,
+                                 EngineFnPropertyHandle prop_handle DEFAULT(NULL),
+                                 int priority DEFAULT(0), const char* opr_name DEFAULT(NULL));
 
 #ifdef __cplusplus
 }
diff --git a/src/c_api/c_api.cc b/src/c_api/c_api.cc
index 35bd3ee..6ba46bd 100644
--- a/src/c_api/c_api.cc
+++ b/src/c_api/c_api.cc
@@ -1535,18 +1535,18 @@ int MXEnginePushSync(EngineSyncFunc sync_func, void* func_param,
 }
 
 int MXEnginePushAsyncND(EngineAsyncFunc async_func, void* func_param,
-  EngineFuncParamDeleter deleter, ContextHandle ctx_handle,
-  NDArrayHandle const_nds_handle, int num_const_nds,
-  NDArrayHandle mutable_nds_handle, int num_mutable_nds,
-  EngineFnPropertyHandle prop_handle, int priority,
-  const char* opr_name, bool wait) {
-  API_BEGIN();
-  NDArray* const_nds = static_cast<NDArray*>(const_nds_handle);
-  NDArray* mutable_nds = static_cast<NDArray*>(mutable_nds_handle);
+                        EngineFuncParamDeleter deleter, ContextHandle ctx_handle,
+                        NDArrayHandle* const_nds_handle, int num_const_nds,
+                        NDArrayHandle* mutable_nds_handle, int num_mutable_nds,
+                        EngineFnPropertyHandle prop_handle, int priority,
+                        const char* opr_name, bool wait) {
+  API_BEGIN();
+  NDArray** const_nds = 

[GitHub] [incubator-mxnet] wkcn merged pull request #15792: [Backport][v1.5.x] Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND`

2019-08-07 Thread GitBox
wkcn merged pull request #15792: [Backport][v1.5.x] Fix the bug of 
`MXEnginePushAsyncND` and `MXEnginePushSyncND`
URL: https://github.com/apache/incubator-mxnet/pull/15792
 
 
   




[GitHub] [incubator-mxnet] ChaiBapchya opened a new pull request #15794: Add power, exponent, log ops large tensor support

2019-08-07 Thread GitBox
ChaiBapchya opened a new pull request #15794: Add power, exponent, log ops 
large tensor support
URL: https://github.com/apache/incubator-mxnet/pull/15794
 
 
   ## Description ##
   Added large tensor support to the following ops (see the sketch below)
   Exponent & Log - exp, expm1, log, log2, log10, log1p
   Power - sqrt, rsqrt, cbrt, rcbrt, square, reciprocal
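   A hedged sketch of what one such nightly test could look like, modeled on the `tests/nightly/test_large_vector.py` pattern quoted later in this digest; the `LARGE_X` value, the test name, and the asserts are illustrative assumptions, not code from this PR:
   ```python
   # Hypothetical sketch only -- not the code added by this PR.
   # Assumes a build with INT64_TENSOR_SIZE enabled and enough memory to hold
   # LARGE_X float32 elements.
   import numpy as np
   from mxnet import nd

   LARGE_X = 5000000000  # assumed length beyond the int32 index range


   def test_exp_large_vector():
       a = nd.ones(LARGE_X)
       res = nd.exp(a)
       assert res.shape == (LARGE_X,)
       # reading the last element forces evaluation of the whole result
       assert np.isclose(res[LARGE_X - 1].asscalar(), np.exp(1.0))
   ```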
   
   ## Checklist ##
   ### Essentials ###
   Please feel free to remove inapplicable items for your PR.
   - [ ] The PR title starts with [MXNET-$JIRA_ID], where $JIRA_ID refers to 
the relevant [JIRA issue](https://issues.apache.org/jira/projects/MXNET/issues) 
created (except PRs with tiny changes)
   - [ ] Changes are complete (i.e. I finished coding on this PR)
   - [ ] All changes have test coverage:
   - [ ] Code is well-documented: 
   - [ ] To my best knowledge, examples are either not affected by this 
change, or have been fixed to be compatible with this change
   




[GitHub] [incubator-mxnet] apeforest commented on a change in pull request #15593: Large Index Support for Slice

2019-08-07 Thread GitBox
apeforest commented on a change in pull request #15593: Large Index Support for 
Slice
URL: https://github.com/apache/incubator-mxnet/pull/15593#discussion_r311864007
 
 

 ##
 File path: tests/nightly/test_large_vector.py
 ##
 @@ -0,0 +1,37 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import numpy as np
+import mxnet as mx
+from mxnet.test_utils import rand_ndarray, assert_almost_equal, rand_coord_2d
+from mxnet import gluon, nd
+from tests.python.unittest.common import with_seed
+
+# dimension constants
+LARGE_X = 50
+MEDIUM_X = 10
+
+
+def test_slice():
+a = nd.ones(LARGE_X)
+res = nd.slice(a, begin=(LARGE_X-MEDIUM_X), end=LARGE_X)
 
 Review comment:
   add spaces around the '-' operator (see the sketch below)
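   For illustration, a minimal sketch of the quoted test with the suggested spacing applied; the explicit one-element tuples for `begin`/`end` and the final assert are my own assumptions, not part of the PR:
   ```python
   # Hypothetical: the quoted test_slice with spaces around the '-' operator.
   from mxnet import nd

   LARGE_X = 50
   MEDIUM_X = 10


   def test_slice():
       a = nd.ones(LARGE_X)
       res = nd.slice(a, begin=(LARGE_X - MEDIUM_X,), end=(LARGE_X,))
       assert res.shape[0] == MEDIUM_X
   ```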




[GitHub] [incubator-mxnet] apeforest commented on a change in pull request #15593: Large Index Support for Slice

2019-08-07 Thread GitBox
apeforest commented on a change in pull request #15593: Large Index Support for 
Slice
URL: https://github.com/apache/incubator-mxnet/pull/15593#discussion_r311863890
 
 

 ##
 File path: src/c_api/c_api_symbolic.cc
 ##
 @@ -528,7 +528,7 @@ int MXSymbolInferShape(SymbolHandle sym,
const mx_uint ***aux_shape_data,
int *complete) {
   nnvm::Symbol *s = static_cast<nnvm::Symbol*>(sym);
-  MXAPIThreadLocalEntry *ret = MXAPIThreadLocalStore::Get();
+  MXAPIThreadLocalEntry<> *ret = MXAPIThreadLocalStore<>::Get();
   API_BEGIN();
   nnvm::Graph g = Symbol2Graph(*s);
 
 Review comment:
   Can we replace this body with `SymbolInferShapeImpl` as well?




[GitHub] [incubator-mxnet] apeforest commented on a change in pull request #15593: Large Index Support for Slice

2019-08-07 Thread GitBox
apeforest commented on a change in pull request #15593: Large Index Support for 
Slice
URL: https://github.com/apache/incubator-mxnet/pull/15593#discussion_r311863514
 
 

 ##
 File path: python/mxnet/symbol/symbol.py
 ##
 @@ -52,6 +52,7 @@
"ones", "full", "arange", "linspace", "histogram", "split_v2"]
 
 
+
 
 Review comment:
   remove blank line




[GitHub] [incubator-mxnet] apeforest commented on a change in pull request #15593: Large Index Support for Slice

2019-08-07 Thread GitBox
apeforest commented on a change in pull request #15593: Large Index Support for 
Slice
URL: https://github.com/apache/incubator-mxnet/pull/15593#discussion_r311863319
 
 

 ##
 File path: include/mxnet/c_api.h
 ##
 @@ -1582,6 +1660,22 @@ MXNET_DLL int MXSymbolInferShape(SymbolHandle sym,
  const mx_uint ***aux_shape_data,
  int *complete);
 
+MXNET_DLL int MXSymbolInferShape64(SymbolHandle sym,
 
 Review comment:
   Is there an implementation of this function? If not, maybe we don't need to declare it.




[GitHub] [incubator-mxnet] apeforest commented on a change in pull request #15593: Large Index Support for Slice

2019-08-07 Thread GitBox
apeforest commented on a change in pull request #15593: Large Index Support for 
Slice
URL: https://github.com/apache/incubator-mxnet/pull/15593#discussion_r311862760
 
 

 ##
 File path: src/c_api/c_api_symbolic.cc
 ##
 @@ -585,6 +585,78 @@ int MXSymbolInferShape(SymbolHandle sym,
   API_END();
 }
 
+template
+inline void SymbolInferShapeImpl(const char** keys,
+ mx_uint num_args,
+ const dtype* arg_shape_data,
+ const itype* arg_ind_ptr,
+ const int** in_shape_ndim,
+ const dtype*** in_shape_data,
+ const int** out_shape_ndim,
+ const dtype*** out_shape_data,
+ const int** aux_shape_ndim,
+ const dtype*** aux_shape_data,
+ nnvm::Symbol* s,
+ MXAPIThreadLocalEntry* ret,
+ stype* in_shape_size,
+ stype* out_shape_size,
+ stype* aux_shape_size,
+ int* complete) {
+nnvm::Graph g = Symbol2Graph(*s);
 
 Review comment:
   indent?




[GitHub] [incubator-mxnet] apeforest commented on a change in pull request #15593: Large Index Support for Slice

2019-08-07 Thread GitBox
apeforest commented on a change in pull request #15593: Large Index Support for 
Slice
URL: https://github.com/apache/incubator-mxnet/pull/15593#discussion_r311862462
 
 

 ##
 File path: src/c_api/c_api_symbolic.cc
 ##
 @@ -585,6 +585,78 @@ int MXSymbolInferShape(SymbolHandle sym,
   API_END();
 }
 
+template
+inline void SymbolInferShapeImpl(const char** keys,
+ mx_uint num_args,
+ const dtype* arg_shape_data,
+ const itype* arg_ind_ptr,
+ const int** in_shape_ndim,
+ const dtype*** in_shape_data,
+ const int** out_shape_ndim,
+ const dtype*** out_shape_data,
+ const int** aux_shape_ndim,
+ const dtype*** aux_shape_data,
+ nnvm::Symbol* s,
+ MXAPIThreadLocalEntry* ret,
+ stype* in_shape_size,
+ stype* out_shape_size,
+ stype* aux_shape_size,
+ int* complete) {
+nnvm::Graph g = Symbol2Graph(*s);
+mxnet::ShapeVector arg_shapes(g.indexed_graph().input_nodes().size(), 
mxnet::TShape());
+if (keys == nullptr && num_args != 0) {
+  std::vector < uint32_t > read_only_args = 
mxnet::ReadOnlyArgIndices(g.indexed_graph());
+  CHECK_LE(num_args, read_only_args.size());
+  for (mx_uint i = 0; i < num_args; ++i) {
+arg_shapes[read_only_args[i]] = mxnet::ShapeTypeCast(arg_shape_data + 
arg_ind_ptr[i],
+ arg_shape_data + 
arg_ind_ptr[i + 1]);
+  }
+} else {
+  std::unordered_map kwargs;
+  for (mx_uint i = 0; i < num_args; ++i) {
+kwargs[keys[i]] = mxnet::ShapeTypeCast(arg_shape_data + arg_ind_ptr[i],
+   arg_shape_data + arg_ind_ptr[i + 
1]);
+  }
+  mxnet::MatchArguments(g.indexed_graph(), kwargs, _shapes, "InferShape");
+}
+try {
+  g = mxnet::exec::InferShape(std::move(g), std::move(arg_shapes), 
"__shape__");
+} catch (const mxnet::op::InferShapeError& err) {
+  throw dmlc::Error(err.msg);
+}
+// if use legacy shape definition, need to convert numpy shape to legacy shape
+mxnet::ShapeVector shapes = g.GetAttr("shape");
+if (!Imperative::Get()->is_np_shape()) {
+  common::ConvertToLegacyShape();
+}
+// copy back
+CopyAttr(g.indexed_graph(), shapes, &(ret->arg_shapes), &(ret->out_shapes), 
&(ret->aux_shapes));
+// copy data back
+MXAPIThreadLocalEntry::SetupShapeArrayReturnWithBufferEx(ret->arg_shapes,
+
&(ret->arg_shape_ndim_ex),
+
&(ret->arg_shape_data_ex),
+
&(ret->arg_shape_buffer_ex));
+MXAPIThreadLocalEntry::SetupShapeArrayReturnWithBufferEx(ret->out_shapes,
+
&(ret->out_shape_ndim_ex),
+
&(ret->out_shape_data_ex),
+
&(ret->out_shape_buffer_ex));
+MXAPIThreadLocalEntry::SetupShapeArrayReturnWithBufferEx(ret->aux_shapes,
+
&(ret->aux_shape_ndim_ex),
+
&(ret->aux_shape_data_ex),
+
&(ret->aux_shape_buffer_ex));
+  *in_shape_size = static_cast(ret->arg_shapes.size());
+  *in_shape_ndim = dmlc::BeginPtr(ret->arg_shape_ndim_ex);
+  *in_shape_data = dmlc::BeginPtr(ret->arg_shape_data_ex);
+  *out_shape_size = static_cast(ret->out_shapes.size());
+  *out_shape_ndim = dmlc::BeginPtr(ret->out_shape_ndim_ex);
+  *out_shape_data = dmlc::BeginPtr(ret->out_shape_data_ex);
+  *aux_shape_size = static_cast(ret->aux_shapes.size());
+  *aux_shape_ndim = dmlc::BeginPtr(ret->aux_shape_ndim_ex);
+  *aux_shape_data = dmlc::BeginPtr(ret->aux_shape_data_ex);
+// mark complete
+*complete = (g.GetAttr("shape_num_unknown_nodes") == 0);
 
 Review comment:
   indent




[GitHub] [incubator-mxnet] apeforest commented on a change in pull request #15593: Large Index Support for Slice

2019-08-07 Thread GitBox
apeforest commented on a change in pull request #15593: Large Index Support for 
Slice
URL: https://github.com/apache/incubator-mxnet/pull/15593#discussion_r311859584
 
 

 ##
 File path: src/c_api/c_api_common.h
 ##
 @@ -57,6 +57,7 @@
 using namespace mxnet;
 
 /*! \brief entry to to easily hold returning information */
+template
 
 Review comment:
   




[GitHub] [incubator-mxnet] KellenSunderland commented on issue #15613: [Discussion] 1.5.1 Patch Release

2019-08-07 Thread GitBox
KellenSunderland commented on issue #15613: [Discussion] 1.5.1 Patch Release
URL: 
https://github.com/apache/incubator-mxnet/issues/15613#issuecomment-519368787
 
 
   What's the issue with MShadow?  Did they have to do a force push / history 
re-write again?  We may indeed have to backport that to other release branches.




[GitHub] [incubator-mxnet] apeforest commented on a change in pull request #15593: Large Index Support for Slice

2019-08-07 Thread GitBox
apeforest commented on a change in pull request #15593: Large Index Support for 
Slice
URL: https://github.com/apache/incubator-mxnet/pull/15593#discussion_r311859462
 
 

 ##
 File path: src/c_api/c_api.cc
 ##
 @@ -537,20 +557,18 @@ int MXNDArrayGetShape(NDArrayHandle handle,
   API_END();
 }
 
-int MXNDArrayGetShapeEx(NDArrayHandle handle,
-int *out_dim,
-const int **out_pdata) {
-  MXAPIThreadLocalEntry *ret = MXAPIThreadLocalStore::Get();
-  API_BEGIN();
-  NDArray *arr = static_cast<NDArray*>(handle);
+template
+inline void GetShape(NDArrayHandle handle, const dtype** out_pdata, int* out_dim,
 
 Review comment:
   Naming should be consistent with `CreateNDArrayImpl`. If @larroy feels 
`Impl` is not the best suffix, we can consider `CreateNDArrayHelper` and 
`GetShapeHelper`. But all these internal template names should have consistent 
styles since we may have many of them.




[GitHub] [incubator-mxnet] apeforest commented on a change in pull request #15593: Large Index Support for Slice

2019-08-07 Thread GitBox
apeforest commented on a change in pull request #15593: Large Index Support for 
Slice
URL: https://github.com/apache/incubator-mxnet/pull/15593#discussion_r311859169
 
 

 ##
 File path: src/c_api/c_api.cc
 ##
 @@ -537,20 +557,18 @@ int MXNDArrayGetShape(NDArrayHandle handle,
   API_END();
 }
 
-int MXNDArrayGetShapeEx(NDArrayHandle handle,
-int *out_dim,
-const int **out_pdata) {
-  MXAPIThreadLocalEntry *ret = MXAPIThreadLocalStore::Get();
-  API_BEGIN();
-  NDArray *arr = static_cast<NDArray*>(handle);
+template
+inline void GetShape(NDArrayHandle handle, const dtype** out_pdata, int* out_dim,
+  MXAPIThreadLocalEntry* ret) {
 
 Review comment:
   nit: indent




[GitHub] [incubator-mxnet] apeforest commented on a change in pull request #15593: Large Index Support for Slice

2019-08-07 Thread GitBox
apeforest commented on a change in pull request #15593: Large Index Support for 
Slice
URL: https://github.com/apache/incubator-mxnet/pull/15593#discussion_r311858351
 
 

 ##
 File path: src/c_api/c_api.cc
 ##
 @@ -174,48 +174,68 @@ int MXNDArrayCreateNone(NDArrayHandle *out) {
   API_END();
 }
 
+template
+void CreateNDArrayImpl(const DataType* shape,
+   dimtype ndim,
+   int dev_type,
+   int dev_id,
+   int delay_alloc,
+   int dtype,
+   NDArrayHandle* out) {
+  *out = new NDArray(mxnet::TShape(shape, shape + ndim),
+                     Context::Create(static_cast<Context::DeviceType>(dev_type), dev_id),
+                     delay_alloc != 0, dtype);
+}
+
 int MXNDArrayCreate(const mx_uint *shape,
 mx_uint ndim,
 int dev_type,
 int dev_id,
 int delay_alloc,
 NDArrayHandle *out) {
   API_BEGIN();
-  *out = new NDArray(
-  mxnet::TShape(shape, shape + ndim),
-  Context::Create(static_cast<Context::DeviceType>(dev_type), dev_id),
-  delay_alloc != 0);
+  *out = new NDArray(mxnet::TShape(shape, shape + ndim),
 
 Review comment:
   @access2rohit Please also use CreateNDArrayImpl as @larroy suggested. 




[GitHub] [incubator-mxnet] vdantu commented on issue #15787: RuntimeError: Cannot find the MXNet library.

2019-08-07 Thread GitBox
vdantu commented on issue #15787: RuntimeError: Cannot find the MXNet library.
URL: 
https://github.com/apache/incubator-mxnet/issues/15787#issuecomment-519366492
 
 
   @mxnet-label-bot add [question]
   As @larroy suggested, you would have to build the project first before you 
install the Python package. Or you could use the published PyPI packages, with 
"pip install". 




[GitHub] [incubator-mxnet] vdantu commented on issue #15788: mxnet convert to onnx: No conversion function registered for op type null yet.

2019-08-07 Thread GitBox
vdantu commented on issue #15788: mxnet convert to onnx: No conversion function 
registered for op type null yet.
URL: 
https://github.com/apache/incubator-mxnet/issues/15788#issuecomment-519366126
 
 
   @mxnet-label-bot add [onnx, question]
   @vandanavk would you be able to answer this?
   
   @lookup1980 Does your onnx_mxnet come from mx.contrib.onnx_mxnet?
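   For reference, a minimal sketch of that import path, assuming a local file named 'model.onnx' (placeholder):
   ```python
   # Minimal sketch of the mx.contrib.onnx import path being asked about.
   # 'model.onnx' is a placeholder file name.
   from mxnet.contrib import onnx as onnx_mxnet

   sym, arg_params, aux_params = onnx_mxnet.import_model('model.onnx')
   ```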




[GitHub] [incubator-mxnet] vdantu commented on issue #15784: Simple_Bind failure in 1.5.0

2019-08-07 Thread GitBox
vdantu commented on issue #15784: Simple_Bind failure in 1.5.0
URL: 
https://github.com/apache/incubator-mxnet/issues/15784#issuecomment-519365567
 
 
   @samskalicky : Should this be closed? I see that the PR responsible for the 
fix is being tracked in the Patch Release discussion. 




[GitHub] [incubator-mxnet] vdantu commented on issue #15793: onnx2mx error.

2019-08-07 Thread GitBox
vdantu commented on issue #15793: onnx2mx error.
URL: 
https://github.com/apache/incubator-mxnet/issues/15793#issuecomment-519364998
 
 
   @mxnet-label-bot add [onnx, question]
   @vandanavk would you be able to help here? 




[GitHub] [incubator-mxnet] vdantu commented on issue #15791: fail to install mxnet r package within r 3.5.3 under ubuntu 16.04.6

2019-08-07 Thread GitBox
vdantu commented on issue #15791: fail to install mxnet r package within r 
3.5.3 under ubuntu 16.04.6
URL: 
https://github.com/apache/incubator-mxnet/issues/15791#issuecomment-519364841
 
 
   @mxnet-label-bot add [R, build, question]
   @anirudhacharya Would you be able to answer this question?




[GitHub] [incubator-mxnet] TaoLv commented on issue #15613: [Discussion] 1.5.1 Patch Release

2019-08-07 Thread GitBox
TaoLv commented on issue #15613: [Discussion] 1.5.1 Patch Release
URL: 
https://github.com/apache/incubator-mxnet/issues/15613#issuecomment-519364786
 
 
   > > > we also need to update mshadow on 1.5.x branch #15600
   > > 
   > > 
   > > @szha is this needed?
   > 
   > @TaoLv without this when you checkout the v1.5.x branch the 
3rdparty/mshadow directory is empty and mxnet fails to compile. This is 
definitely needed.
   
   Then you need to backport this change to all of the release branches. Can this be mitigated by `git submodule sync` and `git submodule update --recursive --init`?
   I'm afraid that even if this change is picked to v1.5.x, users will get git complaints when they try to pull the latest code if they are on an old commit.




[GitHub] [incubator-mxnet] samskalicky commented on issue #15613: [Discussion] 1.5.1 Patch Release

2019-08-07 Thread GitBox
samskalicky commented on issue #15613: [Discussion] 1.5.1 Patch Release
URL: 
https://github.com/apache/incubator-mxnet/issues/15613#issuecomment-519360804
 
 
   > > we also need to update mshadow on 1.5.x branch #15600
   > 
   > @szha is this needed?
   
   @TaoLv without this when you checkout the v1.5.x branch the 3rdparty/mshadow 
directory is empty and mxnet fails to compile. This is definitely needed.




[GitHub] [incubator-mxnet] zerofo opened a new issue #15793: onnx2mx error.

2019-08-07 Thread GitBox
zerofo opened a new issue #15793: onnx2mx error.
URL: https://github.com/apache/incubator-mxnet/issues/15793
 
 
   Docker image: NGC mxnet 19.07-py3 (Ubuntu 18.04)
   I used keras2onnx (master version, with onnxconverter-common upgraded) on this repo: 
https://github.com/bedapudi6788/NudeNet-models
   It works fine on onnxruntime-gpu, but the model is 'channel_last'.
   I use mx.contrib.onnx.import_model (error in module.bind) and 
mx.contrib.onnx.import_to_gluon (error in prediction).
   
   NHWC
   ```
   input_1: (1, 256, 256, 3)
   Error in operator Add12: [03:36:06] 
src/operator/tensor/./elemwise_binary_broadcast_op.h:68: Check failed: l == 1 
|| r == 1: operands could not be broadcast together with shapes [1,128,62,62] 
[1,128,63,63]
   Stack trace:
 [bt] (0) 
/usr/local/lib/libmxnet.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x43) 
[0x7f87b5fb4783]
 [bt] (1) 
/usr/local/lib/libmxnet.so(mxnet::op::BinaryBroadcastShape(nnvm::NodeAttrs 
const&, std::vector >*, 
std::vector >*)+0xbf1) [0x7f87b60ba6d1]
 [bt] (2) /usr/local/lib/libmxnet.so(+0x2f6e6e0) [0x7f87b81746e0]
 [bt] (3) /usr/local/lib/libmxnet.so(mxnet::exec::InferShape(nnvm::Graph&&, 
std::vector >&&, 
std::__cxx11::basic_string, std::alloc
   ator > const&)+0x1476) [0x7f87b8177de6]
 [bt] (4) 
/usr/local/lib/libmxnet.so(mxnet::exec::GraphExecutor::Init(nnvm::Symbol, 
mxnet::Context const&, std::map, std::allocator >, mxnet::C
   ontext, std::less, 
std::allocator > >, 
std::allocator, std::allocator >
   const, mxnet::Context> > > const&, std::vector > const&, std::vector > const&, std::vector > const&, 
std::unordered_map, 
std::allocator >, mxnet::TShape, 
std::hash, std::allocator > >, 
std::equal_to, 
std::allocator > >, 
std::allocator, std::allocator > const, mxnet::TShape> > > const&, 
std::unordered_map, 
std::allocator >, int, std::hash, std::allocator > >, 
std::equal_to, 
std::allocator > >, std::allocator, std::allocator > const, int> > > 
const&, std::unordered_map, std::allocator >, int, std::hash, std::allocator > >, 
std::equal_to, 
std::allocator > >, std::allocator, std::allocator > const, 
int> > > const&, std::vector 
> const&, std::unordered_set, std::allocator >, 
std::hash, 
std::allocator > >, std::equal_to, std::allocator > >, 
std::allocator, 
std::allocator > > > const&, std::vector >*, std::vector >*, 
std::vector >*, 
std::unordered_map, std::allocator >, mxnet::NDArray, 
std::hash, 
std::allocator > >, std::equal_to,
std::allocator > >, 
std::allocator, std::allocator > const, mxnet::NDArray> > >*, 
mxnet::Executor*, std::unordered_map > > 
const&)+0x3c0) [0x7f87b819d320]
 [bt] (5) 
/usr/local/lib/libmxnet.so(mxnet::Executor::SimpleBind(nnvm::Symbol, 
mxnet::Context const&, std::map, std::allocator >, mxnet::Contex
   t, std::less, 
std::allocator > >, 
std::allocator, std::allocator > const
   , mxnet::Context> > > const&, std::vector > const&, std::vector > const&, std::vector > const&, 
std::unordered_map, 
std::allocator >, mxnet::TShape, 
std::hash, std::allocator > >, 
std::equal_to, 
std::allocator > >, 
std::allocator, std::allocator > const, mxnet::TShape> > > const&, 
std::unordered_map, 
std::allocator >, int, std::hash, std::allocator > >, 
std::equal_to, 
std::allocator > >, std::allocator, std::allocator > const, int> > > const&, 
std::unordered_map, 
std::allocator >, int, std::hash, std::allocator > >, 
std::equal_to, 
std::allocator > >, std::allocator, std::allocator > const, int> > 
> const&, std::vector > 
const&, std::unordered_set, std::allocator >, 
std::hash, 
std::allocator > >, std::equal_to, std::allocator > >, 
std::allocator, 
std::allocator > > > const&, std::vector >*
   , std::vector >*, 
std::vector >*, 
std::unordered_map, st
   d::allocator >, mxnet::NDArray, 
std::hash, 
std::allocator > >, std::equal_to, std:
   :allocator > >, 
std::allocator, std::allocator > const, mxnet::NDArray> > >*, 
mxnet::Executor*)+0x4aa) [0x7f87b819df2a]
 [bt] (6) /usr/local/lib/libmxnet.so(MXExecutorSimpleBindEx+0x1c83) 
[0x7f87b8a59fe3]
 [bt] (7) /usr/lib/x86_64-linux-gnu/libffi.so.6(ffi_call_unix64+0x4c) 
[0x7f88490dedae]
 [bt] (8) /usr/lib/x86_64-linux-gnu/libffi.so.6(ffi_call+0x22f) 
[0x7f88490de71f]
   ```
   NCHW
   ```
   Traceback (most recent call last):   

 
 File "/opt/mxnet/python/mxnet/symbol/symbol.py", line 1623, in simple_bind 

 
   ctypes.byref(exe_handle)))   

 
 File "/opt/mxnet/python/mxnet/base.py", line 252, in check_call
   

[GitHub] [incubator-mxnet] wkcn commented on issue #15775: [Backport][v1.5.x] Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND`

2019-08-07 Thread GitBox
wkcn commented on issue #15775: [Backport][v1.5.x] Fix the bug of 
`MXEnginePushAsyncND` and `MXEnginePushSyncND`
URL: https://github.com/apache/incubator-mxnet/pull/15775#issuecomment-519347358
 
 
   Hi @TaoLv, I have created a new PR #15792 by cherry-picking.




[GitHub] [incubator-mxnet] wkcn opened a new pull request #15792: [Backport][v1.5.x] Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND`

2019-08-07 Thread GitBox
wkcn opened a new pull request #15792: [Backport][v1.5.x] Fix the bug of 
`MXEnginePushAsyncND` and `MXEnginePushSyncND`
URL: https://github.com/apache/incubator-mxnet/pull/15792
 
 
   ## Description ##
   Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND` in the v1.5.x release.
   
   Issue: #15774 
   master branch: #15751 
   
   ## Checklist ##
   ### Essentials ###
   Please feel free to remove inapplicable items for your PR.
   - [x] The PR title starts with [MXNET-$JIRA_ID], where $JIRA_ID refers to 
the relevant [JIRA issue](https://issues.apache.org/jira/projects/MXNET/issues) 
created (except PRs with tiny changes)
   - [x] Changes are complete (i.e. I finished coding on this PR)
   - [ ] All changes have test coverage:
   - Unit tests are added for small changes to verify correctness (e.g. adding 
a new operator)
   - Nightly tests are added for complicated/long-running ones (e.g. changing 
distributed kvstore)
   - Build tests will be added for build configuration changes (e.g. adding a 
new build option with NCCL)
   - [x] Code is well-documented: 
   - For user-facing API changes, API doc string has been updated. 
   - For new C++ functions in header files, their functionalities and arguments 
are documented. 
   - For new examples, README.md is added to explain what the example does, 
the source of the dataset, expected performance on test set and reference to 
the original paper if applicable
   - Check the API doc at 
http://mxnet-ci-doc.s3-accelerate.dualstack.amazonaws.com/PR-$PR_ID/$BUILD_ID/index.html
   - [x] To my best knowledge, examples are either not affected by this 
change, or have been fixed to be compatible with this change
   
   ### Changes ###
   - [x] Modify the type of arguments to 'NDArray**'
   - [x] Update the unittest
   
   ## Comments ##
   - If this change is a backward incompatible change, why must this change be 
made.
   - Interesting edge cases to note here
   




[incubator-mxnet] branch master updated: Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND` (#15751)

2019-08-07 Thread wkcn
This is an automated email from the ASF dual-hosted git repository.

wkcn pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 79d8d86  Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND` (#15751)
79d8d86 is described below

commit 79d8d8656691c19502b7b71bf8c7d9001cdc3a4a
Author: JackieWu 
AuthorDate: Thu Aug 8 11:15:48 2019 +0800

Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND` (#15751)

* fix push sync nd api

* align code

* update test for syncnd

* fix bug in tests/cpp/engine/threaded_engine_test

* add more testcases for MXEnginePushSyncND and MXEnginePushAsyncND

* fix test

* fix

* fix

* lint

* ci

* retrigger CI
---
 include/mxnet/c_api.h|  22 +++---
 src/c_api/c_api.cc   |  40 +--
 tests/cpp/engine/threaded_engine_test.cc | 117 +++
 3 files changed, 105 insertions(+), 74 deletions(-)

diff --git a/include/mxnet/c_api.h b/include/mxnet/c_api.h
index 9d647c3..20b2aa2 100644
--- a/include/mxnet/c_api.h
+++ b/include/mxnet/c_api.h
@@ -2940,12 +2940,12 @@ MXNET_DLL int MXShallowCopySymbol(SymbolHandle src, SymbolHandle * out);
  * \param wait Whether this is a WaitForVar operation.
  */
 MXNET_DLL int MXEnginePushAsyncND(EngineAsyncFunc async_func, void* func_param,
-                                EngineFuncParamDeleter deleter, ContextHandle ctx_handle,
-                                NDArrayHandle const_nds_handle, int num_const_nds,
-                                NDArrayHandle mutable_nds_handle, int num_mutable_nds,
-                                EngineFnPropertyHandle prop_handle DEFAULT(NULL),
-                                int priority DEFAULT(0), const char* opr_name DEFAULT(NULL),
-                                bool wait DEFAULT(false));
+                                  EngineFuncParamDeleter deleter, ContextHandle ctx_handle,
+                                  NDArrayHandle* const_nds_handle, int num_const_nds,
+                                  NDArrayHandle* mutable_nds_handle, int num_mutable_nds,
+                                  EngineFnPropertyHandle prop_handle DEFAULT(NULL),
+                                  int priority DEFAULT(0), const char* opr_name DEFAULT(NULL),
+                                  bool wait DEFAULT(false));
 
 /*!
   * \brief Push a synchronous operation to the engine.
@@ -2963,11 +2963,11 @@ MXNET_DLL int MXEnginePushAsyncND(EngineAsyncFunc async_func, void* func_param,
  * \param opr_name The operation name.
  */
 MXNET_DLL int MXEnginePushSyncND(EngineSyncFunc sync_func, void* func_param,
-                                   EngineFuncParamDeleter deleter, ContextHandle ctx_handle,
-                                   NDArrayHandle const_nds_handle, int num_const_nds,
-                                   NDArrayHandle mutable_nds_handle, int num_mutable_nds,
-                                   EngineFnPropertyHandle prop_handle DEFAULT(NULL),
-                                   int priority DEFAULT(0), const char* opr_name DEFAULT(NULL));
+                                 EngineFuncParamDeleter deleter, ContextHandle ctx_handle,
+                                 NDArrayHandle* const_nds_handle, int num_const_nds,
+                                 NDArrayHandle* mutable_nds_handle, int num_mutable_nds,
+                                 EngineFnPropertyHandle prop_handle DEFAULT(NULL),
+                                 int priority DEFAULT(0), const char* opr_name DEFAULT(NULL));
 
 #ifdef __cplusplus
 }
diff --git a/src/c_api/c_api.cc b/src/c_api/c_api.cc
index dfb01dc..13f2219 100644
--- a/src/c_api/c_api.cc
+++ b/src/c_api/c_api.cc
@@ -1559,18 +1559,18 @@ int MXEnginePushSync(EngineSyncFunc sync_func, void* func_param,
 }
 
 int MXEnginePushAsyncND(EngineAsyncFunc async_func, void* func_param,
-  EngineFuncParamDeleter deleter, ContextHandle ctx_handle,
-  NDArrayHandle const_nds_handle, int num_const_nds,
-  NDArrayHandle mutable_nds_handle, int num_mutable_nds,
-  EngineFnPropertyHandle prop_handle, int priority,
-  const char* opr_name, bool wait) {
-  API_BEGIN();
-  NDArray* const_nds = static_cast<NDArray*>(const_nds_handle);
-  NDArray* mutable_nds = static_cast<NDArray*>(mutable_nds_handle);
+                        EngineFuncParamDeleter deleter, ContextHandle ctx_handle,
+                        NDArrayHandle* const_nds_handle, int num_const_nds,
+                        NDArrayHandle* mutable_nds_handle, int num_mutable_nds,
+                        EngineFnPropertyHandle prop_handle, int priority,
+                        const char* opr_name, bool wait) {
+  API_BEGIN();
+  NDArray** const_nds = 

[GitHub] [incubator-mxnet] wkcn merged pull request #15751: Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND`

2019-08-07 Thread GitBox
wkcn merged pull request #15751: Fix the bug of `MXEnginePushAsyncND` and 
`MXEnginePushSyncND`
URL: https://github.com/apache/incubator-mxnet/pull/15751
 
 
   




[GitHub] [incubator-mxnet] XiangyunHuang opened a new issue #15791: fail to install mxnet r package within r 3.5.3 under ubuntu 16.04.6

2019-08-07 Thread GitBox
XiangyunHuang opened a new issue #15791: fail to install mxnet r package within 
r 3.5.3 under ubuntu 16.04.6
URL: https://github.com/apache/incubator-mxnet/issues/15791
 
 
   
   ## Description
   
   1. download and compile mxnet
   
   ```bash
   git clone --recursive https://github.com/apache/incubator-mxnet mxnet \
 && cd mxnet \
 && echo "USE_OPENCV = 1" >> ./config.mk \
 && echo "USE_BLAS = openblas" >> ./config.mk \
 && echo "USE_JEMALLOC = 1" >> ./config.mk \
 && echo "USE_MKLDNN = 0" >> ./config.mk \
 && echo "USE_LAPACK = 1" >> ./config.mk \
 && echo "USE_CUDA = 0" >> ./config.mk \
 && echo "USE_OPENMP = 1" >> ./config.mk \
 && make -j 2
   ```
   
   There are no errors or warnings, so I think I have installed all the dependencies
   
   2. Then I try to build and install the R package; unfortunately, it fails
   
   ```
   make rpkg
   ```
   ```
   mkdir -p R-package/inst/libs
   cp src/io/image_recordio.h R-package/src
   cp -rf lib/libmxnet.so R-package/inst/libs
   if [ -e "lib/libmkldnn.so.0" ]; then \
   cp -rf lib/libmkldnn.so.0 R-package/inst/libs; \
   cp -rf lib/libiomp5.so R-package/inst/libs; \
   cp -rf lib/libmklml_intel.so R-package/inst/libs; \
   fi
   mkdir -p R-package/inst/include
   cp -rl include/* R-package/inst/include
   Rscript -e "if(!require(devtools)){install.packages('devtools', repo = 
'https://
   cloud.r-project.org/')}"
   Loading required package: devtools
   Loading required package: usethis
   Rscript -e "if(!require(roxygen2)||packageVersion('roxygen2') < 
'6.1.1'){install
   .packages('roxygen2', repo = 'https://cloud.r-project.org/')}"
   Loading required package: roxygen2
   Rscript -e "library(devtools); library(methods); 
options(repos=c(CRAN='https://c
   loud.r-project.org/')); install_deps(pkg='R-package', dependencies = TRUE)"
   Loading required package: usethis
   
   cp R-package/dummy.NAMESPACE R-package/NAMESPACE
   echo "import(Rcpp)" >> R-package/NAMESPACE
   R CMD INSTALL R-package
   * installing to library /usr/local/lib/R/site-library
   * installing *source* package mxnet ...
   ** libs
   make[1]: Entering directory '/home/mxnet/R-package/src'
   g++ -std=gnu++11 -I"/opt/R/R-3.5.3/lib/R/include" -DNDEBUG -I../inst/include 
-I"
   /usr/local/lib/R/site-library/Rcpp/include" -I/usr/local/include   -fpic  -g 
-O2
-c executor.cc -o executor.o
   In file included from executor.cc:28:0:
   ./base.h:31:23: fatal error: dmlc/base.h: No such file or directory
#include 
  ^
   compilation terminated.
   /opt/R/R-3.5.3/lib/R/etc/Makeconf:170: recipe for target 'executor.o' failed
   make[1]: *** [executor.o] Error 1
   make[1]: Leaving directory '/home/mxnet/R-package/src'
   ERROR: compilation failed for package mxnet
   * removing /usr/local/lib/R/site-library/mxnet
   Makefile:691: recipe for target 'rpkg' failed
   make: *** [rpkg] Error 1
   ```
   
   however, I can find `base.h`
   ```
   ls R-package/src
   ```
   ```
   base.h   export.h  io.cc   Makevars  ndarray.cc
   executor.cc  im2rec.cc io.hMakevars.win  ndarray.h
   executor.h   im2rec.h  kvstore.cc  mxnet.cc  symbol.cc
   export.ccimage_recordio.h  kvstore.h   name.hsymbol.h
   ```
   
   ## Environment info (Required)
   
   ```r
   sessionInfo()
   ```
   ```
   R version 3.5.3 (2019-03-11)
   Platform: x86_64-pc-linux-gnu (64-bit)
   Running under: Ubuntu 16.04.6 LTS
   
   Matrix products: default
   BLAS: /opt/R/R-3.5.3/lib/R/lib/libRblas.so
   LAPACK: /opt/R/R-3.5.3/lib/R/lib/libRlapack.so
   
   locale:
[1] LC_CTYPE=en_US.UTF-8   LC_NUMERIC=C
[3] LC_TIME=en_US.UTF-8LC_COLLATE=en_US.UTF-8
[5] LC_MONETARY=en_US.UTF-8LC_MESSAGES=en_US.UTF-8
[7] LC_PAPER=en_US.UTF-8   LC_NAME=C
[9] LC_ADDRESS=C   LC_TELEPHONE=C
   [11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C
   
   attached base packages:
   [1] stats graphics  grDevices utils datasets  methods   base
   
   loaded via a namespace (and not attached):
   [1] compiler_3.5.3
   ```
   




[GitHub] [incubator-mxnet] mxnet-label-bot commented on issue #15791: fail to install mxnet r package within r 3.5.3 under ubuntu 16.04.6

2019-08-07 Thread GitBox
mxnet-label-bot commented on issue #15791: fail to install mxnet r package 
within r 3.5.3 under ubuntu 16.04.6
URL: 
https://github.com/apache/incubator-mxnet/issues/15791#issuecomment-519345570
 
 
   Hey, this is the MXNet Label Bot. 
Thank you for submitting the issue! I will try and suggest some labels so 
that the appropriate MXNet community members can help resolve it. 
Here are my recommended labels: Installation




[GitHub] [incubator-mxnet] haojin2 commented on issue #15581: Numpy-compatible Infra

2019-08-07 Thread GitBox
haojin2 commented on issue #15581: Numpy-compatible Infra
URL: https://github.com/apache/incubator-mxnet/pull/15581#issuecomment-519342741
 
 
   CI passed with the recovery of `data.mxnet.io`, merging now.




[GitHub] [incubator-mxnet] haojin2 merged pull request #15581: Numpy-compatible Infra

2019-08-07 Thread GitBox
haojin2 merged pull request #15581: Numpy-compatible Infra
URL: https://github.com/apache/incubator-mxnet/pull/15581
 
 
   




[GitHub] [incubator-mxnet] larroy commented on issue #15787: RuntimeError: Cannot find the MXNet library.

2019-08-07 Thread GitBox
larroy commented on issue #15787: RuntimeError: Cannot find the MXNet library.
URL: 
https://github.com/apache/incubator-mxnet/issues/15787#issuecomment-519339493
 
 
   You have to compile the project first. Are you building from source? 
   Please follow docs here:
   
   
https://mxnet.incubator.apache.org/versions/master/install/build_from_source.html




[GitHub] [incubator-mxnet] larroy commented on issue #15790: Can't train CIFAR10 with debug flag using CMake / GPU / MKL

2019-08-07 Thread GitBox
larroy commented on issue #15790: Can't train CIFAR10 with debug flag using 
CMake / GPU / MKL
URL: 
https://github.com/apache/incubator-mxnet/issues/15790#issuecomment-519339292
 
 
   @mxnet-label-bot add [Bug, Build, CMake]




[GitHub] [incubator-mxnet] larroy opened a new issue #15790: Can't train CIFAR10 with debug flag using CMake / GPU / MKL

2019-08-07 Thread GitBox
larroy opened a new issue #15790: Can't train CIFAR10 with debug flag using 
CMake / GPU / MKL
URL: https://github.com/apache/incubator-mxnet/issues/15790
 
 
## Description
   
   train cifar_10 doesn't seem to make any progress when MXNet is built with 
CMake and debug flags.
   
   
   ## Environment info (Required)
   
   Console  is flooded with OMP warnings:
   
   ```
   OMP: Error #13: Assertion failure at kmp_runtime.cpp(6481).
   OMP: Hint: Please submit a bug report with this message, compile and run 
commands used, and machine configuration info including native compiler and 
operating system versions. Faster response will be obtained by including all 
program sources. For information on submitting this issue, please see 
https://bugs.llvm.org/.
   Assertion failure at kmp_runtime.cpp(6481): __kmp_team_pool == __null.
   OMP: Error #13: Assertion failure at kmp_runtime.cpp(6481).
   OMP: Hint: Please submit a bug report with this message, compile and run 
commands used, and machine configuration info including native compiler and 
operating system versions. Faster response will be obtained by including all 
program sources. For information on submitting this issue, please see 
https://bugs.llvm.org/.
   Assertion failure at kmp_runtime.cpp(6481): __kmp_team_pool == __null.
   OMP: Error #13: Assertion failure at kmp_runtime.cpp(6481).
   OMP: Hint: Please submit a bug report with this message, compile and run 
commands used, and machine configuration info including native compiler and 
operating system versions. Faster response will be obtained by including all 
program sources. For information on submitting this issue, please see 
https://bugs.llvm.org/.
   Assertion failure at kmp_runtime.cpp(6481): __kmp_team_pool == __null.
   OMP: Error #13: Assertion failure at kmp_runtime.cpp(6481).
   OMP: Hint: Please submit a bug report with this message, compile and run 
commands used, and machine configuration info including native compiler and 
operating system versions. Faster response will be obtained by including all 
program sources. For information on submitting this issue, please see 
https://bugs.llvm.org/.
   Version  : 1.6.0
   Directory: /home/piotr/mxnet_master_cmake_debug/python/mxnet
   Commit hash file 
"/home/piotr/mxnet_master_cmake_debug/python/mxnet/COMMIT_HASH" not found. Not 
installed from pre-built package or built from source.
   Library  : 
['/home/piotr/mxnet_master_cmake_debug/python/mxnet/../../build/libmxnet.so']
   Build features:
   ✔ CUDA
   ✔ CUDNN
   ✖ NCCL
   ✔ CUDA_RTC
   ✖ TENSORRT
   ✔ CPU_SSE
   ✔ CPU_SSE2
   ✔ CPU_SSE3
   ✔ CPU_SSE4_1
   ✔ CPU_SSE4_2
   ✖ CPU_SSE4A
   ✔ CPU_AVX
   ✖ CPU_AVX2
   ✔ OPENMP
   ✖ SSE
   ✔ F16C
   ✔ JEMALLOC
   ✔ BLAS_OPEN
   ✖ BLAS_ATLAS
   ✖ BLAS_MKL
   ✖ BLAS_APPLE
   ✔ LAPACK
   ✔ MKLDNN
   ✔ OPENCV
   ✖ CAFFE
   ✖ PROFILER
   ✖ DIST_KVSTORE
   ✖ CXX14
   ✖ INT64_TENSOR_SIZE
   ✔ SIGNAL_HANDLER
   ✔ DEBUG
   ✖ TVM_OP
   --System Info--
   Platform : Linux-4.15.0-1044-aws-x86_64-with-Ubuntu-18.04-bionic
   system   : Linux
   node : ip-172-31-21-194
   release  : 4.15.0-1044-aws
   version  : #46-Ubuntu SMP Thu Jul 4 13:38:28 UTC 2019
   --Hardware Info--
   machine  : x86_64
   processor: x86_64
   Assertion failure at kmp_runtime.cpp(6481): __kmp_team_pool == __null.
   OMP: Error #13: Assertion failure at kmp_runtime.cpp(6481).
   OMP: Hint: Please submit a bug report with this message, compile and run 
commands used, and machine configuration info including native compiler and 
operating system versions. Faster response will be obtained by including all 
program sources. For information on submitting this issue, please see 
https://bugs.llvm.org/.
   --Network Test--
   Setting timeout: 10
   Timing for MXNet: https://github.com/apache/incubator-mxnet, DNS: 0.0107 
sec, LOAD: 0.4519 sec.
   Timing for Gluon Tutorial(en): http://gluon.mxnet.io, DNS: 0.0877 sec, LOAD: 
0.0986 sec.
   Timing for Gluon Tutorial(cn): https://zh.gluon.ai, DNS: 0.2109 sec, LOAD: 
0.2410 sec.
   Timing for FashionMNIST: 
https://apache-mxnet.s3-accelerate.dualstack.amazonaws.com/gluon/dataset/fashion-mnist/train-labels-idx1-ubyte.gz,
 DNS: 0.0080 sec, LOAD: 0.0885 sec.
   Timing for PYPI: https://pypi.python.org/pypi/pip, DNS: 0.0036 sec, LOAD: 
0.3234 sec.
   Timing for Conda: https://repo.continuum.io/pkgs/free/, DNS: 0.0113 sec, 
LOAD: 0.0685 sec.
   
   
   ```
   
   Package used (Python/R/Scala/Julia):
   (I'm using ...)
   
   For Scala user, please provide:
   1. Java version: (`java -version`)
   2. Maven version: (`mvn -version`)
   3. Scala runtime if applicable: (`scala -version`)
   
   For R user, please provide R `sessionInfo()`:
   
   ## Build info (Required if built from source)
   
   Compiler (gcc/clang/mingw/visual studio):
   
   MXNet commit hash:
   (Paste the output of `git rev-parse HEAD` here.)
   
   Build config:
   (Paste the 

[GitHub] [incubator-mxnet] mxnet-label-bot commented on issue #15790: Can't train CIFAR10 with debug flag using CMake / GPU / MKL

2019-08-07 Thread GitBox
mxnet-label-bot commented on issue #15790: Can't train CIFAR10 with debug flag 
using CMake / GPU / MKL
URL: 
https://github.com/apache/incubator-mxnet/issues/15790#issuecomment-519338675
 
 
   Hey, this is the MXNet Label Bot. 
Thank you for submitting the issue! I will try and suggest some labels so 
that the appropriate MXNet community members can help resolve it. 
Here are my recommended labels: Build




[GitHub] [incubator-mxnet] larroy commented on issue #15789: Can't build in debug mode with Make

2019-08-07 Thread GitBox
larroy commented on issue #15789: Can't build in debug mode with Make
URL: 
https://github.com/apache/incubator-mxnet/issues/15789#issuecomment-519337978
 
 
   @mxnet-label-bot add [Bug]




[GitHub] [incubator-mxnet] mxnet-label-bot commented on issue #15789: Can't build in debug mode with Make

2019-08-07 Thread GitBox
mxnet-label-bot commented on issue #15789: Can't build in debug mode with Make
URL: 
https://github.com/apache/incubator-mxnet/issues/15789#issuecomment-519337970
 
 
   Hey, this is the MXNet Label Bot. 
Thank you for submitting the issue! I will try and suggest some labels so 
that the appropriate MXNet community members can help resolve it. 
Here are my recommended labels: Build




[GitHub] [incubator-mxnet] larroy commented on issue #15789: Can't build in debug mode with Make

2019-08-07 Thread GitBox
larroy commented on issue #15789: Can't build in debug mode with Make
URL: 
https://github.com/apache/incubator-mxnet/issues/15789#issuecomment-519338013
 
 
   @mxnet-label-bot add [Build]




[GitHub] [incubator-mxnet] larroy opened a new issue #15789: Can't build in debug mode with Make

2019-08-07 Thread GitBox
larroy opened a new issue #15789: Can't build in debug mode with Make
URL: https://github.com/apache/incubator-mxnet/issues/15789
 
 
   ## Description
   Can't build in debug mode with Make.
   
   ## Environment info (Required)
   
   ```
   --Python Info--
   ('Version  :', '2.7.15+')
   ('Compiler :', 'GCC 7.3.0')
   ('Build:', ('default', 'Nov 27 2018 23:36:35'))
   ('Arch :', ('64bit', ''))
   Pip Info---
   No corresponding pip install for current python.
   --MXNet Info---
   No MXNet installed.
   --System Info--
   ('Platform :', 'Linux-4.15.0-1044-aws-x86_64-with-Ubuntu-18.04-bionic')
   ('system   :', 'Linux')
   ('node :', 'ip-172-31-21-194')
   ('release  :', '4.15.0-1044-aws')
   ('version  :', '#46-Ubuntu SMP Thu Jul 4 13:38:28 UTC 2019')
   --Hardware Info--
   ('machine  :', 'x86_64')
   ('processor:', 'x86_64')
   Architecture:x86_64
   CPU op-mode(s):  32-bit, 64-bit
   Byte Order:  Little Endian
   CPU(s):  8
   On-line CPU(s) list: 0-7
   Thread(s) per core:  2
   Core(s) per socket:  4
   Socket(s):   1
   NUMA node(s):1
   Vendor ID:   GenuineIntel
   CPU family:  6
   Model:   79
   Model name:  Intel(R) Xeon(R) CPU E5-2686 v4 @ 2.30GHz
   Stepping:1
   CPU MHz: 2699.804
   CPU max MHz: 3000.
   CPU min MHz: 1200.
   BogoMIPS:4600.08
   Hypervisor vendor:   Xen
   Virtualization type: full
   L1d cache:   32K
   L1i cache:   32K
   L2 cache:256K
   L3 cache:46080K
   NUMA node0 CPU(s):   0-7
   Flags:   fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge 
mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx pdpe1gb rdtscp lm 
constant_tsc rep_good nopl xtopology nonstop_tsc cpuid aperfmperf pni pclmulqdq 
ssse3 fma cx16 pcid sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes 
xsave avx f16c rdrand hypervisor lahf_lm abm 3dnowprefetch cpuid_fault 
invpcid_single pti fsgsbase bmi1 hle avx2 smep bmi2 erms invpcid rtm rdseed adx 
xsaveopt
   --Network Test--
   Setting timeout: 10
   Timing for MXNet: https://github.com/apache/incubator-mxnet, DNS: 0.0026 
sec, LOAD: 0.6484 sec.
   Timing for PYPI: https://pypi.python.org/pypi/pip, DNS: 0.0031 sec, LOAD: 
0.3433 sec.
   Timing for FashionMNIST: 
https://apache-mxnet.s3-accelerate.dualstack.amazonaws.com/gluon/dataset/fashion-mnist/train-labels-idx1-ubyte.gz,
 DNS: 0.0067 sec, LOAD: 0.0846 sec.
   Timing for Conda: https://repo.continuum.io/pkgs/free/, DNS: 0.0095 sec, 
LOAD: 0.0843 sec.
   Timing for Gluon Tutorial(en): http://gluon.mxnet.io, DNS: 0.0466 sec, LOAD: 
0.0871 sec.
   Timing for Gluon Tutorial(cn): https://zh.gluon.ai, DNS: 0.2034 sec, LOAD: 
0.2443 sec.
   ```
   
   Package used (Python/R/Scala/Julia):
   (I'm using ...)
   
   For Scala user, please provide:
   1. Java version: (`java -version`)
   2. Maven version: (`mvn -version`)
   3. Scala runtime if applicable: (`scala -version`)
   
   For R user, please provide R `sessionInfo()`:
   
   ## Build info (Required if built from source)
   ```
   Using built-in specs.
   COLLECT_GCC=gcc
   COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-linux-gnu/7/lto-wrapper
   OFFLOAD_TARGET_NAMES=nvptx-none
   OFFLOAD_TARGET_DEFAULT=1
   Target: x86_64-linux-gnu
   Configured with: ../src/configure -v --with-pkgversion='Ubuntu 
7.4.0-1ubuntu1~18.04.1' --with-bugurl=file:///usr/share/doc/gcc-7/README.Bugs 
--enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++ --prefix=/usr 
--with-gcc-major-version-only --program-suffix=-7 
--program-prefix=x86_64-linux-gnu- --enable-shared --enable-linker-build-id 
--libexecdir=/usr/lib --without-included-gettext --enable-threads=posix 
--libdir=/usr/lib --enable-nls --with-sysroot=/ --enable-clocale=gnu 
--enable-libstdcxx-debug --enable-libstdcxx-time=yes 
--with-default-libstdcxx-abi=new --enable-gnu-unique-object 
--disable-vtable-verify --enable-libmpx --enable-plugin --enable-default-pie 
--with-system-zlib --with-target-system-zlib --enable-objc-gc=auto 
--enable-multiarch --disable-werror --with-arch-32=i686 --with-abi=m64 
--with-multilib-list=m32,m64,mx32 --enable-multilib --with-tune=generic 
--enable-offload-targets=nvptx-none --without-cuda-driver 
--enable-checking=release --build=x86_64-linux-gnu --host=x86_64-linux-gnu 
--target=x86_64-linux-gnu
   Thread model: posix
   gcc version 7.4.0 (Ubuntu 7.4.0-1ubuntu1~18.04.1) 
   
   ```
   
   MXNet commit hash:
   
   commit a2b11aed6851c20f7fcf419849dbb3f4b8c4c192 (HEAD -> master, 
upstream/master)
   
   
   Build config:
   (Paste the content of config.mk, or the build command.)
   
   ```
   piotr@ip-172-31-21-194:0: ~/mxnet_master_make_debug [master]> diff config.mk 
make/config.mk 
   51c51
   < DEV = 1
   ---
   > DEV = 

[GitHub] [incubator-mxnet] lookup1980 opened a new issue #15788: mxnet convert to onnx: No conversion function registered for op type null yet.

2019-08-07 Thread GitBox
lookup1980 opened a new issue #15788: mxnet convert to onnx: No conversion 
function registered for op type null yet.
URL: https://github.com/apache/incubator-mxnet/issues/15788
 
 
   ## Description
   I tried to convert an MXNet model to ONNX, but got this error: No conversion 
function registered for op type null yet.
   
   onnx version is:
   ```
   >>> import onnx
   >>> print(onnx.__version__)
   1.5.0
   ```
   
   The model and my code are:
   ```
   import numpy as np
   from mxnet.contrib import onnx as onnx_mxnet  # imports needed for the snippet below

   sym = './resnet-18-symbol.json'
   params = './resnet-18-.params'
   
   onnx_file = './mxnet_exported.onnx'
   
   input_shape = (1,3,512,512)
   
   converted_model_path = onnx_mxnet.export_model(sym, params, [input_shape], 
np.float32, onnx_file)
   ```
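   A small diagnostic sketch (an assumption added for illustration, not part of 
the original report): listing the op types in the symbol JSON shows which nodes 
the exporter would have to convert; in MXNet symbol files, `null` entries are 
the input/parameter nodes.
   ```
   import json
   
   # Load the symbol file referenced above and collect the op type of every node.
   with open('./resnet-18-symbol.json') as f:
       nodes = json.load(f)['nodes']
   
   # 'null' marks inputs/parameters; every other op needs an ONNX conversion function.
   print(sorted({node['op'] for node in nodes}))
   ```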
   
   
   ## Environment info (Required)
   
   
   --Python Info--
   Version  : 3.7.3
   Compiler : GCC 7.3.0
   Build: ('default', 'Mar 27 2019 22:11:17')
   Arch : ('64bit', '')
   Pip Info---
   Version  : 19.1.1
   Directory: 
/home/mgu/anaconda3/envs/mxnet-py37-cpu/lib/python3.7/site-packages/pip
   --MXNet Info---
   Version  : 1.5.0
   Directory: 
/home/mgu/anaconda3/envs/mxnet-py37-cpu/lib/python3.7/site-packages/mxnet
   Commit Hash   : 75a9e187d00a8b7ebc71412a02ed0e3ae489d91f
   Library  : 
['/home/mgu/anaconda3/envs/mxnet-py37-cpu/lib/python3.7/site-packages/mxnet/libmxnet.so']
   Build features:
   ✖ CUDA
   ✖ CUDNN
   ✖ NCCL
   ✖ CUDA_RTC
   ✖ TENSORRT
   ✔ CPU_SSE
   ✔ CPU_SSE2
   ✔ CPU_SSE3
   ✔ CPU_SSE4_1
   ✔ CPU_SSE4_2
   ✖ CPU_SSE4A
   ✔ CPU_AVX
   ✖ CPU_AVX2
   ✖ OPENMP
   ✖ SSE
   ✔ F16C
   ✖ JEMALLOC
   ✖ BLAS_OPEN
   ✖ BLAS_ATLAS
   ✖ BLAS_MKL
   ✖ BLAS_APPLE
   ✔ LAPACK
   ✖ MKLDNN
   ✔ OPENCV
   ✖ CAFFE
   ✖ PROFILER
   ✔ DIST_KVSTORE
   ✖ CXX14
   ✖ INT64_TENSOR_SIZE
   ✔ SIGNAL_HANDLER
   ✖ DEBUG
   --System Info--
   Platform : Linux-4.15.0-55-generic-x86_64-with-debian-stretch-sid
   system   : Linux
   node : mgu-P520c
   release  : 4.15.0-55-generic
   version  : #60~16.04.2-Ubuntu SMP Thu Jul 4 09:03:09 UTC 2019
   --Hardware Info--
   machine  : x86_64
   processor: x86_64
   Architecture:  x86_64
   CPU op-mode(s):32-bit, 64-bit
   Byte Order:Little Endian
   CPU(s):8
   On-line CPU(s) list:   0-7
   Thread(s) per core:2
   Core(s) per socket:4
   Socket(s): 1
   NUMA node(s):  1
   Vendor ID: GenuineIntel
   CPU family:6
   Model: 85
   Model name:Intel(R) Xeon(R) W-2123 CPU @ 3.60GHz
   Stepping:  4
   CPU MHz:   1200.080
   CPU max MHz:   3900.
   CPU min MHz:   1200.
   BogoMIPS:  7200.00
   Virtualization:VT-x
   L1d cache: 32K
   L1i cache: 32K
   L2 cache:  1024K
   L3 cache:  8448K
   NUMA node0 CPU(s): 0-7
   Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge 
mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx 
pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl 
xtopology nonstop_tsc cpuid aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx 
smx est tm2 ssse3 sdbg fma cx16 xtpr pdcm pcid dca sse4_1 sse4_2 x2apic movbe 
popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm 3dnowprefetch 
cpuid_fault epb cat_l3 cdp_l3 invpcid_single pti intel_ppin ssbd mba ibrs ibpb 
stibp tpr_shadow vnmi flexpriority ept vpid fsgsbase tsc_adjust bmi1 hle avx2 
smep bmi2 erms invpcid rtm cqm mpx rdt_a avx512f avx512dq rdseed adx smap 
clflushopt clwb intel_pt avx512cd avx512bw avx512vl xsaveopt xsavec xgetbv1 
xsaves cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local dtherm ida arat pln 
pts hwp hwp_act_window hwp_epp hwp_pkg_req md_clear flush_l1d
   --Network Test--
   Setting timeout: 10
   Timing for MXNet: https://github.com/apache/incubator-mxnet, DNS: 0.0061 
sec, LOAD: 0.5550 sec.
   Timing for Gluon Tutorial(en): http://gluon.mxnet.io, DNS: 0.0462 sec, LOAD: 
0.0659 sec.
   Timing for Gluon Tutorial(cn): https://zh.gluon.ai, DNS: 0.0188 sec, LOAD: 
0.1447 sec.
   Timing for FashionMNIST: 
https://apache-mxnet.s3-accelerate.dualstack.amazonaws.com/gluon/dataset/fashion-mnist/train-labels-idx1-ubyte.gz,
 DNS: 0.0236 sec, LOAD: 0.1065 sec.
   Timing for PYPI: https://pypi.python.org/pypi/pip, DNS: 0.0056 sec, LOAD: 
0.3537 sec.
   Timing for Conda: https://repo.continuum.io/pkgs/free/, DNS: 0.0158 sec, 
LOAD: 0.0744 sec.
   
   
   Package used (Python/R/Scala/Julia):
   (I'm using ...)
   
   For Scala user, please provide:
   1. Java version: (`java -version`)
   2. Maven version: (`mvn -version`)
   3. Scala runtime if applicable: (`scala -version`)
   
   For R user, please provide R `sessionInfo()`:
   
   ## Build info (Required if built from source)
   
   Compiler 

[GitHub] [incubator-mxnet] TaoLv commented on issue #15613: [Discussion] 1.5.1 Patch Release

2019-08-07 Thread GitBox
TaoLv commented on issue #15613: [Discussion] 1.5.1 Patch Release
URL: 
https://github.com/apache/incubator-mxnet/issues/15613#issuecomment-519335053
 
 
   > nightly test failure need to be fixed:
   > #15374
   > 
   > 
http://jenkins.mxnet-ci.amazon-ml.com/blue/organizations/jenkins/NightlyTestsForBinaries/detail/master/395/pipeline/
   
   @roywei @lebeg Could you help check whether the issue has been fixed via 
https://github.com/apache/incubator-mxnet/pull/15452? If so, I will include 
the fix in the 1.5.1 patch release. Thanks!




[GitHub] [incubator-mxnet] mseth10 commented on a change in pull request #15762: Refactor LibraryInitializer so it's thread safe. Fixes random sporadical concurrency crashes.

2019-08-07 Thread GitBox
mseth10 commented on a change in pull request #15762: Refactor 
LibraryInitializer so it's thread safe. Fixes random sporadical concurrency 
crashes.
URL: https://github.com/apache/incubator-mxnet/pull/15762#discussion_r311830815
 
 

 ##
 File path: src/common/utils.h
 ##
 @@ -50,9 +50,22 @@
 #include "../operator/nn/mkldnn/mkldnn_base-inl.h"
 #endif
 
+#if defined(_WIN32) || defined(_WIN64) || defined(__WINDOWS__)
+#include 
+#else
+#include 
+#include 
 
 Review comment:
   Is this being used?




[GitHub] [incubator-mxnet] TaoLv commented on issue #15542: License issues need to be fixed before 1.6 release

2019-08-07 Thread GitBox
TaoLv commented on issue #15542: License issues need to be fixed before 1.6 
release
URL: 
https://github.com/apache/incubator-mxnet/issues/15542#issuecomment-519332616
 
 
   @roywei do you happen to know any updates for these license issues?




[GitHub] [incubator-mxnet] TaoLv commented on issue #15569: Cub license issue

2019-08-07 Thread GitBox
TaoLv commented on issue #15569: Cub license issue 
URL: 
https://github.com/apache/incubator-mxnet/issues/15569#issuecomment-519331992
 
 
   Hi @roywei @ptrendx, has this been fixed on the master branch? Do we need to 
include the fix in the 1.5.1 patch release?




[GitHub] [incubator-mxnet] phy12321 commented on issue #13520: Check failed: b < len (21 vs. 21) slicing with begin[0]=21 exceends limit of 21

2019-08-07 Thread GitBox
phy12321 commented on issue #13520: Check failed: b < len (21 vs. 21) slicing 
with begin[0]=21 exceends limit of 21
URL: 
https://github.com/apache/incubator-mxnet/issues/13520#issuecomment-519330892
 
 
Has the issue been resolved for you? I got the same error when I tried to 
print or slice the output of the network. Then I found that it is not only the 
output of the network that cannot be printed or sliced: I created an NDArray 
and it could not be printed either.




[GitHub] [incubator-mxnet] larroy commented on issue #15762: Refactor LibraryInitializer so it's thread safe. Fixes random sporadical concurrency crashes.

2019-08-07 Thread GitBox
larroy commented on issue #15762: Refactor LibraryInitializer so it's thread 
safe. Fixes random sporadical concurrency crashes.
URL: https://github.com/apache/incubator-mxnet/pull/15762#issuecomment-519329986
 
 
   @mseth10 @samskalicky 




[GitHub] [incubator-mxnet] wkcn closed pull request #15775: [Backport][v1.5.x] Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND`

2019-08-07 Thread GitBox
wkcn closed pull request #15775: [Backport][v1.5.x] Fix the bug of 
`MXEnginePushAsyncND` and `MXEnginePushSyncND`
URL: https://github.com/apache/incubator-mxnet/pull/15775
 
 
   




[GitHub] [incubator-mxnet] wkcn commented on issue #15775: [Backport][v1.5.x] Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND`

2019-08-07 Thread GitBox
wkcn commented on issue #15775: [Backport][v1.5.x] Fix the bug of 
`MXEnginePushAsyncND` and `MXEnginePushSyncND`
URL: https://github.com/apache/incubator-mxnet/pull/15775#issuecomment-519329880
 
 
   I see. I will close this PR and open a new one created by cherry-picking. :)




[GitHub] [incubator-mxnet] wkcn commented on issue #15775: [Backport][v1.5.x] Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND`

2019-08-07 Thread GitBox
wkcn commented on issue #15775: [Backport][v1.5.x] Fix the bug of 
`MXEnginePushAsyncND` and `MXEnginePushSyncND`
URL: https://github.com/apache/incubator-mxnet/pull/15775#issuecomment-519329655
 
 
   @TaoLv Should I open a new PR created by cherry-picking?




[GitHub] [incubator-mxnet] TaoLv commented on issue #15613: [Discussion] 1.5.1 Patch Release

2019-08-07 Thread GitBox
TaoLv commented on issue #15613: [Discussion] 1.5.1 Patch Release
URL: 
https://github.com/apache/incubator-mxnet/issues/15613#issuecomment-519329573
 
 
   > we also need to update mshadow on 1.5.x branch #15600
   
   @szha is this needed?




[GitHub] [incubator-mxnet] TaoLv commented on issue #15613: [Discussion] 1.5.1 Patch Release

2019-08-07 Thread GitBox
TaoLv commented on issue #15613: [Discussion] 1.5.1 Patch Release
URL: 
https://github.com/apache/incubator-mxnet/issues/15613#issuecomment-519329493
 
 
   > #15784 needs to be fixed in 1.5.1, big impact for simple_bind. The fix is 
in #15620. @TaoLv please include this too. Thanks!
   
   Sure. I will do that.




[GitHub] [incubator-mxnet] TaoLv commented on issue #15613: [Discussion] 1.5.1 Patch Release

2019-08-07 Thread GitBox
TaoLv commented on issue #15613: [Discussion] 1.5.1 Patch Release
URL: 
https://github.com/apache/incubator-mxnet/issues/15613#issuecomment-519329433
 
 
   > I'd like to backport a few TensorRT patches we've contributed to master. 
No functional changes but they'll provide support for many additional models 
that would otherwise not be supported.
   
   Thank you @KellenSunderland. Could you help list them on the cwiki page?




[GitHub] [incubator-mxnet] yzz-cver commented on issue #15787: RuntimeError: Cannot find the MXNet library.

2019-08-07 Thread GitBox
yzz-cver commented on issue #15787: RuntimeError: Cannot find the MXNet library.
URL: 
https://github.com/apache/incubator-mxnet/issues/15787#issuecomment-519328591
 
 
   
![image](https://user-images.githubusercontent.com/53545589/62668886-252bd600-b9c0-11e9-94e7-2f8aa01df0cf.png)
   




[GitHub] [incubator-mxnet] TaoLv commented on issue #15703: Storage manager / memory usage regression in v1.5

2019-08-07 Thread GitBox
TaoLv commented on issue #15703: Storage manager / memory usage regression in 
v1.5
URL: 
https://github.com/apache/incubator-mxnet/issues/15703#issuecomment-519328568
 
 
   I'm not sure. I just traced the issue to this commit by bisecting. The 
compilation flag was not set when I built MXNet from source.
   @apeforest do you have any idea?




[GitHub] [incubator-mxnet] yzz-cver opened a new issue #15787: RuntimeError: Cannot find the MXNet library.

2019-08-07 Thread GitBox
yzz-cver opened a new issue #15787: RuntimeError: Cannot find the MXNet library.
URL: https://github.com/apache/incubator-mxnet/issues/15787
 
 
   After running
   cd python
   python setup.py install
   
   I encountered the following problems. Please help me.
   
   




[incubator-mxnet-site] branch asf-site updated: Bump the publish timestamp.

2019-08-07 Thread marcoabreu
This is an automated email from the ASF dual-hosted git repository.

marcoabreu pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet-site.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new 3f4fbed  Bump the publish timestamp.
3f4fbed is described below

commit 3f4fbed8da6860eb785e1b255f919756f3eac200
Author: mxnet-ci 
AuthorDate: Thu Aug 8 01:34:07 2019 +

Bump the publish timestamp.
---
 date.txt | 1 +
 1 file changed, 1 insertion(+)

diff --git a/date.txt b/date.txt
new file mode 100644
index 000..e7e7abe
--- /dev/null
+++ b/date.txt
@@ -0,0 +1 @@
+Thu Aug  8 01:34:07 UTC 2019



[GitHub] [incubator-mxnet] pengzhao-intel commented on issue #15780: FP16 gemm on cpu not implemented!

2019-08-07 Thread GitBox
pengzhao-intel commented on issue #15780: FP16 gemm on cpu not implemented!
URL: 
https://github.com/apache/incubator-mxnet/issues/15780#issuecomment-519326889
 
 
   @KhurramPirov This is expected behavior on CPU, where FP16 GEMM is not 
supported.
   It seems your actual problem is the memory leak, so I suggest filing a 
separate bug to report the leak.
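   A minimal sketch of the limitation, with a float32 fallback that is only a 
suggestion (it is not confirmed in this thread):
   ```
   import mxnet as mx
   
   a = mx.nd.ones((64, 64), dtype='float16')
   b = mx.nd.ones((64, 64), dtype='float16')
   
   try:
       c = mx.nd.dot(a, b)
       c.wait_to_read()   # forces execution; on CPU this raises "FP16 gemm on cpu not implemented"
   except mx.base.MXNetError:
       # Hypothetical workaround: run the GEMM in float32 on CPU instead.
       c = mx.nd.dot(a.astype('float32'), b.astype('float32'))
   
   print(c.dtype)
   ```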




[GitHub] [incubator-mxnet] pengzhao-intel commented on issue #15767: FullyConnected op with float64 and MKL-DNN fails if gradient are not set in a specific way

2019-08-07 Thread GitBox
pengzhao-intel commented on issue #15767: FullyConnected op with float64 and 
MKL-DNN fails if gradient are not set in a specific way
URL: 
https://github.com/apache/incubator-mxnet/issues/15767#issuecomment-519325620
 
 
   > @pengzhao-intel @TaoLv v1.5.0 doesn't have this issue. So don't need to 
fix in v1.5.1.
   
   That's good to hear; we can try to resolve it in 1.6.




[GitHub] [incubator-mxnet] pengzhao-intel commented on issue #15706: [mkldnn-v1.0] Initiate the transition to MKL-DNN v1.0

2019-08-07 Thread GitBox
pengzhao-intel commented on issue #15706: [mkldnn-v1.0] Initiate the transition 
to MKL-DNN v1.0
URL: https://github.com/apache/incubator-mxnet/pull/15706#issuecomment-519325292
 
 
   First PR for the MKL-DNN 1.0 upgrade :)
   CC @sandeep-krishnamurthy @zheng-da @szha @eric-haibin-lin 
   




[GitHub] [incubator-mxnet] wuxun-zhang commented on issue #15767: FullyConnected op with float64 and MKL-DNN fails if gradient are not set in a specific way

2019-08-07 Thread GitBox
wuxun-zhang commented on issue #15767: FullyConnected op with float64 and 
MKL-DNN fails if gradient are not set in a specific way
URL: 
https://github.com/apache/incubator-mxnet/issues/15767#issuecomment-519323609
 
 
   @ZhennanQin Can we add a data type check here 
[#L1663](https://github.com/apache/incubator-mxnet/blob/7186123874/src/executor/graph_executor.cc#L1663)
 to disable the subgraph when the input data type is not supported by MKL-DNN?




[GitHub] [incubator-mxnet] ChaiBapchya commented on a change in pull request #15785: Add large tensor support binary arithmetic

2019-08-07 Thread GitBox
ChaiBapchya commented on a change in pull request #15785: Add large tensor 
support binary arithmetic
URL: https://github.com/apache/incubator-mxnet/pull/15785#discussion_r311820011
 
 

 ##
 File path: tests/nightly/test_large_array.py
 ##
 @@ -352,6 +352,93 @@ def test_topk():
 assert l.sum() == np.sum(np.arange(0, SMALL_Y))
 
 
+def test_add():
+a = create_2d_tensor(rows=LARGE_X, columns=SMALL_Y, dtype=np.float64)
+b = create_2d_tensor(rows=LARGE_X, columns=SMALL_Y, dtype=np.float64)
+mx_res = a.__add__(b)
+np_res = b.asnumpy()+a.asnumpy()
 
 Review comment:
   I ran `make lint` and `make pylint` before pushing this commit.




[GitHub] [incubator-mxnet] ZhennanQin commented on issue #15767: FullyConnected op with float64 and MKL-DNN fails if gradient are not set in a specific way

2019-08-07 Thread GitBox
ZhennanQin commented on issue #15767: FullyConnected op with float64 and 
MKL-DNN fails if gradient are not set in a specific way
URL: 
https://github.com/apache/incubator-mxnet/issues/15767#issuecomment-519322278
 
 
   @pengzhao-intel @TaoLv v1.5.0 doesn't have this issue. So don't need to fix 
in v1.5.1.




[GitHub] [incubator-mxnet] TaoLv commented on issue #15775: [Backport][v1.5.x] Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND`

2019-08-07 Thread GitBox
TaoLv commented on issue #15775: [Backport][v1.5.x] Fix the bug of 
`MXEnginePushAsyncND` and `MXEnginePushSyncND`
URL: https://github.com/apache/incubator-mxnet/pull/15775#issuecomment-519321114
 
 
   To make the git history consistent, should we cherry-pick the commit to 
v1.5.x after #15751 is merged to master?
   
   Sent from my iPhone
   
   On Aug 8, 2019, at 7:47 AM, JackieWu 
mailto:notificati...@github.com>> wrote:
   
   
   Thank you, @szha! The PR will be merged if there is no 
change in #15751
   




[GitHub] [incubator-mxnet] larroy commented on a change in pull request #15760: Fix PR #15489 (Dynamic Library Loading Support)

2019-08-07 Thread GitBox
larroy commented on a change in pull request #15760: Fix PR #15489 (Dynamic 
Library Loading Support)
URL: https://github.com/apache/incubator-mxnet/pull/15760#discussion_r311816516
 
 

 ##
 File path: src/common/library.h
 ##
 @@ -0,0 +1,57 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+/*!
+ * Copyright (c) 2015 by Contributors
+ * \file library.h
+ * \brief Defining library loading functions
+ */
+#ifndef MXNET_COMMON_LIBRARY_H_
+#define MXNET_COMMON_LIBRARY_H_
+
+#include 
+#include 
+#include 
+#include "dmlc/io.h"
+
+// map of libraries loaded
+static std::map loaded_libs;
 
 Review comment:
    This should not be static in a header; it creates a separate symbol (its own 
copy of the map) in every compilation unit that includes it. I will change it in my PR.




[GitHub] [incubator-mxnet] cyrusbehr commented on issue #15520: most time cost in NDArray GetData() func

2019-08-07 Thread GitBox
cyrusbehr commented on issue #15520: most time cost in NDArray GetData() func
URL: 
https://github.com/apache/incubator-mxnet/issues/15520#issuecomment-519318051
 
 
   I am wondering the same thing. Can we omit the `WaitAll` if we are running 
the `NaiveEngine`?
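   For reference, a minimal sketch of how the naive engine is selected (assuming 
the documented `MXNET_ENGINE_TYPE` switch); whether `WaitAll` can then be dropped 
is exactly the open question above:
   ```
   import os
   os.environ['MXNET_ENGINE_TYPE'] = 'NaiveEngine'   # must be set before importing mxnet
   
   import mxnet as mx
   
   x = mx.nd.ones((2, 2)) * 2
   x.wait_to_read()   # with the naive engine each operation already ran synchronously
   print(x)
   ```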




[GitHub] [incubator-mxnet] ZhennanQin edited a comment on issue #15767: FullyConnected op with float64 and MKL-DNN fails if gradient are not set in a specific way

2019-08-07 Thread GitBox
ZhennanQin edited a comment on issue #15767: FullyConnected op with float64 and 
MKL-DNN fails if gradient are not set in a specific way
URL: 
https://github.com/apache/incubator-mxnet/issues/15767#issuecomment-519317089
 
 
   It's not really about float64, but about the `MKLDNN` subgraph backend. The problem 
is that we recently enabled the MKLDNN subgraph backend by default on master, and this 
breaks the fallback mechanism when handling float64. So for nightly builds 
from master, please use `export MXNET_SUBGRAPH_BACKEND=NONE` as a short-term 
workaround; for MXNet v1.5.0, please `unset MXNET_SUBGRAPH_BACKEND`.
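   A minimal sketch of the workaround above, assuming the variable has to be in 
the environment before `mxnet` is imported:
   ```
   import os
   
   # Nightly/master builds: explicitly turn the MKL-DNN subgraph backend off.
   os.environ['MXNET_SUBGRAPH_BACKEND'] = 'NONE'
   # For the v1.5.0 release build, make sure the variable is not set at all:
   # os.environ.pop('MXNET_SUBGRAPH_BACKEND', None)
   
   import mxnet as mx   # float64 FullyConnected should now fall back without MKL-DNN
   ```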




[incubator-mxnet] branch master updated (07eb482 -> a2b11ae)

2019-08-07 Thread zachgk
This is an automated email from the ASF dual-hosted git repository.

zachgk pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 07eb482  fix tvm cmake (#15781)
 add a2b11ae  Fix PR #15489 (Dynamic Library Loading Support) (#15760)

No new revisions were added by this update.

Summary of changes:
 CMakeLists.txt |   6 +
 Makefile   |   7 +-
 ci/jenkins/Jenkins_steps.groovy|  10 +-
 example/{rnn/large_word_lm => lib_api}/Makefile|  18 ++-
 example/lib_api/libtest.cc |  78 +
 .../torch_criterion.cu => example/lib_api/mylib.cc |  28 ++---
 .../contrib/__init__.py => example/lib_api/test.py |  18 +--
 include/mxnet/c_api.h  |   7 ++
 include/mxnet/lib_api.h|  50 +
 python/mxnet/__init__.py   |   1 +
 python/mxnet/base.py   |   2 +-
 python/mxnet/{io/__init__.py => library.py}|  36 --
 src/c_api/c_api.cc |  15 +++
 src/common/library.cc  | 125 +
 src/common/library.h   |  57 ++
 src/initialize.cc  |   8 ++
 tests/python/gpu/test_operator_gpu.py  |   1 +
 tests/python/unittest/test_library_loading.py  |  48 
 18 files changed, 471 insertions(+), 44 deletions(-)
 copy example/{rnn/large_word_lm => lib_api}/Makefile (78%)
 create mode 100644 example/lib_api/libtest.cc
 copy plugin/torch/torch_criterion.cu => example/lib_api/mylib.cc (71%)
 copy python/mxnet/gluon/contrib/__init__.py => example/lib_api/test.py (74%)
 create mode 100644 include/mxnet/lib_api.h
 copy python/mxnet/{io/__init__.py => library.py} (50%)
 create mode 100644 src/common/library.cc
 create mode 100644 src/common/library.h
 create mode 100644 tests/python/unittest/test_library_loading.py



[GitHub] [incubator-mxnet] zachgk merged pull request #15760: Fix PR #15489 (Dynamic Library Loading Support)

2019-08-07 Thread GitBox
zachgk merged pull request #15760: Fix PR #15489 (Dynamic Library Loading 
Support)
URL: https://github.com/apache/incubator-mxnet/pull/15760
 
 
   




[GitHub] [incubator-mxnet] ZhennanQin commented on issue #15767: FullyConnected op with float64 and MKL-DNN fails if gradient are not set in a specific way

2019-08-07 Thread GitBox
ZhennanQin commented on issue #15767: FullyConnected op with float64 and 
MKL-DNN fails if gradient are not set in a specific way
URL: 
https://github.com/apache/incubator-mxnet/issues/15767#issuecomment-519317089
 
 
   It's not really about float64, but about the `MKLDNN` subgraph backend. The problem 
is that we recently enabled the MKLDNN subgraph backend by default, and this breaks 
the fallback mechanism when handling float64. Please use `export 
MXNET_SUBGRAPH_BACKEND=NONE` as a short-term workaround.




[GitHub] [incubator-mxnet] yzhliu commented on a change in pull request #15776: Numpy numpy op blackman

2019-08-07 Thread GitBox
yzhliu commented on a change in pull request #15776: Numpy  numpy op blackman
URL: https://github.com/apache/incubator-mxnet/pull/15776#discussion_r311812161
 
 

 ##
 File path: src/operator/numpy/np_window_op.h
 ##
 @@ -0,0 +1,143 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+/*!
+ *  Copyright (c) 2019 by Contributors
+ * \file np_window_op.h
+ * \brief CPU Implementation of unary op hanning, hamming, blackman window.
+ */
+
+#ifndef MXNET_OPERATOR_NUMPY_NP_WINDOW_OP_H_
+#define MXNET_OPERATOR_NUMPY_NP_WINDOW_OP_H_
+
+#include 
+#include 
+#include "./np_init_op.h"
+
+namespace mxnet {
+namespace op {
+
+#ifdef __CUDA_ARCH__
+__constant__ const float PI = 3.14159265358979323846;
+#else
+const float PI = 3.14159265358979323846;
+using std::isnan;
+#endif
+
+struct NumpyWindowsParam : public dmlc::Parameter {
+  dmlc::optional M;
+  std::string ctx;
+  int dtype;
+  DMLC_DECLARE_PARAMETER(NumpyWindowsParam) {
+  DMLC_DECLARE_FIELD(M)
+  .set_default(dmlc::optional())
+  .describe("Number of points in the output window. "
+"If zero or less, an empty array is returned.");
+  DMLC_DECLARE_FIELD(ctx)
+  .set_default("")
+  .describe("Context of output, in format [cpu|gpu|cpu_pinned](n)."
+  "Only used for imperative calls.");
+  DMLC_DECLARE_FIELD(dtype)
+  .set_default(mshadow::kFloat64)
+  MXNET_ADD_ALL_TYPES
+  .describe("Data-type of the returned array.");
+  }
+};
+
+inline bool NumpyWindowsShape(const nnvm::NodeAttrs& attrs,
+  mxnet::ShapeVector* in_shapes,
+  mxnet::ShapeVector* out_shapes) {
+  const NumpyWindowsParam& param = nnvm::get(attrs.parsed);
+  CHECK_EQ(in_shapes->size(), 0U);
+  CHECK_EQ(out_shapes->size(), 1U);
+  CHECK(param.M.has_value()) << "missing 1 required positional argument: 'M'";
+  int64_t out_size = param.M.value() <= 0 ? 0 : param.M.value();
+  SHAPE_ASSIGN_CHECK(*out_shapes, 0, 
mxnet::TShape({static_cast(out_size)}));
+  return true;
+}
+
+struct hanning_fwd {
+  template
+  MSHADOW_XINLINE static void Map(index_t i, index_t M, int req, DType* out) {
+if (M == 1) {
+  KERNEL_ASSIGN(out[i], req, static_cast(1));
+} else {
+  KERNEL_ASSIGN(out[i], req, DType(0.5) - DType(0.5) * math::cos(DType(2 * 
PI * i / (M - 1;
+}
+  }
+};
+
+struct hamming_fwd {
+  template
+  MSHADOW_XINLINE static void Map(index_t i, index_t M, int req, DType* out) {
+if (M == 1) {
+  KERNEL_ASSIGN(out[i], req, static_cast(1));
+} else {
+  KERNEL_ASSIGN(out[i], req,
+DType(0.54) - DType(0.46) * math::cos(DType(2 * PI * i / 
(M - 1;
+}
+  }
+};
+
+struct blackman_fwd {
+  template
+  MSHADOW_XINLINE static void Map(index_t i, index_t M, int req, DType* out) {
+if (M == 1) {
+  KERNEL_ASSIGN(out[i], req, static_cast(1));
+} else {
+  KERNEL_ASSIGN(out[i], req, DType(0.42) - DType(0.5) * math::cos(DType(2 
* PI * i /(M - 1))) +
+  DType(0.08) * math::cos(DType(4 * PI * i /(M - 1;
+}
+  }
+};
+
+template
+void NumpyWindowCompute(const nnvm::NodeAttrs& attrs,
+const OpContext& ctx,
+const std::vector& inputs,
+const std::vector& req,
+const std::vector& outputs) {
+  using namespace mxnet_op;
+  mshadow::Stream *s = ctx.get_stream();
+  const NumpyWindowsParam& param = nnvm::get(attrs.parsed);
+  if (param.M.has_value()) {
+if (param.M.value() <= 0) {
 
 Review comment:
   ```suggestion
   if (param.M.has_value() && param.M.value() <= 0) {
   ```




[GitHub] [incubator-mxnet] KellenSunderland commented on issue #15545: Softmax fwd optimization for GPU

2019-08-07 Thread GitBox
KellenSunderland commented on issue #15545: Softmax fwd optimization for GPU
URL: https://github.com/apache/incubator-mxnet/pull/15545#issuecomment-519314268
 
 
   @sxjscience totally agree.  This would provide a lot of benefit across the 
framework (for example the layernorm op).
   
   @ptrendx
   I see what you mean "fatal error C1002: compiler is out of heap space in 
pass 2".  The CI windows machines should have a fair amount of RAM so this is a 
little strange.




[incubator-mxnet] branch master updated: fix tvm cmake (#15781)

2019-08-07 Thread liuyizhi
This is an automated email from the ASF dual-hosted git repository.

liuyizhi pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 07eb482  fix tvm cmake (#15781)
07eb482 is described below

commit 07eb482670c5e7891b6baa0184f361a9b9621786
Author: Haozheng Fan 
AuthorDate: Thu Aug 8 07:56:26 2019 +0800

fix tvm cmake (#15781)
---
 CMakeLists.txt  | 2 +-
 cmake/BuildTVM.cmake| 2 +-
 src/operator/contrib/tvmop/ufunc.cc | 4 ++--
 3 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/CMakeLists.txt b/CMakeLists.txt
index 7c479f7..b33d195 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -751,7 +751,7 @@ if(USE_TVM_OP)
   add_custom_command(TARGET mxnet POST_BUILD
 COMMAND ${CMAKE_COMMAND} -E env
   
PYTHONPATH="${CMAKE_CURRENT_SOURCE_DIR}/3rdparty/tvm/python:${CMAKE_CURRENT_SOURCE_DIR}/3rdparty/tvm/topi/python:${CMAKE_CURRENT_SOURCE_DIR}/contrib"
-  LD_LIBRARY_PATH="${CMAKE_CURRENT_BINARY_DIR}/3rdparty/tvm/build"
+  LD_LIBRARY_PATH="${CMAKE_CURRENT_BINARY_DIR}/3rdparty/tvm"
   ${Python3_EXECUTABLE} 
${CMAKE_CURRENT_SOURCE_DIR}/contrib/tvmop/compile.py 
-o${CMAKE_CURRENT_BINARY_DIR}/libtvmop.so
 )
 endif()
diff --git a/cmake/BuildTVM.cmake b/cmake/BuildTVM.cmake
index ad8517c..db8b33b 100644
--- a/cmake/BuildTVM.cmake
+++ b/cmake/BuildTVM.cmake
@@ -16,7 +16,7 @@
 # under the License.
 
 message(STATUS "Prepare external packages for TVM...")
-execute_process(COMMAND 
"${CMAKE_CURRENT_SOURCE_DIR}/contrib/tvmop/prepare_tvm.sh")
+execute_process(COMMAND "sh" 
"${CMAKE_CURRENT_SOURCE_DIR}/contrib/tvmop/prepare_tvm.sh")
 
 # Whether enable ROCM runtime
 #
diff --git a/src/operator/contrib/tvmop/ufunc.cc 
b/src/operator/contrib/tvmop/ufunc.cc
index faba671..3475a21 100644
--- a/src/operator/contrib/tvmop/ufunc.cc
+++ b/src/operator/contrib/tvmop/ufunc.cc
@@ -56,10 +56,10 @@ NNVM_REGISTER_OP(_contrib_tvm_vadd)
 .add_argument("b", "NDArray-or-Symbol", "second input")
 .set_attr("FInferShape", BinaryBroadcastShape)
 .set_attr("FInferType", mxnet::op::ElemwiseType<2, 1>)
-.set_attr("FCompute", 
mxnet::op::TVMBroadcastCompute)
 #if MXNET_USE_CUDA
-.set_attr("FCompute", 
mxnet::op::TVMBroadcastCompute);
+.set_attr("FCompute", 
mxnet::op::TVMBroadcastCompute)
 #endif  // MXNET_USE_CUDA
+.set_attr("FCompute", 
mxnet::op::TVMBroadcastCompute);
 
 }  // namespace op
 }  // namespace mxnet



[GitHub] [incubator-mxnet] yzhliu commented on issue #15781: Fix minor bugs in tvm

2019-08-07 Thread GitBox
yzhliu commented on issue #15781: Fix minor bugs in tvm
URL: https://github.com/apache/incubator-mxnet/pull/15781#issuecomment-519310194
 
 
   Thanks @hzfan 




[GitHub] [incubator-mxnet] yzhliu merged pull request #15781: Fix minor bugs in tvm

2019-08-07 Thread GitBox
yzhliu merged pull request #15781: Fix minor bugs in tvm
URL: https://github.com/apache/incubator-mxnet/pull/15781
 
 
   




[GitHub] [incubator-mxnet] ChaiBapchya commented on issue #15786: LeNet-5 inference FAIL

2019-08-07 Thread GitBox
ChaiBapchya commented on issue #15786: LeNet-5 inference FAIL
URL: 
https://github.com/apache/incubator-mxnet/issues/15786#issuecomment-519307615
 
 
   Not sure if this is a bug on the MXNet side, since the MXNet accuracy is 
higher than TensorRT's.




[GitHub] [incubator-mxnet] ban1080 edited a comment on issue #15181: [Feature Request] Support ONNX export of MultiBox operators

2019-08-07 Thread GitBox
ban1080 edited a comment on issue #15181: [Feature Request] Support ONNX export 
of MultiBox operators
URL: 
https://github.com/apache/incubator-mxnet/issues/15181#issuecomment-519307238
 
 
   I'm getting a "AttributeError: No conversion function registered for op type 
_contrib_MultiBoxPrior yet.". Same situations as 
https://stackoverflow.com/questions/56229207/export-mxnet-model-to-onnx-with-contrib-multiboxprior-error
 
   
   - I trained using AWS's object detection algorithm (Resnet50 + SSD).
   - Converted resultant training model (on S3 bucket) to a deploy model using 
"deploy.py" in mxnet's SSD examples.
   - Attempted to convert to from deploy model to ONNX model. Used 
   
   ```import mxnet as mx
   import numpy as np
   from mxnet.contrib import onnx as onnx_mxnet
   import logging
   logging.basicConfig(level=logging.INFO)
   
   # Downloaded input symbol and params files
   sym = './deploy_model_algo_1-symbol.json'
   params = './deploy_model_algo_1-.params'
   
   # Standard Imagenet input - 3 channels, 512*512
   input_shape = (1,3,512,512)
   
   # Path of the output file
   onnx_file = './mxnet_exported.onnx'
   
   converted_model_path = onnx_mxnet.export_model(sym, params, [input_shape], 
np.float32, onnx_file)```




[GitHub] [incubator-mxnet] ban1080 edited a comment on issue #15181: [Feature Request] Support ONNX export of MultiBox operators

2019-08-07 Thread GitBox
ban1080 edited a comment on issue #15181: [Feature Request] Support ONNX export 
of MultiBox operators
URL: 
https://github.com/apache/incubator-mxnet/issues/15181#issuecomment-519307238
 
 
   I'm getting a "AttributeError: No conversion function registered for op type 
_contrib_MultiBoxPrior yet.".
   
   - I trained using AWS's object detection algorithm (Resnet50 + SSD).
   - Converted resultant training model (on S3 bucket) to a deploy model using 
"deploy.py" in mxnet's SSD examples.
   - Attempted to convert to from deploy model to ONNX model. Used 
   
   ```import mxnet as mx
   import numpy as np
   from mxnet.contrib import onnx as onnx_mxnet
   import logging
   logging.basicConfig(level=logging.INFO)
   
   # Downloaded input symbol and params files
   sym = './deploy_model_algo_1-symbol.json'
   params = './deploy_model_algo_1-.params'
   
   # Standard Imagenet input - 3 channels, 512*512
   input_shape = (1,3,512,512)
   
   # Path of the output file
   onnx_file = './mxnet_exported.onnx'
   
   converted_model_path = onnx_mxnet.export_model(sym, params, [input_shape], 
np.float32, onnx_file)```




[GitHub] [incubator-mxnet] ban1080 commented on issue #15181: [Feature Request] Support ONNX export of MultiBox operators

2019-08-07 Thread GitBox
ban1080 commented on issue #15181: [Feature Request] Support ONNX export of 
MultiBox operators
URL: 
https://github.com/apache/incubator-mxnet/issues/15181#issuecomment-519307238
 
 
   I'm getting a "AttributeError: No conversion function registered for op type 
_contrib_MultiBoxPrior yet.".
   
   - I trained using AWS's object detection algorithm (Resnet50 + SSD).
   - Converted resultant training model (on S3 bucket) to a deploy model using 
"deploy.py" in mxnet's SSD examples.
   - Attempted to convert to from deploy model to ONNX model. Used 
   `import mxnet as mx
   import numpy as np
   from mxnet.contrib import onnx as onnx_mxnet
   import logging
   logging.basicConfig(level=logging.INFO)
   
   # Downloaded input symbol and params files
   sym = './deploy_model_algo_1-symbol.json'
   params = './deploy_model_algo_1-.params'
   
   # Standard Imagenet input - 3 channels, 512*512
   input_shape = (1,3,512,512)
   
   # Path of the output file
   onnx_file = './mxnet_exported.onnx'
   
   converted_model_path = onnx_mxnet.export_model(sym, params, [input_shape], 
np.float32, onnx_file)`




[GitHub] [incubator-mxnet] larroy commented on issue #15516: Fix memory leak reported by ASAN in NNVM to ONNX conversion

2019-08-07 Thread GitBox
larroy commented on issue #15516: Fix memory leak reported by ASAN in NNVM to 
ONNX conversion
URL: https://github.com/apache/incubator-mxnet/pull/15516#issuecomment-519306908
 
 
   I also have concerns about this line:
   
   ```
 reinterpret_cast(constant));
   ```




[GitHub] [incubator-mxnet] larroy commented on a change in pull request #15593: Large Index Support for Slice

2019-08-07 Thread GitBox
larroy commented on a change in pull request #15593: Large Index Support for 
Slice
URL: https://github.com/apache/incubator-mxnet/pull/15593#discussion_r311804430
 
 

 ##
 File path: src/c_api/c_api.cc
 ##
 @@ -174,48 +174,68 @@ int MXNDArrayCreateNone(NDArrayHandle *out) {
   API_END();
 }
 
+template
+void CreateNDArrayImpl(const DataType* shape,
+   dimtype ndim,
+   int dev_type,
+   int dev_id,
+   int delay_alloc,
+   int dtype,
+   NDArrayHandle* out) {
+  *out = new NDArray(mxnet::TShape(shape, shape + ndim),
+ 
Context::Create(static_cast(dev_type), dev_id),
+ delay_alloc != 0, dtype);
+}
+
 int MXNDArrayCreate(const mx_uint *shape,
 mx_uint ndim,
 int dev_type,
 int dev_id,
 int delay_alloc,
 NDArrayHandle *out) {
   API_BEGIN();
-  *out = new NDArray(
-  mxnet::TShape(shape, shape + ndim),
-  Context::Create(static_cast(dev_type), dev_id),
-  delay_alloc != 0);
+  *out = new NDArray(mxnet::TShape(shape, shape + ndim),
 
 Review comment:
   Why don't we use CreateNDArrayImpl for consistency here as well? Also it's 
more like a wrapper than an Impl. Maybe the "Impl" part of the name could be 
refined?




[GitHub] [incubator-mxnet] larroy commented on a change in pull request #15785: Add large tensor support binary arithmetic

2019-08-07 Thread GitBox
larroy commented on a change in pull request #15785: Add large tensor support 
binary arithmetic
URL: https://github.com/apache/incubator-mxnet/pull/15785#discussion_r311802355
 
 

 ##
 File path: tests/nightly/test_large_array.py
 ##
 @@ -352,6 +352,93 @@ def test_topk():
 assert l.sum() == np.sum(np.arange(0, SMALL_Y))
 
 
+def test_add():
+a = create_2d_tensor(rows=LARGE_X, columns=SMALL_Y, dtype=np.float64)
+b = create_2d_tensor(rows=LARGE_X, columns=SMALL_Y, dtype=np.float64)
+mx_res = a.__add__(b)
+np_res = b.asnumpy()+a.asnumpy()
 
 Review comment:
    Shouldn't this be caught by the linter? I think we are not linting this 
folder. There are similar whitespace issues further down, as well as issues with 
spaces after commas, etc. I suggest running lint.




[GitHub] [incubator-mxnet] larroy commented on a change in pull request #15785: Add large tensor support binary arithmetic

2019-08-07 Thread GitBox
larroy commented on a change in pull request #15785: Add large tensor support 
binary arithmetic
URL: https://github.com/apache/incubator-mxnet/pull/15785#discussion_r311802355
 
 

 ##
 File path: tests/nightly/test_large_array.py
 ##
 @@ -352,6 +352,93 @@ def test_topk():
 assert l.sum() == np.sum(np.arange(0, SMALL_Y))
 
 
+def test_add():
+a = create_2d_tensor(rows=LARGE_X, columns=SMALL_Y, dtype=np.float64)
+b = create_2d_tensor(rows=LARGE_X, columns=SMALL_Y, dtype=np.float64)
+mx_res = a.__add__(b)
+np_res = b.asnumpy()+a.asnumpy()
 
 Review comment:
   Shouldn't this be caught by the linter? I think we are not linting this 
folder. There are similar whitespace issues down.




[GitHub] [incubator-mxnet] ChaiBapchya opened a new issue #15786: LeNet-5 inference FAIL

2019-08-07 Thread GitBox
ChaiBapchya opened a new issue #15786: LeNet-5 inference FAIL
URL: https://github.com/apache/incubator-mxnet/issues/15786
 
 
   PR #15783 
   Build pipeline -
   
http://jenkins.mxnet-ci.amazon-ml.com/blue/organizations/jenkins/mxnet-validation%2Funix-gpu/detail/PR-15783/6/pipeline/
   ```
   ==
   FAIL: Run LeNet-5 inference comparison between MXNet and TensorRT.
   --
   Traceback (most recent call last):
 File "/usr/local/lib/python3.6/dist-packages/nose/case.py", line 198, in 
runTest
   self.test(*self.arg)
 File "/work/mxnet/tests/python/tensorrt/test_tensorrt_lenet5.py", line 
102, in test_tensorrt_inference
   MXNet = %f, TensorRT = %f""" % (absolute_accuracy_diff, epsilon, mx_pct, 
trt_pct)
   AssertionError: Absolute diff. between MXNet & TensorRT accuracy (0.02) 
exceeds threshold (0.010100):
  MXNet = 99.14, TensorRT = 99.12
    >> begin captured logging << 
   root: INFO: train-labels-idx1-ubyte.gz exists, skipping download
   root: INFO: train-images-idx3-ubyte.gz exists, skipping download
   root: INFO: t10k-labels-idx1-ubyte.gz exists, skipping download
   root: INFO: t10k-images-idx3-ubyte.gz exists, skipping download
   - >> end captured logging << -
   ```
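   For reference, a minimal sketch of the tolerance check behind the failing 
assertion; the numbers and variable names mirror the log above, the surrounding 
harness is assumed:
   ```
   mx_pct, trt_pct = 99.14, 99.12   # accuracies (percent) reported in the log
   epsilon = 0.010100               # allowed absolute difference
   absolute_accuracy_diff = abs(mx_pct - trt_pct)
   
   # Running this reproduces the reported AssertionError, since 0.02 > 0.0101.
   assert absolute_accuracy_diff < epsilon, (
       "Absolute diff. between MXNet & TensorRT accuracy (%.2f) exceeds threshold (%f): "
       "MXNet = %f, TensorRT = %f" % (absolute_accuracy_diff, epsilon, mx_pct, trt_pct))
   ```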




[GitHub] [incubator-mxnet] mxnet-label-bot commented on issue #15786: LeNet-5 inference FAIL

2019-08-07 Thread GitBox
mxnet-label-bot commented on issue #15786: LeNet-5 inference FAIL
URL: 
https://github.com/apache/incubator-mxnet/issues/15786#issuecomment-519301636
 
 
   Hey, this is the MXNet Label Bot. 
Thank you for submitting the issue! I will try and suggest some labels so 
that the appropriate MXNet community members can help resolve it. 
Here are my recommended labels: Bug




[GitHub] [incubator-mxnet] sxjscience commented on a change in pull request #15751: Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND`

2019-08-07 Thread GitBox
sxjscience commented on a change in pull request #15751: Fix the bug of 
`MXEnginePushAsyncND` and `MXEnginePushSyncND`
URL: https://github.com/apache/incubator-mxnet/pull/15751#discussion_r311799715
 
 

 ##
 File path: tests/cpp/engine/threaded_engine_test.cc
 ##
 @@ -257,49 +257,80 @@ TEST(Engine, PushFunc) {
 
 TEST(Engine, PushFuncND) {
   auto ctx = mxnet::Context{};
-  mxnet::NDArray nd(ctx);
-
-  // Test #1
-  LOG(INFO) << "= Test #1: PushAsyncND param and deleter =";
-  int* a = new int(100);
-  int res = MXEnginePushAsyncND(FooAsyncFunc, a, FooFuncDeleter, , , 1, 
nullptr, 0);
-  EXPECT_EQ(res, 0);
-
-  // Test #2
-  LOG(INFO) << "= Test #2: PushAsyncND NULL param and NULL deleter =";
-  res = MXEnginePushAsyncND(FooAsyncFunc, nullptr, nullptr, , nullptr, 0, 
, 0);
-  EXPECT_EQ(res, 0);
-
-  // Test #3
-  LOG(INFO) << "= Test #3: PushAsyncND invalid number of const nds =";
-  res = MXEnginePushAsyncND(FooAsyncFunc, nullptr, nullptr, , , -1, 
nullptr, 0);
-  EXPECT_EQ(res, -1);
-
-  // Test #4
-  LOG(INFO) << "= Test #4: PushAsyncND invalid number of mutable nds 
=";
-  res = MXEnginePushAsyncND(FooAsyncFunc, nullptr, nullptr, , nullptr, 0, 
, -1);
-  EXPECT_EQ(res, -1);
-
-  // Test #5
-  LOG(INFO) << "= Test #5: PushSyncND param and deleter =";
-  int* b = new int(101);
-  res = MXEnginePushSyncND(FooSyncFunc, b, FooFuncDeleter, , , 1, 
nullptr, 0);
-  EXPECT_EQ(res, 0);
-
-  // Test #6
-  LOG(INFO) << "= Test #6: PushSyncND NULL param and NULL deleter =";
-  res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, , nullptr, 0, 
, 1);
-  EXPECT_EQ(res, 0);
-
-  // Test #7
-  LOG(INFO) << "= Test #7: PushSyncND invalid number of const nds =";
-  res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, , , -1, 
nullptr, 0);
-  EXPECT_EQ(res, -1);
-
-  // Test #8
-  LOG(INFO) << "= Test #8: PushSyncND invalid number of mutable nds =";
-  res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, , nullptr, 0, 
, -1);
-  EXPECT_EQ(res, -1);
+  std::vector nds;
 
 Review comment:
    I now understand the logic here. To make the API consistent, I think we 
should also change the interface of `MXEnginePushAsync` and `MXEnginePushSync`. 
It should be safe to replace `EngineVarHandle` with `VarHandle*`. Am I right 
here? @apeforest @yuxihu 




[GitHub] [incubator-mxnet] apeforest commented on a change in pull request #15785: Add large tensor support binary arithmetic

2019-08-07 Thread GitBox
apeforest commented on a change in pull request #15785: Add large tensor 
support binary arithmetic
URL: https://github.com/apache/incubator-mxnet/pull/15785#discussion_r311798914
 
 

 ##
 File path: tests/nightly/test_large_array.py
 ##
 @@ -352,6 +352,93 @@ def test_topk():
 assert l.sum() == np.sum(np.arange(0, SMALL_Y))
 
 
+def test_add():
+    a = create_2d_tensor(rows=LARGE_X, columns=SMALL_Y, dtype=np.float64)
+    b = create_2d_tensor(rows=LARGE_X, columns=SMALL_Y, dtype=np.float64)
+    mx_res = a.__add__(b)
+    np_res = b.asnumpy()+a.asnumpy()
+    assert mx_res.asnumpy()[-1][0]==np_res[-1][0]
+
+
+def test_sub():
+    a = create_2d_tensor(rows=LARGE_X, columns=SMALL_Y, dtype=np.float64)
+    b = create_2d_tensor(rows=LARGE_X, columns=SMALL_Y, dtype=np.float64)
+    mx_res = a.__sub__(b)
+    np_res = a.asnumpy()-b.asnumpy()
+    assert mx_res.asnumpy()[-1][0]==np_res[-1][0]
+
+
+def test_rsub():
+    a = create_2d_tensor(rows=LARGE_X, columns=SMALL_Y, dtype=np.float64)
+    b = create_2d_tensor(rows=LARGE_X, columns=SMALL_Y, dtype=np.float64)
+    mx_res = a.__rsub__(b)
+    np_res = b.asnumpy()-a.asnumpy()
 
 Review comment:
   same here




[GitHub] [incubator-mxnet] apeforest commented on a change in pull request #15785: Add large tensor support binary arithmetic

2019-08-07 Thread GitBox
apeforest commented on a change in pull request #15785: Add large tensor 
support binary arithmetic
URL: https://github.com/apache/incubator-mxnet/pull/15785#discussion_r311798615
 
 

 ##
 File path: tests/nightly/test_large_array.py
 ##
 @@ -352,6 +352,93 @@ def test_topk():
 assert l.sum() == np.sum(np.arange(0, SMALL_Y))
 
 
+def test_add():
+    a = create_2d_tensor(rows=LARGE_X, columns=SMALL_Y, dtype=np.float64)
+    b = create_2d_tensor(rows=LARGE_X, columns=SMALL_Y, dtype=np.float64)
+    mx_res = a.__add__(b)
+    np_res = b.asnumpy()+a.asnumpy()
 
 Review comment:
   nit: add spaces around the `+` operator
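   e.g., the quoted line would then read:
   ```python
   np_res = b.asnumpy() + a.asnumpy()
   ```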




[GitHub] [incubator-mxnet] apeforest commented on a change in pull request #15593: Large Index Support for Slice

2019-08-07 Thread GitBox
apeforest commented on a change in pull request #15593: Large Index Support for 
Slice
URL: https://github.com/apache/incubator-mxnet/pull/15593#discussion_r311798249
 
 

 ##
 File path: tests/nightly/test_large_vector.py
 ##
 @@ -0,0 +1,37 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import numpy as np
+import mxnet as mx
+from mxnet.test_utils import rand_ndarray, assert_almost_equal, rand_coord_2d
+from mxnet import gluon, nd
+from tests.python.unittest.common import with_seed
+
+# dimension constants
+LARGE_X = 50
+SMALL_Y = 1
+
+
+def test_slice():
+    a = nd.ones(LARGE_X)
+    res = nd.slice(a, begin=(LARGE_X-10), end=(LARGE_X))
 
 Review comment:
   Replace 10 with a named constant.




[GitHub] [incubator-mxnet] apeforest commented on a change in pull request #15593: Large Index Support for Slice

2019-08-07 Thread GitBox
apeforest commented on a change in pull request #15593: Large Index Support for 
Slice
URL: https://github.com/apache/incubator-mxnet/pull/15593#discussion_r311797917
 
 

 ##
 File path: tests/nightly/test_large_vector.py
 ##
 @@ -0,0 +1,37 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import numpy as np
+import mxnet as mx
+from mxnet.test_utils import rand_ndarray, assert_almost_equal, rand_coord_2d
+from mxnet import gluon, nd
+from tests.python.unittest.common import with_seed
+
+# dimension constants
+LARGE_X = 50
+SMALL_Y = 1
+
+
+def test_slice():
+    a = nd.ones(LARGE_X)
+    res = nd.slice(a, begin=(LARGE_X-10), end=(LARGE_X))
 
 Review comment:
   Remove the parentheses around `LARGE_X`.
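   Something like this would address both this comment and the one about the 
magic number 10 (the constant name `SLICE_SIZE` is only illustrative, not taken 
from the PR):
   ```python
   from mxnet import nd

   LARGE_X = 50      # as in the quoted file
   SLICE_SIZE = 10   # illustrative name for the magic number 10


   def test_slice():
       a = nd.ones(LARGE_X)
       res = nd.slice(a, begin=LARGE_X - SLICE_SIZE, end=LARGE_X)
       assert res.shape[0] == SLICE_SIZE
   ```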




[GitHub] [incubator-mxnet] apeforest commented on a change in pull request #15593: Large Index Support for Slice

2019-08-07 Thread GitBox
apeforest commented on a change in pull request #15593: Large Index Support for 
Slice
URL: https://github.com/apache/incubator-mxnet/pull/15593#discussion_r311797867
 
 

 ##
 File path: tests/nightly/test_large_vector.py
 ##
 @@ -0,0 +1,37 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import numpy as np
+import mxnet as mx
+from mxnet.test_utils import rand_ndarray, assert_almost_equal, rand_coord_2d
+from mxnet import gluon, nd
+from tests.python.unittest.common import with_seed
+
+# dimension constants
+LARGE_X = 50
+SMALL_Y = 1
 
 Review comment:
   remove SMALL_Y




[GitHub] [incubator-mxnet] ChaiBapchya opened a new pull request #15785: Add large tensor support binary arithmetic

2019-08-07 Thread GitBox
ChaiBapchya opened a new pull request #15785: Add large tensor support binary 
arithmetic
URL: https://github.com/apache/incubator-mxnet/pull/15785
 
 
   ## Description ##
   Added binary arithmetic operators - add, sub, rsub, neg, mul, div, rdiv, 
mod, rmod, imod, pow
   
   ## Checklist ##
   ### Essentials ###
   Please feel free to remove inapplicable items for your PR.
   - [ ] The PR title starts with [MXNET-$JIRA_ID], where $JIRA_ID refers to 
the relevant [JIRA issue](https://issues.apache.org/jira/projects/MXNET/issues) 
created (except PRs with tiny changes)
   - [ ] Changes are complete (i.e. I finished coding on this PR)
   - [ ] All changes have test coverage:
   - [ ] Code is well-documented: 
   - [ ] To the best of my knowledge, examples are either not affected by this 
change, or have been fixed to be compatible with this change
   




[GitHub] [incubator-mxnet] KellenSunderland commented on issue #15516: Fix memory leak reported by ASAN in NNVM to ONNX conversion

2019-08-07 Thread GitBox
KellenSunderland commented on issue #15516: Fix memory leak reported by ASAN in 
NNVM to ONNX conversion
URL: https://github.com/apache/incubator-mxnet/pull/15516#issuecomment-519296687
 
 
   Looks like there's still a compiler issue:
   ```
    /work/mxnet/src/operator/subgraph/tensorrt/nnvm_to_onnx.cc:567:54: error: reinterpret_cast from type 'const short unsigned int*' to type 'int32_t* {aka int*}' casts away qualifiers

                  reinterpret_cast<int32_t*>(constant));
   ```
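   For reference, this kind of diagnostic usually goes away if the destination 
type of the cast keeps the `const` qualifier. A standalone illustration (the 
names below are made up, not taken from `nnvm_to_onnx.cc`):
   ```cpp
   #include <cstdint>
   #include <iostream>

   // `constant` points to const data, so the destination pointer type of the
   // reinterpret_cast has to keep the const qualifier.
   const int32_t* as_int32_ptr(const uint16_t* constant) {
     // return reinterpret_cast<int32_t*>(constant);     // error: casts away qualifiers
     return reinterpret_cast<const int32_t*>(constant);  // compiles cleanly
   }

   int main() {
     const uint16_t data[2] = {1, 2};
     std::cout << (as_int32_ptr(data) != nullptr) << std::endl;
     return 0;
   }
   ```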




[GitHub] [incubator-mxnet] antonmilev commented on issue #10562: IOError: [Errno 32] Broken pipe in Windows version

2019-08-07 Thread GitBox
antonmilev commented on issue #10562: IOError: [Errno 32] Broken pipe in 
Windows version
URL: 
https://github.com/apache/incubator-mxnet/issues/10562#issuecomment-519296449
 
 
   I am on mxnet 1.5.0 and still have this problem.




[GitHub] [incubator-mxnet] wkcn commented on a change in pull request #15751: Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND`

2019-08-07 Thread GitBox
wkcn commented on a change in pull request #15751: Fix the bug of 
`MXEnginePushAsyncND` and `MXEnginePushSyncND`
URL: https://github.com/apache/incubator-mxnet/pull/15751#discussion_r311794355
 
 

 ##
 File path: tests/cpp/engine/threaded_engine_test.cc
 ##
 @@ -257,49 +257,80 @@ TEST(Engine, PushFunc) {
 
 TEST(Engine, PushFuncND) {
   auto ctx = mxnet::Context{};
-  mxnet::NDArray nd(ctx);
-
-  // Test #1
-  LOG(INFO) << "===== Test #1: PushAsyncND param and deleter =====";
-  int* a = new int(100);
-  int res = MXEnginePushAsyncND(FooAsyncFunc, a, FooFuncDeleter, &ctx, &nd, 1, nullptr, 0);
-  EXPECT_EQ(res, 0);
-
-  // Test #2
-  LOG(INFO) << "===== Test #2: PushAsyncND NULL param and NULL deleter =====";
-  res = MXEnginePushAsyncND(FooAsyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, 0);
-  EXPECT_EQ(res, 0);
-
-  // Test #3
-  LOG(INFO) << "===== Test #3: PushAsyncND invalid number of const nds =====";
-  res = MXEnginePushAsyncND(FooAsyncFunc, nullptr, nullptr, &ctx, &nd, -1, nullptr, 0);
-  EXPECT_EQ(res, -1);
-
-  // Test #4
-  LOG(INFO) << "===== Test #4: PushAsyncND invalid number of mutable nds =====";
-  res = MXEnginePushAsyncND(FooAsyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, -1);
-  EXPECT_EQ(res, -1);
-
-  // Test #5
-  LOG(INFO) << "===== Test #5: PushSyncND param and deleter =====";
-  int* b = new int(101);
-  res = MXEnginePushSyncND(FooSyncFunc, b, FooFuncDeleter, &ctx, &nd, 1, nullptr, 0);
-  EXPECT_EQ(res, 0);
-
-  // Test #6
-  LOG(INFO) << "===== Test #6: PushSyncND NULL param and NULL deleter =====";
-  res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, 1);
-  EXPECT_EQ(res, 0);
-
-  // Test #7
-  LOG(INFO) << "===== Test #7: PushSyncND invalid number of const nds =====";
-  res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, &ctx, &nd, -1, nullptr, 0);
-  EXPECT_EQ(res, -1);
-
-  // Test #8
-  LOG(INFO) << "===== Test #8: PushSyncND invalid number of mutable nds =====";
-  res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, -1);
-  EXPECT_EQ(res, -1);
+  std::vector<mxnet::NDArray> nds;
 
 Review comment:
   To keep the API consistent, `const_vars_handle` and `mutable_vars_handle` 
are pointers to arrays of `VarHandle`, and `const_nds_handle` and 
`mutable_nds_handle` are pointers to arrays of `NDArrayHandle`.
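   Concretely, on the caller side this looks roughly like the sketch below 
(simplified; it reuses `ctx` and `FooSyncFunc` from the surrounding test file 
and omits error handling):
   ```cpp
   std::vector<mxnet::NDArray> nds;
   nds.emplace_back(ctx);                          // NDArray constructed from the Context above
   std::vector<NDArrayHandle> const_nds;
   for (auto& nd : nds) const_nds.push_back(&nd);  // NDArrayHandle is an opaque pointer to an NDArray

   int res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, &ctx,
                                const_nds.data(), static_cast<int>(const_nds.size()),
                                nullptr, 0);       // no mutable ndarrays in this call
   ```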




[GitHub] [incubator-mxnet] wkcn commented on a change in pull request #15751: Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND`

2019-08-07 Thread GitBox
wkcn commented on a change in pull request #15751: Fix the bug of 
`MXEnginePushAsyncND` and `MXEnginePushSyncND`
URL: https://github.com/apache/incubator-mxnet/pull/15751#discussion_r311792986
 
 

 ##
 File path: tests/cpp/engine/threaded_engine_test.cc
 ##
 @@ -257,49 +257,80 @@ TEST(Engine, PushFunc) {
 
 TEST(Engine, PushFuncND) {
   auto ctx = mxnet::Context{};
-  mxnet::NDArray nd(ctx);
-
-  // Test #1
-  LOG(INFO) << "===== Test #1: PushAsyncND param and deleter =====";
-  int* a = new int(100);
-  int res = MXEnginePushAsyncND(FooAsyncFunc, a, FooFuncDeleter, &ctx, &nd, 1, nullptr, 0);
-  EXPECT_EQ(res, 0);
-
-  // Test #2
-  LOG(INFO) << "===== Test #2: PushAsyncND NULL param and NULL deleter =====";
-  res = MXEnginePushAsyncND(FooAsyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, 0);
-  EXPECT_EQ(res, 0);
-
-  // Test #3
-  LOG(INFO) << "===== Test #3: PushAsyncND invalid number of const nds =====";
-  res = MXEnginePushAsyncND(FooAsyncFunc, nullptr, nullptr, &ctx, &nd, -1, nullptr, 0);
-  EXPECT_EQ(res, -1);
-
-  // Test #4
-  LOG(INFO) << "===== Test #4: PushAsyncND invalid number of mutable nds =====";
-  res = MXEnginePushAsyncND(FooAsyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, -1);
-  EXPECT_EQ(res, -1);
-
-  // Test #5
-  LOG(INFO) << "===== Test #5: PushSyncND param and deleter =====";
-  int* b = new int(101);
-  res = MXEnginePushSyncND(FooSyncFunc, b, FooFuncDeleter, &ctx, &nd, 1, nullptr, 0);
-  EXPECT_EQ(res, 0);
-
-  // Test #6
-  LOG(INFO) << "===== Test #6: PushSyncND NULL param and NULL deleter =====";
-  res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, 1);
-  EXPECT_EQ(res, 0);
-
-  // Test #7
-  LOG(INFO) << "===== Test #7: PushSyncND invalid number of const nds =====";
-  res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, &ctx, &nd, -1, nullptr, 0);
-  EXPECT_EQ(res, -1);
-
-  // Test #8
-  LOG(INFO) << "===== Test #8: PushSyncND invalid number of mutable nds =====";
-  res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, -1);
-  EXPECT_EQ(res, -1);
+  std::vector<mxnet::NDArray> nds;
 
 Review comment:
   It sounds good, but I am worried that the arguments (`const_vars_handle` 
and `mutable_vars_handle`) of the two APIs 
[`MXEnginePushAsync`](https://github.com/apache/incubator-mxnet/blob/master/src/c_api/c_api.cc#L1466)
 and 
[`MXEnginePushSync`](https://github.com/apache/incubator-mxnet/blob/master/src/c_api/c_api.cc#L1507)
 are typed as `EngineVarHandle`, namely `void*`. The implementation casts that 
`void*` to `VarHandle*`, namely `Var**`, in 
https://github.com/apache/incubator-mxnet/blob/master/src/c_api/c_api.cc#L1475. 
Therefore, I don't know how to decide the type of `const_nds_handle` and 
`mutable_nds_handle` in `MXEnginePushAsyncND` and `MXEnginePushSyncND`.
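   To make the difference concrete, here is a small standalone illustration of 
the two conventions being compared; the types are stand-ins, not MXNet's actual 
definitions:
   ```cpp
   #include <iostream>
   #include <vector>

   struct Var {};                  // stand-in for the engine's Var
   typedef Var* VarHandle;         // stand-in for Engine::VarHandle (Var*)
   typedef void* EngineVarHandle;  // opaque handle, as in the current C API

   // Current convention: the element type is hidden behind void*; the
   // implementation casts it back (void* -> VarHandle*, i.e. Var**).
   int push_opaque(EngineVarHandle vars, int num_vars) {
     VarHandle* typed = static_cast<VarHandle*>(vars);
     return typed == nullptr ? 0 : num_vars;
   }

   // Proposed convention: the element type is visible in the signature.
   int push_typed(VarHandle* vars, int num_vars) {
     return vars == nullptr ? 0 : num_vars;
   }

   int main() {
     Var v;
     std::vector<VarHandle> handles{&v};
     std::cout << push_opaque(handles.data(), 1) << " "
               << push_typed(handles.data(), 1) << std::endl;  // prints "1 1"
     return 0;
   }
   ```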




[GitHub] [incubator-mxnet] samskalicky edited a comment on issue #15613: [Discussion] 1.5.1 Patch Release

2019-08-07 Thread GitBox
samskalicky edited a comment on issue #15613: [Discussion] 1.5.1 Patch Release
URL: 
https://github.com/apache/incubator-mxnet/issues/15613#issuecomment-519227680
 
 
   https://github.com/apache/incubator-mxnet/issues/15784 needs to be fixed in 
1.5.1; it has a big impact on simple_bind. The fix is in #15620. @TaoLv please 
include this too. Thanks!




[GitHub] [incubator-mxnet] samskalicky commented on issue #15784: Simple_Bind failure in 1.5.0

2019-08-07 Thread GitBox
samskalicky commented on issue #15784: Simple_Bind failure in 1.5.0
URL: 
https://github.com/apache/incubator-mxnet/issues/15784#issuecomment-519293498
 
 
   Thanks @reminisce for pointing me to #15620 as the fix. I tried the earlier 
PR #15137 alone and the failure was still there, so #15620 is indeed what fixes 
this issue.




[GitHub] [incubator-mxnet] sxjscience commented on a change in pull request #15751: Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND`

2019-08-07 Thread GitBox
sxjscience commented on a change in pull request #15751: Fix the bug of 
`MXEnginePushAsyncND` and `MXEnginePushSyncND`
URL: https://github.com/apache/incubator-mxnet/pull/15751#discussion_r311789501
 
 

 ##
 File path: tests/cpp/engine/threaded_engine_test.cc
 ##
 @@ -257,49 +257,80 @@ TEST(Engine, PushFunc) {
 
 TEST(Engine, PushFuncND) {
   auto ctx = mxnet::Context{};
-  mxnet::NDArray nd(ctx);
-
-  // Test #1
-  LOG(INFO) << "===== Test #1: PushAsyncND param and deleter =====";
-  int* a = new int(100);
-  int res = MXEnginePushAsyncND(FooAsyncFunc, a, FooFuncDeleter, &ctx, &nd, 1, nullptr, 0);
-  EXPECT_EQ(res, 0);
-
-  // Test #2
-  LOG(INFO) << "===== Test #2: PushAsyncND NULL param and NULL deleter =====";
-  res = MXEnginePushAsyncND(FooAsyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, 0);
-  EXPECT_EQ(res, 0);
-
-  // Test #3
-  LOG(INFO) << "===== Test #3: PushAsyncND invalid number of const nds =====";
-  res = MXEnginePushAsyncND(FooAsyncFunc, nullptr, nullptr, &ctx, &nd, -1, nullptr, 0);
-  EXPECT_EQ(res, -1);
-
-  // Test #4
-  LOG(INFO) << "===== Test #4: PushAsyncND invalid number of mutable nds =====";
-  res = MXEnginePushAsyncND(FooAsyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, -1);
-  EXPECT_EQ(res, -1);
-
-  // Test #5
-  LOG(INFO) << "===== Test #5: PushSyncND param and deleter =====";
-  int* b = new int(101);
-  res = MXEnginePushSyncND(FooSyncFunc, b, FooFuncDeleter, &ctx, &nd, 1, nullptr, 0);
-  EXPECT_EQ(res, 0);
-
-  // Test #6
-  LOG(INFO) << "===== Test #6: PushSyncND NULL param and NULL deleter =====";
-  res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, 1);
-  EXPECT_EQ(res, 0);
-
-  // Test #7
-  LOG(INFO) << "===== Test #7: PushSyncND invalid number of const nds =====";
-  res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, &ctx, &nd, -1, nullptr, 0);
-  EXPECT_EQ(res, -1);
-
-  // Test #8
-  LOG(INFO) << "===== Test #8: PushSyncND invalid number of mutable nds =====";
-  res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, -1);
-  EXPECT_EQ(res, -1);
+  std::vector<mxnet::NDArray> nds;
 
 Review comment:
   Yes, we have to use `std::vector` if we keep the interface as 
`NDArrayHandle*`. However, I'm wondering whether we could directly use 
`std::vector<mxnet::NDArray> nds;` and fill it with 
`nds.emplace_back(mxnet::NDArray(ctx))` or `nds.push_back(std::move(temp_arr))`. 
In that case, the existing API would not need to change.
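   i.e., roughly (a sketch only, with `ctx` and `temp_arr` as in the discussion 
above):
   ```cpp
   std::vector<mxnet::NDArray> nds;
   nds.emplace_back(mxnet::NDArray(ctx));  // construct the NDArray in place from the Context
   // or, if an NDArray already exists:
   mxnet::NDArray temp_arr(ctx);
   nds.push_back(std::move(temp_arr));     // move it into the vector
   ```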




[GitHub] [incubator-mxnet] wkcn commented on issue #15775: [Backport][v1.5.x] Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND`

2019-08-07 Thread GitBox
wkcn commented on issue #15775: [Backport][v1.5.x] Fix the bug of 
`MXEnginePushAsyncND` and `MXEnginePushSyncND`
URL: https://github.com/apache/incubator-mxnet/pull/15775#issuecomment-519290594
 
 
   Thanks @szha! This PR will be merged once there are no further changes in 
https://github.com/apache/incubator-mxnet/pull/15751




[GitHub] [incubator-mxnet] wkcn commented on a change in pull request #15751: Fix the bug of `MXEnginePushAsyncND` and `MXEnginePushSyncND`

2019-08-07 Thread GitBox
wkcn commented on a change in pull request #15751: Fix the bug of 
`MXEnginePushAsyncND` and `MXEnginePushSyncND`
URL: https://github.com/apache/incubator-mxnet/pull/15751#discussion_r311787447
 
 

 ##
 File path: tests/cpp/engine/threaded_engine_test.cc
 ##
 @@ -257,49 +257,80 @@ TEST(Engine, PushFunc) {
 
 TEST(Engine, PushFuncND) {
   auto ctx = mxnet::Context{};
-  mxnet::NDArray nd(ctx);
-
-  // Test #1
-  LOG(INFO) << "===== Test #1: PushAsyncND param and deleter =====";
-  int* a = new int(100);
-  int res = MXEnginePushAsyncND(FooAsyncFunc, a, FooFuncDeleter, &ctx, &nd, 1, nullptr, 0);
-  EXPECT_EQ(res, 0);
-
-  // Test #2
-  LOG(INFO) << "===== Test #2: PushAsyncND NULL param and NULL deleter =====";
-  res = MXEnginePushAsyncND(FooAsyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, 0);
-  EXPECT_EQ(res, 0);
-
-  // Test #3
-  LOG(INFO) << "===== Test #3: PushAsyncND invalid number of const nds =====";
-  res = MXEnginePushAsyncND(FooAsyncFunc, nullptr, nullptr, &ctx, &nd, -1, nullptr, 0);
-  EXPECT_EQ(res, -1);
-
-  // Test #4
-  LOG(INFO) << "===== Test #4: PushAsyncND invalid number of mutable nds =====";
-  res = MXEnginePushAsyncND(FooAsyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, -1);
-  EXPECT_EQ(res, -1);
-
-  // Test #5
-  LOG(INFO) << "===== Test #5: PushSyncND param and deleter =====";
-  int* b = new int(101);
-  res = MXEnginePushSyncND(FooSyncFunc, b, FooFuncDeleter, &ctx, &nd, 1, nullptr, 0);
-  EXPECT_EQ(res, 0);
-
-  // Test #6
-  LOG(INFO) << "===== Test #6: PushSyncND NULL param and NULL deleter =====";
-  res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, 1);
-  EXPECT_EQ(res, 0);
-
-  // Test #7
-  LOG(INFO) << "===== Test #7: PushSyncND invalid number of const nds =====";
-  res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, &ctx, &nd, -1, nullptr, 0);
-  EXPECT_EQ(res, -1);
-
-  // Test #8
-  LOG(INFO) << "===== Test #8: PushSyncND invalid number of mutable nds =====";
-  res = MXEnginePushSyncND(FooSyncFunc, nullptr, nullptr, &ctx, nullptr, 0, &nd, -1);
-  EXPECT_EQ(res, -1);
+  std::vector<mxnet::NDArray> nds;
 
 Review comment:
   The constructor of NDArray takes a Context argument, so I do not know how 
to use `std::vector<mxnet::NDArray>` here.




[incubator-mxnet] branch master updated (aadef2d -> 45db8ea)

2019-08-07 Thread reminisce
This is an automated email from the ASF dual-hosted git repository.

reminisce pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from aadef2d  Fix flaky test test_global_metric (#15756)
 add 45db8ea  Add matrix determinant operator in linalg (#15007)

No new revisions were added by this update.

Summary of changes:
 docs/api/python/symbol/linalg.md |   2 +
 python/mxnet/contrib/amp/lists/symbol.py |   4 +
 src/operator/linalg.h|  50 +--
 src/operator/linalg_impl.h   | 243 ---
 src/operator/tensor/la_op-inl.h  | 136 -
 src/operator/tensor/la_op.cc | 166 +
 src/operator/tensor/la_op.cu |  12 ++
 src/operator/tensor/la_op.h  | 180 +++
 tests/python/unittest/test_operator.py   |  23 ++-
 9 files changed, 679 insertions(+), 137 deletions(-)
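
The new operator should be usable from the Python NDArray API roughly as 
follows (the `linalg.det` entry point is assumed from the PR title; see the 
updated linalg docs in this change for the exact interface):

```python
import mxnet as mx

a = mx.nd.array([[1, 2], [3, 4]])   # determinant is 1*4 - 2*3 = -2
det = mx.nd.linalg.det(a)           # assumed entry point for PR #15007
print(det.asnumpy())                # the value should be -2
```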


