[GitHub] chowkamlee81 commented on issue #7907: Dense upsampling operation rather than deconvolution implementation

2017-09-18 Thread git
chowkamlee81 commented on issue #7907: Dense upsampling operation rather than 
deconvolution implementation
URL: 
https://github.com/apache/incubator-mxnet/issues/7907#issuecomment-330437118
 
 
   Hi szha,
   
   I tried your suggestion and removed the crop function. After that, the 
module started training, but the training log-loss stays the same even after 
several epochs.
   
   Below is the report:
   Epoch[0] Batch [10] Speed: 3.88 samples/sec Train-FCNLogLoss= 2.91
   .
   
   Epoch[0] Batch [730] Speed: 3.843 samples/sec Train-FCNLogLoss= 2.91
   
   Kindly suggest how to go ahead and solve this issue.
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] chowkamlee81 commented on issue #7717: Subpixel convolution (state-of-the-art) implementation rather than using Deconvolution.

2017-09-18 Thread git
chowkamlee81 commented on issue #7717: Subpixel convolution (state-of-the-art) 
implementation rather than using Deconvolution.
URL: 
https://github.com/apache/incubator-mxnet/issues/7717#issuecomment-330436790
 
 
   Dear Eldercrow,
   
   I tried your suggestion and removed the crop function. After that, the 
module started training, but the training log-loss stays the same even after 
several epochs.
   
   Below is the report:
   Epoch[0] Batch [10] Speed: 3.88 samples/sec Train-FCNLogLoss= 2.91
   .
   
   Epoch[0] Batch [730] Speed: 3.843 samples/sec Train-FCNLogLoss= 2.91
   
   Kindly suggest how to go ahead and solve this issue.
 



[GitHub] cjolivier01 commented on issue #7935: Zeroes CSR still needs a valid row_pointer array.

2017-09-18 Thread git
cjolivier01 commented on issue #7935: Zeroes CSR still needs a valid 
row_pointer array.
URL: https://github.com/apache/incubator-mxnet/pull/7935#issuecomment-330434805
 
 
   I don't see any burden on memory allocation greater than normal
   NDArray.zeros() allocation.  It'll be less, actually, more often than not.
   But it's certainly not some "abnormal" memory allocation situation by any
   stretch of the imagination.
   
   On Mon, Sep 18, 2017 at 9:11 PM reminisce  wrote:
   
   > I agree that an all zero initialized indptr in a zero csr sounds more
   > correct from the definition point of view of csr and alleviates the burden
   > on writing client code. If memory allocation is not a concern, I'm okay
   > with the change.
 



[GitHub] reminisce commented on issue #7935: Zeroes CSR still needs a valid row_pointer array.

2017-09-18 Thread git
reminisce commented on issue #7935: Zeroes CSR still needs a valid row_pointer 
array.
URL: https://github.com/apache/incubator-mxnet/pull/7935#issuecomment-330423935
 
 
   I agree that an all zero initialized `indptr` in a zero csr sounds more 
correct from the definition point of view of csr and alleviates the burden on 
writing client code. If memory allocation is not a concern, I'm okay with the 
change.
 



[GitHub] mbaijal commented on issue #7941: DO NOT MERGE: Just a test of adding properties to JenkinsFile

2017-09-18 Thread git
mbaijal commented on issue #7941: DO NOT MERGE: Just a test of adding 
properties to JenkinsFile
URL: https://github.com/apache/incubator-mxnet/pull/7941#issuecomment-330403575
 
 
   Build Now
 



[GitHub] mbaijal opened a new pull request #7941: DO NOT MERGE: Just a test of adding properties to JenkinsFile

2017-09-18 Thread git
mbaijal opened a new pull request #7941: DO NOT MERGE: Just a test of adding 
properties to JenkinsFile
URL: https://github.com/apache/incubator-mxnet/pull/7941
 
 
   Please DO NOT MERGE!!
 



[GitHub] cjolivier01 commented on issue #7935: Zeroes CSR still needs a valid row_pointer array.

2017-09-18 Thread git
cjolivier01 commented on issue #7935: Zeroes CSR still needs a valid 
row_pointer array.
URL: https://github.com/apache/incubator-mxnet/pull/7935#issuecomment-330400617
 
 
   While an uninitialized RSP matrix is a valid all-zero matrix, this is not the 
case for CSR. Thus RSP returns a valid matrix and CSR does not.
 



[GitHub] cjolivier01 commented on issue #7935: Zeroes CSR still needs a valid row_pointer array.

2017-09-18 Thread git
cjolivier01 commented on issue #7935: Zeroes CSR still needs a valid 
row_pointer array.
URL: https://github.com/apache/incubator-mxnet/pull/7935#issuecomment-330398788
 
 
   I respectfully disagree. Maybe something like empty() should return 
uninitialized, but returning an invalid CSR array from zeros() and expecting the 
consuming function to initialize the array seems backwards and difficult to 
maintain. zeros() should return a CSR that appears as a matrix of zeroes. It 
is not doing that.
 



[GitHub] thirdwing commented on issue #7476: R-package RNN refactor

2017-09-18 Thread git
thirdwing commented on issue #7476: R-package RNN refactor
URL: https://github.com/apache/incubator-mxnet/pull/7476#issuecomment-330396699
 
 
   @jeremiedb Can you update with master branch? Let's see the CI test.
 



[GitHub] reminisce commented on issue #7682: Fix shape inference bug

2017-09-18 Thread git
reminisce commented on issue #7682: Fix shape inference bug
URL: https://github.com/apache/incubator-mxnet/pull/7682#issuecomment-330377972
 
 
   @tqchen I will do that. Thanks.
 



[GitHub] szha commented on issue #7938: instance norm and reflection padding

2017-09-18 Thread git
szha commented on issue #7938: instance norm and reflection padding
URL: https://github.com/apache/incubator-mxnet/pull/7938#issuecomment-33038
 
 
   Could you fix lint? The errors can be obtained by running `make pylint`
 



[GitHub] tqchen commented on issue #7682: Fix shape inference bug

2017-09-18 Thread git
tqchen commented on issue #7682: Fix shape inference bug
URL: https://github.com/apache/incubator-mxnet/pull/7682#issuecomment-330377101
 
 
   @reminisce Can you also backport this to nnvm?
 



[GitHub] eric-haibin-lin commented on a change in pull request #7935: Zeroes CSR still needs a valid row_pointer array.

2017-09-18 Thread git
eric-haibin-lin commented on a change in pull request #7935: Zeroes CSR still 
needs a valid row_pointer array.
URL: https://github.com/apache/incubator-mxnet/pull/7935#discussion_r139555468
 
 

 ##
 File path: src/operator/tensor/init_op.h
 ##
 @@ -197,14 +197,21 @@ void FillZerosRspImpl(mshadow::Stream *s, NDArray 
*dst) {
   dst->set_aux_shape(rowsparse::kIdx, TShape(mshadow::Shape1(0)));
 }
 
-// Fill a CSR NDArray with zeros by updating the aux shape.
+/*! \brief Fill a CSR NDArray with zeros by updating the aux shape
+ *
+ * @tparam xpu - cpu or gpu
 
 Review comment:
   Sorry about that. Learnt a new thing today.
 



[GitHub] cjolivier01 commented on a change in pull request #7935: Zeroes CSR still needs a valid row_pointer array.

2017-09-18 Thread git
cjolivier01 commented on a change in pull request #7935: Zeroes CSR still needs 
a valid row_pointer array.
URL: https://github.com/apache/incubator-mxnet/pull/7935#discussion_r139555019
 
 

 ##
 File path: src/operator/tensor/init_op.h
 ##
 @@ -197,14 +197,21 @@ void FillZerosRspImpl(mshadow::Stream *s, NDArray 
*dst) {
   dst->set_aux_shape(rowsparse::kIdx, TShape(mshadow::Shape1(0)));
 }
 
-// Fill a CSR NDArray with zeros by updating the aux shape.
+/*! \brief Fill a CSR NDArray with zeros by updating the aux shape
+ *
+ * @tparam xpu - cpu or gpu
 
 Review comment:
   https://www.stack.nl/~dimitri/doxygen/manual/commands.html
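   For context: Doxygen's `\tparam` documents template parameters while 
`\param` documents function parameters, so `@tparam xpu` was the right command 
here. A minimal illustration (not code from this PR):

```cpp
#include <cstddef>

/*! \brief Fill a buffer with zeros.
 *
 * \tparam T   element type        (template parameter -> \tparam)
 * \param dst  destination buffer  (function parameter -> \param)
 * \param n    number of elements to clear
 */
template <typename T>
void FillZeros(T *dst, std::size_t n) {
  for (std::size_t i = 0; i < n; ++i) dst[i] = T(0);
}
```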
 



[GitHub] reminisce commented on issue #7935: Zeroes CSR still needs a valid row_pointer array.

2017-09-18 Thread git
reminisce commented on issue #7935: Zeroes CSR still needs a valid row_pointer 
array.
URL: https://github.com/apache/incubator-mxnet/pull/7935#issuecomment-330371832
 
 
   I think it's okay for operators to work on a zero csr with an empty `indptr` 
as long as the csr is checked for zero before performing any regular operations, 
like the one exposed in https://github.com/apache/incubator-mxnet/issues/7920. 
In addition to what @eric-haibin-lin said about the intention of delaying 
memory allocation, I think it's also the intended way to avoid unnecessary 
memory allocation for zero csr tensors. We can provide a util function that 
checks whether a sparse tensor (csr or rsp) is zero, for use in operator 
implementations.
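   Such a utility could look roughly like this (a pure-Python sketch of the 
idea only, not MXNet's API; the function and argument names are made up):

```python
def sparse_is_zero(stype, aux):
    """Hypothetical zero-check for sparse tensors.

    stype: 'csr' or 'row_sparse'
    aux:   the auxiliary index array (indptr for csr,
           the stored-row index array for row_sparse).
    """
    if stype == 'csr':
        # Zero if indptr was never materialized (MXNet's lazy
        # convention) or if it records no stored entries.
        return len(aux) == 0 or aux[-1] == 0
    if stype == 'row_sparse':
        # Zero iff no rows are stored.
        return len(aux) == 0
    raise ValueError("unknown storage type: %s" % stype)
```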
 



[GitHub] cjolivier01 commented on a change in pull request #7935: Zeroes CSR still needs a valid row_pointer array.

2017-09-18 Thread git
cjolivier01 commented on a change in pull request #7935: Zeroes CSR still needs 
a valid row_pointer array.
URL: https://github.com/apache/incubator-mxnet/pull/7935#discussion_r139554395
 
 

 ##
 File path: src/operator/tensor/init_op.h
 ##
 @@ -197,14 +197,21 @@ void FillZerosRspImpl(mshadow::Stream *s, NDArray 
*dst) {
   dst->set_aux_shape(rowsparse::kIdx, TShape(mshadow::Shape1(0)));
 }
 
-// Fill a CSR NDArray with zeros by updating the aux shape.
+/*! \brief Fill a CSR NDArray with zeros by updating the aux shape
+ *
+ * @tparam xpu - cpu or gpu
 
 Review comment:
   what?  cryptic comment :)
 



[GitHub] eric-haibin-lin commented on a change in pull request #7935: Zeroes CSR still needs a valid row_pointer array.

2017-09-18 Thread git
eric-haibin-lin commented on a change in pull request #7935: Zeroes CSR still 
needs a valid row_pointer array.
URL: https://github.com/apache/incubator-mxnet/pull/7935#discussion_r139547763
 
 

 ##
 File path: src/operator/tensor/init_op.h
 ##
 @@ -197,14 +197,21 @@ void FillZerosRspImpl(mshadow::Stream *s, NDArray 
*dst) {
   dst->set_aux_shape(rowsparse::kIdx, TShape(mshadow::Shape1(0)));
 }
 
-// Fill a CSR NDArray with zeros by updating the aux shape.
+/*! \brief Fill a CSR NDArray with zeros by updating the aux shape
+ *
+ * @tparam xpu - cpu or gpu
 
 Review comment:
   `tparam ` -> `param`
 



[GitHub] lichen11 commented on issue #6943: How to read rec files in R

2017-09-18 Thread git
lichen11 commented on issue #6943: How to read rec files in R
URL: 
https://github.com/apache/incubator-mxnet/issues/6943#issuecomment-330366761
 
 
   Hi, I have a follow-up question:
   
   dataiter <- mx.io.ImageRecordIter(
     path.imgrec = "./data/cifar/train.rec",
     path.imglist = "./data/cifar/train.lst",
     mean.img = "./data/cifar/cifar10_mean.bin",
     batch.size = 100,
     data.shape = c(28, 28, 3),
     rand.crop = TRUE,
     rand.mirror = TRUE
   )
   dataiter$reset()
   dataiter$iter.next()
   
   I assign variables:
   
   labels = dataiter$value()$label
   mydata = dataiter$value()$data
   
   dim(mydata) is 28 28 3 100, but if I do
   
   mydata[,,,1] or labels[1:10]
   
   then an error occurs:
   "Object of type 'externalptr' is not subsettable."
   
   How do I access the values from mydata and labels?
   
   Thanks!
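   [Editor's note: one possible approach, an untested sketch; it assumes the 
MXNet R package's `as.array` conversion for NDArray handles, which copies the 
data into a plain R array that is then subsettable:]
   
```r
# Copy the NDArray contents into ordinary R arrays before indexing
mydata_arr <- as.array(mydata)
labels_arr <- as.array(labels)
mydata_arr[, , , 1]   # now subsettable
labels_arr[1:10]
```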
 




[GitHub] larroy commented on issue #7852: Trouble installing MXNet on Raspberry Pi 3

2017-09-18 Thread git
larroy commented on issue #7852: Trouble installing MXNet on Raspberry Pi 3
URL: 
https://github.com/apache/incubator-mxnet/issues/7852#issuecomment-330361445
 
 
   You should install as root or use a virtualenv.
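   For example, one way to set up an isolated environment (a sketch; the 
environment path and the local-build install path are assumptions, adjust for 
your setup):

```shell
# Create and activate a virtualenv so pip installs without root
# (assumes python3 with the venv module is available).
python3 -m venv ~/mxnet-env
. ~/mxnet-env/bin/activate
pip install --upgrade pip
# then install mxnet into the environment, e.g. from your local build:
# pip install -e /path/to/incubator-mxnet/python
```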
 



[GitHub] ykim362 commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
ykim362 commented on a change in pull request #7931: MKL-DNN integration: 
request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139533086
 
 

 ##
 File path: prepare_mkl.sh
 ##
 @@ -115,7 +115,7 @@ if [ -z $MKLROOT ]; then
 fi
 
 # Check what MKL lib we have in MKLROOT
-if [ -z `find $MKLROOT -name libmklml_gnu.so -o -name libmklml.dylib -print 
-quit` ]; then
+if [ -z `find $MKLROOT -name libmklml_gnu.so -print -quit` ]; then
   USE_MKLML=0
 
 Review comment:
   @szha This new change from @ashokei would resolve the problem.
 



[GitHub] ykim362 commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
ykim362 commented on a change in pull request #7931: MKL-DNN integration: 
request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139530795
 
 

 ##
 File path: src/operator/mkl/mkl_conv-common-inl.h
 ##
 @@ -0,0 +1,82 @@
+/***
+* Copyright 2016-2017 Intel Corporation
+*
+* Licensed under the Apache License, Version 2.0 (the "License");
+* you may not use this file except in compliance with the License.
+* You may obtain a copy of the License at
+*
+* http://www.apache.org/licenses/LICENSE-2.0
+*
+* Unless required by applicable law or agreed to in writing, software
+* distributed under the License is distributed on an "AS IS" BASIS,
+* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+* See the License for the specific language governing permissions and
+* limitations under the License.
+*
+* \file mkl_convolution-inl.h
+* \brief
+* \author lingyan@intel.com
+* zhenlin@intel.com
+*
+***/
+#ifndef MXNET_OPERATOR_MKL_MKL_CONV_COMMON_INL_H_
+#define MXNET_OPERATOR_MKL_MKL_CONV_COMMON_INL_H_
+
+#include 
+#include 
+#include 
+#include 
+#include 
+#include 
+#include 
+#include 
+#include "mkl_util-inl.h"
+
+
+namespace mxnet {
+namespace op {
+
+template <typename xpu, typename DType>
+class MKLConvCommon {
+ public:
+  MKLConvCommon(): width_(0), height_(0), width_out_(0),
+height_out_(0), kernel_w_(0), kernel_h_(0),
+stride_w_(0), stride_h_(0), pad_w_(0), pad_h_(0)  {}
+  virtual ~MKLConvCommon() {}
+
+  void AddToModeAllocAndStoreBuffer(void *src, int blob_size, Storage::Handle 
*pws) {
+int blob_byte_size = blob_size * sizeof(DType);
+*pws = Storage::Get()->Alloc(blob_byte_size, Context::CPU());
+memcpy(pws->dptr, src, blob_byte_size);
+  }
+  void AddToModeAddAndReleaseBuffer(Storage::Handle *pws, void *dst_, int 
blob_size) {
+    DType *dst = reinterpret_cast<DType *>(dst_);
+    DType *src = reinterpret_cast<DType *>(pws->dptr);
+for (int i = 0; i < blob_size; i++) {
+  dst[i] += src[i];
+}
+if (pws->dptr)
+  Storage::Get()->Free(*pws);
+pws->dptr = NULL;
+  }
+
+ protected:
+  int width_,
+height_,
+width_out_,
+height_out_,
+kernel_w_,
+kernel_h_,
+stride_w_,
+stride_h_;
+  int group_,
+num_,
 
 Review comment:
   Yes, I will do!
 



[GitHub] ykim362 commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
ykim362 commented on a change in pull request #7931: MKL-DNN integration: 
request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139526050
 
 

 ##
 File path: src/operator/mkl/mkl_conv-common-inl.h
 ##
 @@ -0,0 +1,82 @@
+/***
+* Copyright 2016-2017 Intel Corporation
+*
+* Licensed under the Apache License, Version 2.0 (the "License");
+* you may not use this file except in compliance with the License.
+* You may obtain a copy of the License at
+*
+* http://www.apache.org/licenses/LICENSE-2.0
+*
+* Unless required by applicable law or agreed to in writing, software
+* distributed under the License is distributed on an "AS IS" BASIS,
+* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+* See the License for the specific language governing permissions and
+* limitations under the License.
+*
+* \file mkl_convolution-inl.h
+* \brief
+* \author lingyan@intel.com
+* zhenlin@intel.com
+*
+***/
+#ifndef MXNET_OPERATOR_MKL_MKL_CONV_COMMON_INL_H_
+#define MXNET_OPERATOR_MKL_MKL_CONV_COMMON_INL_H_
+
+#include 
+#include 
+#include 
+#include 
+#include 
+#include 
+#include 
+#include 
+#include "mkl_util-inl.h"
+
+
+namespace mxnet {
+namespace op {
+
+template <typename xpu, typename DType>
+class MKLConvCommon {
+ public:
+  MKLConvCommon(): width_(0), height_(0), width_out_(0),
+height_out_(0), kernel_w_(0), kernel_h_(0),
+stride_w_(0), stride_h_(0), pad_w_(0), pad_h_(0)  {}
+  virtual ~MKLConvCommon() {}
+
+  void AddToModeAllocAndStoreBuffer(void *src, int blob_size, Storage::Handle 
*pws) {
+int blob_byte_size = blob_size * sizeof(DType);
+*pws = Storage::Get()->Alloc(blob_byte_size, Context::CPU());
+memcpy(pws->dptr, src, blob_byte_size);
+  }
+  void AddToModeAddAndReleaseBuffer(Storage::Handle *pws, void *dst_, int 
blob_size) {
+    DType *dst = reinterpret_cast<DType *>(dst_);
+    DType *src = reinterpret_cast<DType *>(pws->dptr);
+for (int i = 0; i < blob_size; i++) {
+  dst[i] += src[i];
+}
+if (pws->dptr)
+  Storage::Get()->Free(*pws);
+pws->dptr = NULL;
+  }
+
+ protected:
+  int width_,
+height_,
+width_out_,
+height_out_,
+kernel_w_,
+kernel_h_,
+stride_w_,
+stride_h_;
+  int group_,
+num_,
 
 Review comment:
   This means batch size.
 



[GitHub] ykim362 commented on issue #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
ykim362 commented on issue #7931: MKL-DNN integration: request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#issuecomment-330340158
 
 
   @piiswrong Yes, the MKL-DNN build passes the 'test_operator.py' unit tests, 
the same as the MKLML build. So, we are doing several experiments.
   
   @szha I will look into it to replicate the MKLML case.
 



[GitHub] SumNeuron commented on issue #7900: Request: finish python gpu enabled guide for install

2017-09-18 Thread git
SumNeuron commented on issue #7900: Request: finish python gpu enabled guide 
for install
URL: 
https://github.com/apache/incubator-mxnet/issues/7900#issuecomment-330339788
 
 
   Trying to import ndarray to do the test from the python subdir in the local 
mxnet results in an error from ctypes...
 



[GitHub] SumNeuron commented on issue #7900: Request: finish python gpu enabled guide for install

2017-09-18 Thread git
SumNeuron commented on issue #7900: Request: finish python gpu enabled guide 
for install
URL: 
https://github.com/apache/incubator-mxnet/issues/7900#issuecomment-330336051
 
 
   which is the right version of python :P
 



[GitHub] szha commented on issue #7900: Request: finish python gpu enabled guide for install

2017-09-18 Thread git
szha commented on issue #7900: Request: finish python gpu enabled guide for 
install
URL: 
https://github.com/apache/incubator-mxnet/issues/7900#issuecomment-330335895
 
 
   Setting your path to use the right copy of Python should do it. The PATH 
setting should be in your .bashrc/.zshrc.
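   For instance (a sketch; `/usr/local/bin` is assumed to be where the desired 
Python lives, adjust to your system):

```shell
# Prepend the directory of the Python you want so it wins the PATH lookup;
# put this line in ~/.bashrc or ~/.zshrc to make it persistent.
export PATH="/usr/local/bin:$PATH"
command -v python3   # verify which interpreter now resolves first
```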
 



[GitHub] zhanghang1989 commented on issue #7570: Gluon InstanceNorm and ReflectancePadding

2017-09-18 Thread git
zhanghang1989 commented on issue #7570: Gluon InstanceNorm and 
ReflectancePadding
URL: https://github.com/apache/incubator-mxnet/pull/7570#issuecomment-330335943
 
 
   Got messed up ... Creating a new PR
 



[GitHub] zhanghang1989 closed pull request #7570: Gluon InstanceNorm and ReflectancePadding

2017-09-18 Thread git
zhanghang1989 closed pull request #7570: Gluon InstanceNorm and 
ReflectancePadding
URL: https://github.com/apache/incubator-mxnet/pull/7570
 
 
   
 



[GitHub] szha commented on issue #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
szha commented on issue #7931: MKL-DNN integration: request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#issuecomment-330334021
 
 
   @ykim362 could you add a job for testing mxnet with mkldnn in Jenkinsfile? 
Otherwise the tests don't reflect the change.
 



[GitHub] szha commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
szha commented on a change in pull request #7931: MKL-DNN integration: request 
for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139519618
 
 

 ##
 File path: src/operator/concat.cc
 ##
 @@ -50,6 +55,18 @@ Operator* CreateOp(ConcatParam param, int dtype) {
   if (enableMKLWarnGenerated())
 LOG(INFO) << MKLConcatOp::getName() << " Skip MKL 
optimization";
 #endif
+#if MXNET_USE_MKLDNN == 1
+  if ((1 == param.dim) && (param.num_args > 1)) {
+switch (dtype) {
+  case mshadow::kFloat32:
+return new MKLDNNConcatOp(param);
 
 Review comment:
   There are several other ops that have the same problem. I omitted them for 
brevity. Let me know once you've fixed the types.
 



[GitHub] piiswrong commented on issue #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
piiswrong commented on issue #7931: MKL-DNN integration: request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#issuecomment-330333421
 
 
   Do the consistency tests pass for the mkldnn operators? If they pass, it 
should converge.
 




[GitHub] SumNeuron commented on issue #7900: Request: finish python gpu enabled guide for install

2017-09-18 Thread git
SumNeuron commented on issue #7900: Request: finish python gpu enabled guide 
for install
URL: 
https://github.com/apache/incubator-mxnet/issues/7900#issuecomment-330333210
 
 
   Good catch.
   Dumb question: how do I take the master mxnet installed above and replace 
the other mxnet...
 



[GitHub] szha commented on issue #7900: Request: finish python gpu enabled guide for install

2017-09-18 Thread git
szha commented on issue #7900: Request: finish python gpu enabled guide for 
install
URL: 
https://github.com/apache/incubator-mxnet/issues/7900#issuecomment-330332132
 
 
   You seem to have two copies of python on your system, one in 
`/usr/local/lib/python3.6/` and the other in 
`/Users/sumner/Library/Python/3.6/`. The error you see says the mxnet you're 
using was not compiled with `USE_CUDA=1`, even though you clearly did compile 
with it, based on the config.mk you sent. I tested locally that `USE_CUDA=1` 
works, and it's shown in your build log too.
   
   To sum up, are you sure you're verifying the right copy of mxnet?
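One way to check which copy is actually in use is to inspect the interpreter path and the imported module's `__file__`. A small sketch (`json` stands in here so the snippet runs anywhere; for the real check, `import mxnet` and print `mxnet.__file__`):

```python
import sys
import json  # stand-in module; replace with `import mxnet` for the real check

interpreter = sys.executable  # which python binary is running
module_path = json.__file__   # which copy of the module got imported
print(interpreter)
print(module_path)
```
If `module_path` points into an unexpected directory (e.g. a user-local site-packages instead of the freshly built tree), you are verifying the wrong copy.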
 



[GitHub] ykim362 commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
ykim362 commented on a change in pull request #7931: MKL-DNN integration: 
request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139516810
 
 

 ##
 File path: src/operator/concat.cc
 ##
 @@ -50,6 +55,18 @@ Operator* CreateOp(ConcatParam param, int dtype) {
   if (enableMKLWarnGenerated())
 LOG(INFO) << MKLConcatOp::getName() << " Skip MKL 
optimization";
 #endif
+#if MXNET_USE_MKLDNN == 1
+  if ((1 == param.dim) && (param.num_args > 1)) {
+switch (dtype) {
+  case mshadow::kFloat32:
+return new MKLDNNConcatOp(param);
 
 Review comment:
   This is missing. I will fix.
 



[GitHub] ykim362 commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
ykim362 commented on a change in pull request #7931: MKL-DNN integration: 
request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139516729
 
 

 ##
 File path: src/operator/activation.cc
 ##
 @@ -29,12 +29,30 @@
 #include "./mkl/mkl_memory-inl.h"
 #include "./mkl/mkl_relu-inl.h"
 #endif  // MXNET_USE_MKL2017
+#if MXNET_USE_MKLDNN == 1
+#include 
+#include "./mkl/mkldnn_memory-inl.h"
+#include "./mkl/mkldnn_relu-inl.h"
+#endif  // MXNET_USE_MKLDNN
 
 namespace mxnet {
 namespace op {
 template<>
 Operator *CreateOp(ActivationParam param, int dtype, const TShape& 
dshape) {
   Operator *op = NULL;
+#if MXNET_USE_MKLDNN == 1
+  if (param.act_type == activation::kReLU) {
+switch (dtype) {
+case mshadow::kFloat32:
+case mshadow::kInt8:
+case mshadow::kInt32:
+case mshadow::kUint8:
+  return new MKLDNNReluOp();
 
 Review comment:
   Only one of MKLML (MKL2017) or MKLDNN should be used; they cannot be used 
together for now.
 



[GitHub] ykim362 commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
ykim362 commented on a change in pull request #7931: MKL-DNN integration: 
request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139516496
 
 

 ##
 File path: src/operator/activation.cc
 ##
 @@ -29,12 +29,30 @@
 #include "./mkl/mkl_memory-inl.h"
 #include "./mkl/mkl_relu-inl.h"
 #endif  // MXNET_USE_MKL2017
+#if MXNET_USE_MKLDNN == 1
+#include 
+#include "./mkl/mkldnn_memory-inl.h"
+#include "./mkl/mkldnn_relu-inl.h"
+#endif  // MXNET_USE_MKLDNN
 
 namespace mxnet {
 namespace op {
 template<>
 Operator *CreateOp(ActivationParam param, int dtype, const TShape& 
dshape) {
   Operator *op = NULL;
+#if MXNET_USE_MKLDNN == 1
+  if (param.act_type == activation::kReLU) {
+switch (dtype) {
+case mshadow::kFloat32:
+case mshadow::kInt8:
+case mshadow::kInt32:
+case mshadow::kUint8:
+  return new MKLDNNReluOp();
 
 Review comment:
   This is missing. I will add double for MKLDNN.
 



[GitHub] sbodenstein commented on issue #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
sbodenstein commented on issue #7931: MKL-DNN integration: request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#issuecomment-330329680
 
 
   @szha: I asked the same thing, this was the response: 
https://github.com/01org/mkl-dnn/issues/10#issuecomment-323730811
 



[GitHub] szha commented on a change in pull request #7903: Refactor AdaGrad optimizer to support sparse tensors

2017-09-18 Thread git
szha commented on a change in pull request #7903: Refactor AdaGrad optimizer to 
support sparse tensors
URL: https://github.com/apache/incubator-mxnet/pull/7903#discussion_r139514542
 
 

 ##
 File path: python/mxnet/optimizer.py
 ##
 @@ -665,26 +667,46 @@ class AdaGrad(Optimizer):
 eps: float, optional
 Small value to avoid division by 0.
 """
-def __init__(self, eps=1e-7, **kwargs):
+def __init__(self, eps=1e-7, stype='default', **kwargs):
 super(AdaGrad, self).__init__(**kwargs)
 self.float_stable_eps = eps
+self.stype = stype
 
 def create_state(self, index, weight):
-return zeros(weight.shape, weight.context)  # history
+return zeros(weight.shape, weight.context, stype=self.stype)  # history
 
 def update(self, index, weight, grad, state):
+#print("ENTER ADAGRAD UPDATE")
 assert(isinstance(weight, NDArray))
 assert(isinstance(grad, NDArray))
 self._update_count(index)
 lr = self._get_lr(index)
 wd = self._get_wd(index)
-
+save_grad_stype = grad.stype
 grad = grad * self.rescale_grad
 if self.clip_gradient is not None:
 grad = clip(grad, -self.clip_gradient, self.clip_gradient)
 history = state
-history[:] += (grad * grad)
-weight[:] += -lr * (grad / sqrt(history + self.float_stable_eps) + wd 
* weight)
+save_history_stype = history.stype
+
+is_sparse = True if weight.stype != 'default' or grad.stype != 
'default' else False
 
 Review comment:
   `x = True if cond else False` <=> `x = cond`
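szha's simplification, shown on the stypes from the diff (the variable values are illustrative):

```python
weight_stype = 'row_sparse'
grad_stype = 'default'

# verbose form from the diff under review
is_sparse_verbose = True if weight_stype != 'default' or grad_stype != 'default' else False

# suggested simplification: the condition already evaluates to a bool
is_sparse = weight_stype != 'default' or grad_stype != 'default'

assert is_sparse_verbose == is_sparse == True
```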
 



[GitHub] szha commented on issue #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
szha commented on issue #7931: MKL-DNN integration: request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#issuecomment-330324846
 
 
   General comment: I didn't see Mac being supported on the mkl-dnn page. It only 
says the software was validated on RHEL7. What does this change imply for Mac 
users?
 



[GitHub] szha commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
szha commented on a change in pull request #7931: MKL-DNN integration: request 
for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139511137
 
 

 ##
 File path: src/operator/mkl/mkl_memory.h
 ##
 @@ -112,6 +117,10 @@ struct MKLMemHolder {
 head_(HEAD_AT_CPU), prv_descriptor_(nullptr),
 b_disable_prv_2_cpu(false), b_eager_mode(false) {}
 };
+
+// bool compare_mkl_memholder(std::shared_ptr holder_a,
+//   std::shared_ptr holder_b);
 
 Review comment:
   Remove commented code.
 



[GitHub] szha commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
szha commented on a change in pull request #7931: MKL-DNN integration: request 
for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139510894
 
 

 ##
 File path: src/operator/mkl/mkl_conv-common-inl.h
 ##
 @@ -0,0 +1,82 @@
+/***
+* Copyright 2016-2017 Intel Corporation
+*
+* Licensed under the Apache License, Version 2.0 (the "License");
+* you may not use this file except in compliance with the License.
+* You may obtain a copy of the License at
+*
+* http://www.apache.org/licenses/LICENSE-2.0
+*
+* Unless required by applicable law or agreed to in writing, software
+* distributed under the License is distributed on an "AS IS" BASIS,
+* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+* See the License for the specific language governing permissions and
+* limitations under the License.
+*
+* \file mkl_convolution-inl.h
+* \brief
+* \author lingyan@intel.com
+* zhenlin@intel.com
+*
+***/
+#ifndef MXNET_OPERATOR_MKL_MKL_CONV_COMMON_INL_H_
+#define MXNET_OPERATOR_MKL_MKL_CONV_COMMON_INL_H_
+
+#include 
+#include 
+#include 
+#include 
+#include 
+#include 
+#include 
+#include 
+#include "mkl_util-inl.h"
+
+
+namespace mxnet {
+namespace op {
+
+template <typename DType>
+class MKLConvCommon {
+ public:
+  MKLConvCommon(): width_(0), height_(0), width_out_(0),
+height_out_(0), kernel_w_(0), kernel_h_(0),
+stride_w_(0), stride_h_(0), pad_w_(0), pad_h_(0)  {}
+  virtual ~MKLConvCommon() {}
+
+  void AddToModeAllocAndStoreBuffer(void *src, int blob_size, Storage::Handle 
*pws) {
+int blob_byte_size = blob_size * sizeof(DType);
+*pws = Storage::Get()->Alloc(blob_byte_size, Context::CPU());
+memcpy(pws->dptr, src, blob_byte_size);
+  }
+  void AddToModeAddAndReleaseBuffer(Storage::Handle *pws, void *dst_, int 
blob_size) {
+DType *dst = reinterpret_cast<DType *>(dst_);
+DType *src = reinterpret_cast<DType *>(pws->dptr);
+for (int i = 0; i < blob_size; i++) {
+  dst[i] += src[i];
+}
+if (pws->dptr)
+  Storage::Get()->Free(*pws);
+pws->dptr = NULL;
+  }
+
+ protected:
+  int width_,
+height_,
+width_out_,
+height_out_,
+kernel_w_,
+kernel_h_,
+stride_w_,
+stride_h_;
+  int group_,
+num_,
 
 Review comment:
   What's `num_`?
 



[GitHub] szha commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
szha commented on a change in pull request #7931: MKL-DNN integration: request 
for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139510706
 
 

 ##
 File path: src/operator/lrn.cc
 ##
 @@ -40,6 +46,9 @@ Operator* CreateOp(LRNParam param, int dtype) {
 #if MXNET_USE_MKL2017 == 1
   return new MKLLRNOp(param);
 #endif
+#if MXNET_USE_MKLDNN == 1
+  return new MKLDNNLRNOp(param);
 
 Review comment:
   If both MKL2017 and MKLDNN are on, this would become dead code.
 



[GitHub] szha commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
szha commented on a change in pull request #7931: MKL-DNN integration: request 
for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139510469
 
 

 ##
 File path: src/operator/concat.cc
 ##
 @@ -50,6 +55,18 @@ Operator* CreateOp(ConcatParam param, int dtype) {
   if (enableMKLWarnGenerated())
 LOG(INFO) << MKLConcatOp::getName() << " Skip MKL 
optimization";
 #endif
+#if MXNET_USE_MKLDNN == 1
+  if ((1 == param.dim) && (param.num_args > 1)) {
+switch (dtype) {
+  case mshadow::kFloat32:
+return new MKLDNNConcatOp(param);
 
 Review comment:
   Similar comment as above.
 



[GitHub] szha commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
szha commented on a change in pull request #7931: MKL-DNN integration: request 
for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139509832
 
 

 ##
 File path: prepare_mkl.sh
 ##
 @@ -115,7 +115,7 @@ if [ -z $MKLROOT ]; then
 fi
 
 # Check what MKL lib we have in MKLROOT
-if [ -z `find $MKLROOT -name libmklml_gnu.so -o -name libmklml.dylib -print 
-quit` ]; then
+if [ -z `find $MKLROOT -name libmklml_gnu.so -print -quit` ]; then
   USE_MKLML=0
 
 Review comment:
   Not at the price of breaking Mac, please.
 



[GitHub] szha commented on issue #7570: Gluon InstanceNorm and ReflectancePadding

2017-09-18 Thread git
szha commented on issue #7570: Gluon InstanceNorm and ReflectancePadding
URL: https://github.com/apache/incubator-mxnet/pull/7570#issuecomment-330321800
 
 
   I suppose this means you passed. Congrats :)
   
   Could you do a rebase onto the latest master to let the tests run?
 



[GitHub] szha commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
szha commented on a change in pull request #7931: MKL-DNN integration: request 
for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139508448
 
 

 ##
 File path: src/operator/activation.cc
 ##
 @@ -29,12 +29,30 @@
 #include "./mkl/mkl_memory-inl.h"
 #include "./mkl/mkl_relu-inl.h"
 #endif  // MXNET_USE_MKL2017
+#if MXNET_USE_MKLDNN == 1
+#include 
+#include "./mkl/mkldnn_memory-inl.h"
+#include "./mkl/mkldnn_relu-inl.h"
+#endif  // MXNET_USE_MKLDNN
 
 namespace mxnet {
 namespace op {
 template<>
 Operator *CreateOp(ActivationParam param, int dtype, const TShape& 
dshape) {
   Operator *op = NULL;
+#if MXNET_USE_MKLDNN == 1
+  if (param.act_type == activation::kReLU) {
+switch (dtype) {
+case mshadow::kFloat32:
+case mshadow::kInt8:
+case mshadow::kInt32:
+case mshadow::kUint8:
+  return new MKLDNNReluOp();
 
 Review comment:
   Should it be allowed to turn on both MKLDNN and MKL2017? MKLRelu below 
supports double which isn't supported by mkldnn here.
 



[GitHub] ykim362 commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
ykim362 commented on a change in pull request #7931: MKL-DNN integration: 
request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139507607
 
 

 ##
 File path: prepare_mkl.sh
 ##
 @@ -115,7 +115,7 @@ if [ -z $MKLROOT ]; then
 fi
 
 # Check what MKL lib we have in MKLROOT
-if [ -z `find $MKLROOT -name libmklml_gnu.so -o -name libmklml.dylib -print 
-quit` ]; then
+if [ -z `find $MKLROOT -name libmklml_gnu.so -print -quit` ]; then
   USE_MKLML=0
 
 Review comment:
   This change sets USE_MKLML to 0 on Linux machines.
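The Linux breakage follows from `find`'s operator precedence: the implicit AND binds tighter than `-o`, so in the original test `-print -quit` attaches only to the `.dylib` name, and a Linux machine with only `libmklml_gnu.so` prints nothing. A runnable sketch (the temp-dir setup is illustrative, not from the PR) showing that grouping the name tests keeps both the Linux `.so` and the Mac `.dylib` checks working:

```bash
demo=$(mktemp -d)
touch "$demo/libmklml_gnu.so"

# ungrouped: the actions bind to the dylib test only, so nothing prints
# even though the .so exists
find "$demo" -name libmklml_gnu.so -o -name libmklml.dylib -print -quit

# grouped: matches either name, so the .so path prints
find "$demo" \( -name libmklml_gnu.so -o -name libmklml.dylib \) -print -quit

rm -rf "$demo"
```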
 



[GitHub] ykim362 commented on issue #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
ykim362 commented on issue #7931: MKL-DNN integration: request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#issuecomment-330319229
 
 
   @piiswrong MKL-DNN doesn't converge with ResNet, though it converges with 
MNIST. I am looking into it to find the root cause. MKLML works fine for both 
ResNet and MNIST.
 



[GitHub] szha commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
szha commented on a change in pull request #7931: MKL-DNN integration: request 
for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139506527
 
 

 ##
 File path: prepare_mkl.sh
 ##
 @@ -115,7 +115,7 @@ if [ -z $MKLROOT ]; then
 fi
 
 # Check what MKL lib we have in MKLROOT
-if [ -z `find $MKLROOT -name libmklml_gnu.so -o -name libmklml.dylib -print 
-quit` ]; then
+if [ -z `find $MKLROOT -name libmklml_gnu.so -print -quit` ]; then
   USE_MKLML=0
 
 Review comment:
   Also, why is libmklml_gnu listed here while in most cases the linked library 
is libmklml_intel?
 



[GitHub] szha commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
szha commented on a change in pull request #7931: MKL-DNN integration: request 
for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139506331
 
 

 ##
 File path: prepare_mkl.sh
 ##
 @@ -115,7 +115,7 @@ if [ -z $MKLROOT ]; then
 fi
 
 # Check what MKL lib we have in MKLROOT
-if [ -z `find $MKLROOT -name libmklml_gnu.so -o -name libmklml.dylib -print 
-quit` ]; then
+if [ -z `find $MKLROOT -name libmklml_gnu.so -print -quit` ]; then
   USE_MKLML=0
 
 Review comment:
   I enabled mac mkl a while back. What's wrong with this?
 



[GitHub] gautamkmr opened a new pull request #7937: Increase the tolerance

2017-09-18 Thread git
gautamkmr opened a new pull request #7937: Increase the tolerance
URL: https://github.com/apache/incubator-mxnet/pull/7937
 
 
   @piiswrong @cjolivier01 
 



[GitHub] gautamkmr commented on issue #7926: Increase the tolerance

2017-09-18 Thread git
gautamkmr commented on issue #7926: Increase the tolerance
URL: https://github.com/apache/incubator-mxnet/pull/7926#issuecomment-330315224
 
 
   Whoops, wrong test fix.
 



[GitHub] gautamkmr commented on issue #7926: Increase the tolerance

2017-09-18 Thread git
gautamkmr commented on issue #7926: Increase the tolerance
URL: https://github.com/apache/incubator-mxnet/pull/7926#issuecomment-330313865
 
 
   The test has been failing continuously; here is another instance from today: 
   
https://builds.apache.org/blue/organizations/jenkins/incubator-mxnet/detail/master/395/pipeline
 



[GitHub] gautamkmr commented on issue #7926: Increase the tolerance

2017-09-18 Thread git
gautamkmr commented on issue #7926: Increase the tolerance
URL: https://github.com/apache/incubator-mxnet/pull/7926#issuecomment-330313317
 
 
   @cjolivier01 I have been seeing this failure for the last two weeks; I hadn't 
noticed it before.
 



[GitHub] piiswrong commented on issue #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
piiswrong commented on issue #7931: MKL-DNN integration: request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#issuecomment-330308578
 
 
   What's the issue with ResNet convergence?
 



[GitHub] zhanghang1989 commented on issue #7570: Gluon InstanceNorm and ReflectancePadding

2017-09-18 Thread git
zhanghang1989 commented on issue #7570: Gluon InstanceNorm and 
ReflectancePadding
URL: https://github.com/apache/incubator-mxnet/pull/7570#issuecomment-330307254
 
 
   Hey folks, I was busy with my thesis defense last week. Any further 
feedback on this PR?
 



[GitHub] piiswrong commented on a change in pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
piiswrong commented on a change in pull request #7931: MKL-DNN integration: 
request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931#discussion_r139495738
 
 

 ##
 File path: src/operator/mkl/mkldnn_memory.cc
 ##
 @@ -0,0 +1,285 @@
+/***
+* Copyright 2016-2017 Intel Corporation
 
 Review comment:
   I'm not sure if Apache allows this.
   @smarthi 
 



[GitHub] szha opened a new pull request #7936: simplify CTC forward after namespace reorganize

2017-09-18 Thread git
szha opened a new pull request #7936: simplify CTC forward after namespace 
reorganize
URL: https://github.com/apache/incubator-mxnet/pull/7936
 
 
   
 



[GitHub] cjolivier01 commented on a change in pull request #7903: Refactor AdaGrad optimizer to support sparse tensors

2017-09-18 Thread git
cjolivier01 commented on a change in pull request #7903: Refactor AdaGrad 
optimizer to support sparse tensors
URL: https://github.com/apache/incubator-mxnet/pull/7903#discussion_r139483314
 
 

 ##
 File path: python/mxnet/optimizer.py
 ##
 @@ -665,26 +667,46 @@ class AdaGrad(Optimizer):
 eps: float, optional
 Small value to avoid division by 0.
 """
-def __init__(self, eps=1e-7, **kwargs):
+def __init__(self, eps=1e-7, stype='default', **kwargs):
 
 Review comment:
   done
 



[GitHub] cjolivier01 commented on a change in pull request #7903: Refactor AdaGrad optimizer to support sparse tensors

2017-09-18 Thread git
cjolivier01 commented on a change in pull request #7903: Refactor AdaGrad 
optimizer to support sparse tensors
URL: https://github.com/apache/incubator-mxnet/pull/7903#discussion_r139483293
 
 

 ##
 File path: python/mxnet/optimizer.py
 ##
 @@ -665,26 +667,46 @@ class AdaGrad(Optimizer):
 eps: float, optional
 Small value to avoid division by 0.
 """
-def __init__(self, eps=1e-7, **kwargs):
+def __init__(self, eps=1e-7, stype='default', **kwargs):
 super(AdaGrad, self).__init__(**kwargs)
 self.float_stable_eps = eps
+self.stype = stype
 
 def create_state(self, index, weight):
-return zeros(weight.shape, weight.context)  # history
+return zeros(weight.shape, weight.context, stype=self.stype)  # history
 
 Review comment:
   done
 



[GitHub] FrancisTse8 commented on issue #7852: Trouble installing MXNet on Raspberry Pi 3

2017-09-18 Thread git
FrancisTse8 commented on issue #7852: Trouble installing MXNet on Raspberry Pi 3
URL: 
https://github.com/apache/incubator-mxnet/issues/7852#issuecomment-330285714
 
 
   When I ran the makefile, I was able to generate a libmxnet.so file. 
Is this the .so file you are referring to?
   
   Since I seem to have the libmxnet.so file on the Raspberry Pi running 
Raspbian Stretch, I tried to continue following the instructions to install the 
Python bindings. However, I got the following OSError:
   ```bash
   pi@raspberrypi:~/mxnet $ cd python
   pi@raspberrypi:~/mxnet/python $ pip install --upgrade pip
   Collecting pip
 Downloading pip-9.0.1-py2.py3-none-any.whl (1.3MB)
   100% || 1.3MB 193kB/s 
   Installing collected packages: pip
   Successfully installed pip-9.0.1
   pi@raspberrypi:~/mxnet/python $ pip install -e .
   Obtaining file:///home/pi/mxnet/python
   Collecting graphviz (from mxnet==0.11.1)
 Downloading graphviz-0.8-py2.py3-none-any.whl
   Requirement already satisfied: numpy in /usr/lib/python2.7/dist-packages 
(from mxnet==0.11.1)
   Requirement already satisfied: requests in /usr/lib/python2.7/dist-packages 
(from mxnet==0.11.1)
   Installing collected packages: graphviz, mxnet
   Exception:
   Traceback (most recent call last):
 File "/home/pi/.local/lib/python2.7/site-packages/pip/basecommand.py", 
line 215, in main
   status = self.run(options, args)
 File 
"/home/pi/.local/lib/python2.7/site-packages/pip/commands/install.py", line 
342, in run
   prefix=options.prefix_path,
 File "/home/pi/.local/lib/python2.7/site-packages/pip/req/req_set.py", 
line 784, in install
   **kwargs
 File "/home/pi/.local/lib/python2.7/site-packages/pip/req/req_install.py", 
line 851, in install
   self.move_wheel_files(self.source_dir, root=root, prefix=prefix)
 File "/home/pi/.local/lib/python2.7/site-packages/pip/req/req_install.py", 
line 1064, in move_wheel_files
   isolated=self.isolated,
 File "/home/pi/.local/lib/python2.7/site-packages/pip/wheel.py", line 345, 
in move_wheel_files
   clobber(source, lib_dir, True)
 File "/home/pi/.local/lib/python2.7/site-packages/pip/wheel.py", line 316, 
in clobber
   ensure_dir(destdir)
 File "/home/pi/.local/lib/python2.7/site-packages/pip/utils/__init__.py", 
line 83, in ensure_dir
   os.makedirs(path)
 File "/usr/lib/python2.7/os.py", line 157, in makedirs
   mkdir(name, mode)
   OSError: [Errno 13] Permission denied: 
'/usr/local/lib/python2.7/dist-packages/graphviz'
   ```
   Maybe I need something else from the make process besides generating the 
libmxnet.so file?
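The `OSError: [Errno 13]` above comes from pip trying to write into the system `/usr/local/lib/python2.7/dist-packages`, which needs root. One common workaround (an assumption on my part, not something suggested in this thread) is a per-user install, `pip install --user -e .`, which needs no sudo. You can inspect where such installs would go:

```bash
# Per-user installs land in the user site directory, which is writable
# without root. Print it (for mxnet you would then run: pip install --user -e .):
python3 -m site --user-site
```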
   
 



[GitHub] cjolivier01 commented on issue #7920: csr binary operator bug

2017-09-18 Thread git
cjolivier01 commented on issue #7920: csr binary operator bug 
URL: 
https://github.com/apache/incubator-mxnet/issues/7920#issuecomment-330284076
 
 
   **Looks like the problem was introduced here**  
   
https://github.com/apache/incubator-mxnet/commit/0b1363116c84dcefa751a925749b2da04c3f2614
   
   **In function** 
   void FillZerosCsrImpl(mshadow::Stream *s, NDArray *dst)
   
   **Why?**
   A CSR matrix, even if all zeros, must have m + 1 entries in its row pointer 
(csr::IndPtr) array, where m is the number of rows.  The current implementation 
leaves all aux arrays empty.
   
   **PR of fix**
   https://github.com/apache/incubator-mxnet/pull/7935
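The invariant described above can be sketched in plain Python (a toy CSR layout to illustrate why the row-pointer array needs m + 1 entries; this is not MXNet's implementation):

```python
def csr_zeros(m, n):
    """All-zero m-by-n matrix in CSR form.

    Even with no stored values, the row-pointer array must have m + 1
    entries, so that row i's values are data[indptr[i]:indptr[i+1]].
    """
    data = []               # no nonzero values
    indices = []            # no column indices
    indptr = [0] * (m + 1)  # m + 1 row pointers, all zero
    return data, indices, indptr

data, indices, indptr = csr_zeros(4, 5)
assert len(indptr) == 4 + 1
assert all(indptr[i] == indptr[i + 1] for i in range(4))  # every row is empty
```
Leaving `indptr` empty, as the buggy `FillZerosCsrImpl` did, makes the slice `data[indptr[i]:indptr[i+1]]` undefined for every row, which is why downstream binary operators broke.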
   
 



[GitHub] cjolivier01 opened a new pull request #7935: Zeroes CSR still needs a valid row_pointer array.

2017-09-18 Thread git
cjolivier01 opened a new pull request #7935: Zeroes CSR still needs a valid 
row_pointer array.
URL: https://github.com/apache/incubator-mxnet/pull/7935
 
 
   Fix for:  https://github.com/apache/incubator-mxnet/issues/7920
   
   **Looks like the problem was introduced here**  
   
https://github.com/apache/incubator-mxnet/commit/0b1363116c84dcefa751a925749b2da04c3f2614
   
   **In function** 
   void FillZerosCsrImpl(mshadow::Stream *s, NDArray *dst)
   
   **Why?**
   A CSR matrix, even if all zeros, must have m + 1 entries in its row pointer 
(csr::IndPtr) array, where m is the number of rows.  The current implementation 
leaves all aux arrays empty.
   
   While I was here, I added a couple more assertions and reduced memory 
allocation for situations where lhs and rhs are the same array.
   
   Added unit test.
 



[GitHub] x10000year opened a new issue #7934: Bug of group2ctx? Wrong device placement?

2017-09-18 Thread git
x10000year opened a new issue #7934: Bug of group2ctx? Wrong device placement?
URL: https://github.com/apache/incubator-mxnet/issues/7934
 
 
   For the following code:
   
   x = mx.symbol.MyOp()
   exe = x.bind(mx.gpu(), {}, group2ctx={"a": mx.cpu(), "b": mx.gpu()})
   exe.forward()
   
   where MyOp is a custom operator written in C++, which prints "CPU" if it is 
run in the CPU context, or "GPU" if run in the GPU context.
   
   I don't use mx.AttrScope to specify the group of x, so the default context 
should be used for x. However, the above code prints "CPU", which means that x 
is run in the CPU context. Why?
   
   If I set group2ctx={"b": mx.cpu(), "a": mx.gpu()}, then it prints "GPU".
   
   Basically, I found that the group that has the alphabetically smaller name 
is chosen for x. Very strange.
   
   Is this a bug?
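For what it's worth, the observed behavior is consistent with a placement routine that falls back to the alphabetically first group instead of the bind context. A hypothetical pure-Python sketch of that buggy fallback (illustrative only, not the actual MXNet code):

```python
def place_op(group2ctx, ctx_group=None, default_ctx="gpu"):
    # Hypothetical buggy fallback: an op without a ctx_group attribute
    # should get default_ctx, but this picks the first group in sorted
    # order, reproducing the "alphabetically smaller name wins" symptom.
    if ctx_group is None:
        return group2ctx[sorted(group2ctx)[0]]
    return group2ctx[ctx_group]

groups = {"a": "cpu", "b": "gpu"}
print(place_op(groups))  # "cpu", even though default_ctx is "gpu"
```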
 



[GitHub] reminisce commented on a change in pull request #7911: More sparse related docs

2017-09-18 Thread git
reminisce commented on a change in pull request #7911: More sparse related docs
URL: https://github.com/apache/incubator-mxnet/pull/7911#discussion_r139467525
 
 

 ##
 File path: python/mxnet/ndarray/sparse.py
 ##
 @@ -88,6 +88,8 @@ def _new_alloc_handle(stype, shape, ctx, delay_alloc, dtype, 
aux_types, aux_shap
 A new empty ndarray handle
 """
     hdl = NDArrayHandle()
+    for aux_t in aux_types:
+        assert(np.dtype(aux_t) == np.dtype("int64")), "only int64 is supported for aux types"
 
 Review comment:
   Is using `raise` more appropriate here? In production, users may not want to 
terminate their program.
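A minimal sketch of the `raise`-based alternative (helper name illustrative): unlike `assert`, it still fires under `python -O` and raises a catchable exception instead of aborting the program.

```python
import numpy as np

def check_aux_types(aux_types):
    # Validate aux types with an exception instead of assert, so callers
    # can catch it and the check survives python -O (which strips asserts).
    for aux_t in aux_types:
        if np.dtype(aux_t) != np.dtype("int64"):
            raise ValueError("only int64 is supported for aux types")

check_aux_types([np.int64])        # passes silently
try:
    check_aux_types([np.float32])  # raises ValueError
except ValueError as e:
    print(e)
```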
 



[GitHub] cjolivier01 commented on a change in pull request #7903: Refactor AdaGrad optimizer to support sparse tensors

2017-09-18 Thread git
cjolivier01 commented on a change in pull request #7903: Refactor AdaGrad 
optimizer to support sparse tensors
URL: https://github.com/apache/incubator-mxnet/pull/7903#discussion_r139461575
 
 

 ##
 File path: python/mxnet/optimizer.py
 ##
 @@ -665,26 +667,46 @@ class AdaGrad(Optimizer):
     eps: float, optional
         Small value to avoid division by 0.
     """
-    def __init__(self, eps=1e-7, **kwargs):
+    def __init__(self, eps=1e-7, stype='default', **kwargs):
         super(AdaGrad, self).__init__(**kwargs)
         self.float_stable_eps = eps
+        self.stype = stype
 
     def create_state(self, index, weight):
-        return zeros(weight.shape, weight.context)  # history
+        return zeros(weight.shape, weight.context, stype=self.stype)  # history
 
     def update(self, index, weight, grad, state):
+        #print("ENTER ADAGRAD UPDATE")
         assert(isinstance(weight, NDArray))
         assert(isinstance(grad, NDArray))
         self._update_count(index)
         lr = self._get_lr(index)
         wd = self._get_wd(index)
-
+        save_grad_stype = grad.stype
         grad = grad * self.rescale_grad
         if self.clip_gradient is not None:
             grad = clip(grad, -self.clip_gradient, self.clip_gradient)
         history = state
-        history[:] += (grad * grad)
-        weight[:] += -lr * (grad / sqrt(history + self.float_stable_eps) + wd * weight)
+        save_history_stype = history.stype
+
+        is_sparse = True if weight.stype != 'default' or grad.stype != 'default' else False
+
+        if is_sparse:
+            history[:] = op.elemwise_add(history, op.square(grad))
+            assert history.stype == save_history_stype
+            srt = op.sqrt(history)
 
 Review comment:
   ok, will scatter_plus them
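For reference, the dense update rule being refactored above can be sketched in NumPy (a sketch of the math only, not the sparse code path under review):

```python
import numpy as np

def adagrad_step(weight, grad, history, lr=0.01, wd=0.0, eps=1e-7):
    # history accumulates squared gradients; the step is scaled per element
    # by 1/sqrt(history + eps), with optional weight decay wd.
    history += grad * grad
    weight -= lr * (grad / np.sqrt(history + eps) + wd * weight)
    return weight, history

w = np.ones(3)
g = np.full(3, 2.0)
h = np.zeros(3)
w, h = adagrad_step(w, g, h, lr=0.1)  # h becomes grad**2 = 4
```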
 



[GitHub] cjolivier01 commented on a change in pull request #7903: Refactor AdaGrad optimizer to support sparse tensors

2017-09-18 Thread git
cjolivier01 commented on a change in pull request #7903: Refactor AdaGrad 
optimizer to support sparse tensors
URL: https://github.com/apache/incubator-mxnet/pull/7903#discussion_r139461575
 
 

 ##
 File path: python/mxnet/optimizer.py
 ##
 @@ -665,26 +667,46 @@ class AdaGrad(Optimizer):
     eps: float, optional
         Small value to avoid division by 0.
     """
-    def __init__(self, eps=1e-7, **kwargs):
+    def __init__(self, eps=1e-7, stype='default', **kwargs):
         super(AdaGrad, self).__init__(**kwargs)
         self.float_stable_eps = eps
+        self.stype = stype
 
     def create_state(self, index, weight):
-        return zeros(weight.shape, weight.context)  # history
+        return zeros(weight.shape, weight.context, stype=self.stype)  # history
 
     def update(self, index, weight, grad, state):
+        #print("ENTER ADAGRAD UPDATE")
         assert(isinstance(weight, NDArray))
         assert(isinstance(grad, NDArray))
         self._update_count(index)
         lr = self._get_lr(index)
         wd = self._get_wd(index)
-
+        save_grad_stype = grad.stype
         grad = grad * self.rescale_grad
         if self.clip_gradient is not None:
             grad = clip(grad, -self.clip_gradient, self.clip_gradient)
         history = state
-        history[:] += (grad * grad)
-        weight[:] += -lr * (grad / sqrt(history + self.float_stable_eps) + wd * weight)
+        save_history_stype = history.stype
+
+        is_sparse = True if weight.stype != 'default' or grad.stype != 'default' else False
+
+        if is_sparse:
+            history[:] = op.elemwise_add(history, op.square(grad))
+            assert history.stype == save_history_stype
+            srt = op.sqrt(history)
 
 Review comment:
   ok
 



[GitHub] loweew opened a new issue #7933: CoreML conversion with finetuned model

2017-09-18 Thread git
loweew opened a new issue #7933: CoreML conversion with finetuned model 
URL: https://github.com/apache/incubator-mxnet/issues/7933
 
 
   I have successfully converted the squeezenet and resnet50 models from the 
examples to CoreML using mxnet-to-coreml. However, when converting a model 
after fine-tuning using my own data, the predictions are seemingly random. The 
model is fine-tuned using finetune.py from the examples. The model performs 
well prior to conversion to CoreML. After conversion to CoreML, the model 
predicts the same probabilities regardless of the image. The pre-trained model 
I'm using for fine-tuning is the imagenet11k-places resnet50 model.
   
   I've tried:
   
   1. subtracting channel biases as is performed during fine-tuning. 
(--pre-processing-arguments='{"image_input_names":"data","red_bias":123.68,"blue_bias":103.939,"green_bias":116.779}')
   
   2. subtracting channel biases and scaling 1/255  
(--pre-processing-arguments='{"image_input_names":"data","red_bias":123.68,"blue_bias":103.939,"green_bias":116.779,
 "image_scale":0.00392156862}')
   
   3. subtracting scaled channel biases because I was unsure about when coreml 
performed the scaling  
(--pre-processing-arguments='{"image_input_names":["data"],"red_bias":0.485019,"blue_bias":0.407603,"green_bias":0.457956,
 "image_scale":0.00392156862}')
   
   4. not scaling or biasing channels
   
   Has anyone successfully converted a model after fine-tuning using a 
different data set? Any ideas would be greatly appreciated. I'm fairly certain 
there's something simple that I'm overlooking... 
   
   I've also examined the converted model using Model_pb2 to make sure the 
preprocessing flags are being respected, and they appear to be:
   
   print(model.neuralNetworkClassifier.preprocessing)
   
   [featureName: "data"
   scaler {
 channelScale: 0.0038006407
 blueBias: 103.939
 greenBias: 116.779
 redBias: 123.68
   }
   ]
   
   here's the entire cmd line: 
   
   mxnet_coreml_converter.py --model-prefix='imagenet11k-places-resnet-50' 
--epoch=47 --input-shape='{"data":"3,224,224"}' --mode=classifier 
--class-labels myclass_labels.txt 
--output-file="mxnetimagenet11kplaces50resnet.mlmodel"  
--pre-processing-arguments='{"image_input_names":"data","red_bias":123.68,"blue_bias":103.939,"green_bias":116.779,
 "image_scale":0.00392156862}'
   
   
 



[GitHub] jiarenyf commented on issue #7925: Resize image to fixed size

2017-09-18 Thread git
jiarenyf commented on issue #7925: Resize image to fixed size
URL: 
https://github.com/apache/incubator-mxnet/issues/7925#issuecomment-330253920
 
 
   Maybe this can help you: 
https://github.com/jiarenyf/mxWrapper/blob/02fd9b0fcd37f7224648efad651a6f83a1f06d78/mxHelper/mxData.py#L93
 



[GitHub] cjolivier01 commented on issue #7920: csr binary operator bug

2017-09-18 Thread git
cjolivier01 commented on issue #7920: csr binary operator bug 
URL: 
https://github.com/apache/incubator-mxnet/issues/7920#issuecomment-330251762
 
 
   Taking a look...
 



[GitHub] jiarenyf commented on issue #7710: CTC ERROR WITH CUDA ILLEGAL MEMORY ACCESS ERROR

2017-09-18 Thread git
jiarenyf commented on issue #7710: CTC ERROR WITH CUDA ILLEGAL MEMORY ACCESS 
ERROR
URL: 
https://github.com/apache/incubator-mxnet/issues/7710#issuecomment-330248910
 
 
   @szha 
   I found another reason for that problem.
   
   As in 
https://github.com/jiarenyf/mxWrapper/blob/02fd9b0fcd37f7224648efad651a6f83a1f06d78/mxHelper/mxData.py#L158,
 the labels are initialized to empty, while in 
https://github.com/jiarenyf/mxWrapper/blob/02fd9b0fcd37f7224648efad651a6f83a1f06d78/mxHelper/model/model.py#L98,
 it directly uses batch.label without considering batch.pad.
   So if "data size % batch size == 0" is not met, the error occurs when 
accessing empty labels.
   
   And the data set I gave you to debug has 800 images, which meets 
"800 % 100 == 0" (100 is the batch size). So the problem never occurred on 
your machine.
   
   I found this problem because I happened to change the train set size to 111 
and the test set size to 33, where "111 % 100 != 0" and "33 % 100 != 0".
   And after I added 
https://github.com/jiarenyf/mxWrapper/blob/02fd9b0fcd37f7224648efad651a6f83a1f06d78/mxHelper/mxData.py#L169
 (remove empty) and changed 
https://github.com/jiarenyf/mxWrapper/blob/02fd9b0fcd37f7224648efad651a6f83a1f06d78/mxHelper/mxData.py#L170
 (set "pad" to 0), the problem does not occur.
   
   I wonder whether there is some error or bug in my implementation: 
https://github.com/jiarenyf/mxWrapper/blob/master/mxHelper/model/model.py ?
   
 



[GitHub] ZiyueHuang commented on issue #7925: Resize image to fixed size

2017-09-18 Thread git
ZiyueHuang commented on issue #7925: Resize image to fixed size
URL: 
https://github.com/apache/incubator-mxnet/issues/7925#issuecomment-330240283
 
 
   You can use cv2.resize for convenience. Methods in mx.image use mx.ndarray 
internally for efficiency. You can refer to mx.image.ImageIter in the 
documentation for some examples of Augmenters.
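As a rough sketch of how an iterator consumes an augmenter list (modeled on what `mx.image.ImageIter` does with its `aug_list` argument; not the actual MXNet code): each augmenter is a callable applied to the decoded image in sequence, which is why augmenters take no explicit `src` at construction time.

```python
def apply_aug_list(img, aug_list):
    # The iterator calls each augmenter on the image in order; a resize
    # augmenter like ForceResizeAug would be just one entry in this list.
    for aug in aug_list:
        img = aug(img)
    return img

# Stand-in augmenters to show the chaining:
resize = lambda img: "resized(%s)" % img
flip = lambda img: "flipped(%s)" % img
print(apply_aug_list("img", [resize, flip]))  # resize applied first, then flip
```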
 



[GitHub] Prasad9 commented on issue #7925: Resize image to fixed size

2017-09-18 Thread git
Prasad9 commented on issue #7925: Resize image to fixed size
URL: 
https://github.com/apache/incubator-mxnet/issues/7925#issuecomment-330237849
 
 
   @ZiyueHuang Thanks for the method. That is what I had wanted. But now, I am 
unable to understand how to use this `Augmenter` class, because there is no 
input for the image (src) like methods such as 
[mx.image.random_crop](https://mxnet.incubator.apache.org/api/python/image.html#mxnet.image.random_crop)
 have. 
   Even in the various subclasses of the 
[DataIter](https://mxnet.incubator.apache.org/api/python/io.html#mxnet.io.DataIter)
 class, there seems to be no attribute for assigning an Augmenter. Can you 
please show with one example how to do it? It would be really helpful. If you 
have some documentation with examples, or somewhere this has already been 
demonstrated which I am not able to find, that would also be helpful.
   Thanks again for all the help.
 



[GitHub] ZiyueHuang commented on issue #7925: Resize image to fixed size

2017-09-18 Thread git
ZiyueHuang commented on issue #7925: Resize image to fixed size
URL: 
https://github.com/apache/incubator-mxnet/issues/7925#issuecomment-330216607
 
 
   `mx.image.ForceResizeAug`. It uses an internal method 
`mx.nd._internal._cvimresize`.
 



[GitHub] lijuan123 commented on issue #6474: Fine-tune the mxnet ssd get mismatchfrom.shape() error

2017-09-18 Thread git
lijuan123 commented on issue #6474: Fine-tune the mxnet ssd get  
mismatchfrom.shape() error
URL: 
https://github.com/apache/incubator-mxnet/issues/6474#issuecomment-330204414
 
 
   @liumusicforever thank you!
 



[GitHub] liumusicforever commented on issue #6474: Fine-tune the mxnet ssd get mismatchfrom.shape() error

2017-09-18 Thread git
liumusicforever commented on issue #6474: Fine-tune the mxnet ssd get  
mismatchfrom.shape() error
URL: 
https://github.com/apache/incubator-mxnet/issues/6474#issuecomment-330188579
 
 
   I solved it by making sure the `num_class` used when building the symbol 
(via symbol.py / importlib) is equal to the one in the model params file 
(load_checkpoint).
   
   
 



[GitHub] liumusicforever commented on issue #6474: Fine-tune the mxnet ssd get mismatchfrom.shape() error

2017-09-18 Thread git
liumusicforever commented on issue #6474: Fine-tune the mxnet ssd get  
mismatchfrom.shape() error
URL: 
https://github.com/apache/incubator-mxnet/issues/6474#issuecomment-330201273
 
 
   Sorry, I made a mistake: I meant **number of classes**, not **shape of 
data**, above.
 



[GitHub] liumusicforever commented on issue #6474: Fine-tune the mxnet ssd get mismatchfrom.shape() error

2017-09-18 Thread git
liumusicforever commented on issue #6474: Fine-tune the mxnet ssd get  
mismatchfrom.shape() error
URL: 
https://github.com/apache/incubator-mxnet/issues/6474#issuecomment-330200756
 
 
   Is the number of classes in your pretrained model the same as the number of 
classes in the symbol?
 



[GitHub] lijuan123 commented on issue #6474: Fine-tune the mxnet ssd get mismatchfrom.shape() error

2017-09-18 Thread git
lijuan123 commented on issue #6474: Fine-tune the mxnet ssd get  
mismatchfrom.shape() error
URL: 
https://github.com/apache/incubator-mxnet/issues/6474#issuecomment-330199777
 
 
   @liumusicforever Oh sorry, can you tell me more about it? I don't quite 
understand what you mean. The symbol used is resnet50, and I load the 
pretrained model from epoch 0. Thank you again.
 



[GitHub] SumNeuron commented on issue #7900: Request: finish python gpu enabled guide for install

2017-09-18 Thread git
SumNeuron commented on issue #7900: Request: finish python gpu enabled guide 
for install
URL: 
https://github.com/apache/incubator-mxnet/issues/7900#issuecomment-330191884
 
 
   Anyway, if it helps you debug what is going on, here is my setup:
   
   I am using:
   - MacBook Pro 13" mid 2014 model (i.e. Thunderbolt 2)
   - macOS Sierra v10.12.6
   - Thunderbolt 2 to Thunderbolt 3 adaptor
   - Akitio Node eGPU container
   - NVIDIA GeForce GTX TITAN X
   
   And if you are concerned that maybe this setup is somehow interfering with 
the GPU, I can assure you it is not. I can use GPU-accelerated code in other 
languages :)
   
   I truly think it is the reversion to a previous version of the command line 
tools to appease nvcc.
   
   Looking at ArrayFire, they are all equally confused:
   https://github.com/arrayfire/arrayfire/issues/1384
   
   :)
 



[GitHub] formath opened a new pull request #7932: executor not expose its inner variables to symbol

2017-09-18 Thread git
formath opened a new pull request #7932: executor not expose its inner 
variables to symbol
URL: https://github.com/apache/incubator-mxnet/pull/7932
 
 
   
 



[GitHub] adrianloy commented on issue #6474: Fine-tune the mxnet ssd get mismatchfrom.shape() error

2017-09-18 Thread git
adrianloy commented on issue #6474: Fine-tune the mxnet ssd get  
mismatchfrom.shape() error
URL: 
https://github.com/apache/incubator-mxnet/issues/6474#issuecomment-330187205
 
 
   When I was working with this project, I also had to adjust it in demo.py and 
when preparing the dataset. But I do not know if that is still the case; they 
changed some stuff in the last months and I am not up to date anymore. 
 



[GitHub] SumNeuron commented on issue #7900: Request: finish python gpu enabled guide for install

2017-09-18 Thread git
SumNeuron commented on issue #7900: Request: finish python gpu enabled guide 
for install
URL: 
https://github.com/apache/incubator-mxnet/issues/7900#issuecomment-330187016
 
 
   because when trying something else
   
   ```
   x = nd.ones(shape=(3,3))
   x_gpu = x.copyto(gpu(0))
   print(x_gpu)
   ```
   
   I get 
   
   ```
   ---------------------------------------------------------------------------
   MXNetError                                Traceback (most recent call last)
   <ipython-input> in <module>()
         1 x = nd.ones(shape=(3,3))
   ----> 2 x_gpu = x.copyto(gpu(0))
         3 print(x_gpu)
   
   /usr/local/lib/python3.6/site-packages/mxnet/ndarray.py in copyto(self, 
other)
   990 elif isinstance(other, Context):
   991 hret = NDArray(_new_alloc_handle(self.shape, other, 
True, self.dtype))
   --> 992 return _internal._copyto(self, out=hret)
   993 else:
   994 raise TypeError('copyto does not support type ' + 
str(type(other)))
   
   /usr/local/lib/python3.6/site-packages/mxnet/ndarray.py in _copyto(src, out, 
name, **kwargs)
   
   /usr/local/lib/python3.6/site-packages/mxnet/_ctypes/ndarray.py in 
_imperative_invoke(handle, ndargs, keys, vals, out)
87 ctypes.c_int(len(keys)),
88 c_array(ctypes.c_char_p, [c_str(key) for key in keys]),
   ---> 89 c_array(ctypes.c_char_p, [c_str(str(val)) for val in vals])))
90 
91 if original_output is not None:
   
   /usr/local/lib/python3.6/site-packages/mxnet/base.py in check_call(ret)
   127 """
   128 if ret != 0:
   --> 129 raise MXNetError(py_str(_LIB.MXGetLastError()))
   130 
   131 if sys.version_info[0] < 3:
   
   MXNetError: [12:57:26] src/ndarray/ndarray.cc:402: GPU is not enabled
   
   Stack trace returned 7 entries:
   [bt] (0) 0   libmxnet.so 0x0001054c6ad8 
_ZN4dmlc15LogMessageFatalD2Ev + 40
   [bt] (1) 1   libmxnet.so 0x000105cae933 
_ZN5mxnet10CopyFromToERKNS_7NDArrayEPS0_i + 1587
   [bt] (2) 2   libmxnet.so 0x000105d15513 
_ZNSt3__110__function6__funcIZN5mxnet2op20RegisterLegacyNDFuncEvE3$_4NS_9allocatorIS4_EEFvRKN4nnvm9NodeAttrsERKNS_6vectorINS2_7NDArrayENS5_ISC_PSE_EEclESA_SG_OSH_
 + 1619
   [bt] (3) 3   libmxnet.so 0x000105bc5611 
_Z20ImperativeInvokeImplRKN5mxnet7ContextERKN4nnvm9NodeAttrsEPNSt3__16vectorINS_7NDArrayENS7_9allocatorIS9_SD_
 + 625
   [bt] (4) 4   libmxnet.so 0x000105bc67f1 
MXImperativeInvoke + 433
   [bt] (5) 5   _ctypes.cpython-36m-darwin.so   0x0001040b442f 
ffi_call_unix64 + 79
   [bt] (6) 6   ??? 0x7fff5d7393e0 0x0 + 
140734761243616
   ```
   
   Which is more consistent with GPU not being enabled, although I am pretty 
sure that it is...
 



[GitHub] SumNeuron commented on issue #7900: Request: finish python gpu enabled guide for install

2017-09-18 Thread git
SumNeuron commented on issue #7900: Request: finish python gpu enabled guide 
for install
URL: 
https://github.com/apache/incubator-mxnet/issues/7900#issuecomment-330186609
 
 
   But the main issue is that it seems that your GPU command examples might not 
be correct, given that the errors are raised by the operation nd.ones with
   
   `Operator _ones is not implemented for GPU.`
 



[GitHub] SumNeuron commented on issue #7900: Request: finish python gpu enabled guide for install

2017-09-18 Thread git
SumNeuron commented on issue #7900: Request: finish python gpu enabled guide 
for install
URL: 
https://github.com/apache/incubator-mxnet/issues/7900#issuecomment-330186373
 
 
   also, I want to reiterate that I would appreciate it if you could implement 
a gpu connection test
   
   either of the following would be fine by me:
   
   A specific error for not being able to reach the GPU
   ```
   gpu_device = mxnet.gpu()
   try:
       nd.ones(shape=(3,3), ctx=gpu_device)
   except MXNetGPUError as err:
       print("Cannot connect to GPU: {err}".format(err=err))
   ```
   
   or return -1
   
   ```
   gpu_device = mxnet.gpu()
   
   if not gpu_device:
       print("GPU is not available")
   ```
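In the meantime, the check can be approximated with a try/except probe; the pattern below is generic (the only assumption is that a failed device op raises an exception, as `nd.ones(..., ctx=mx.gpu())` does with `MXNetError`):

```python
def device_available(probe):
    # Run a tiny op on the device; treat any raised error as "unavailable".
    # In MXNet, probe would be: lambda: mx.nd.ones((1,), ctx=mx.gpu()).asnumpy()
    try:
        probe()
        return True
    except Exception:
        return False

def failing_probe():
    raise RuntimeError("GPU is not enabled")

print(device_available(lambda: [1, 2, 3]))  # True
print(device_available(failing_probe))      # False
```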
 



[GitHub] formath closed pull request #7930: executor not expose its inner variables to symbol

2017-09-18 Thread git
formath closed pull request #7930: executor not expose its inner variables to 
symbol
URL: https://github.com/apache/incubator-mxnet/pull/7930
 
 
   
 



[GitHub] SumNeuron commented on issue #7917: SequenceLast of LSTM throws error

2017-09-18 Thread git
SumNeuron commented on issue #7917: SequenceLast of LSTM throws error
URL: 
https://github.com/apache/incubator-mxnet/issues/7917#issuecomment-330184409
 
 
   Perhaps you are seeing something that I am not.
   This shows how to define blocks utilizing Gluon, which is the newer version 
of symbol.
   It doesn't explain how to grab the last time step of an LSTM.
   Nor does it provide an example of merging sym and gluon.
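For reference, what `SequenceLast` computes can be sketched in NumPy (a sketch of the semantics only, assuming the usual time-major (T, N, C) layout; not MXNet code):

```python
import numpy as np

def sequence_last(outputs, seq_len):
    # outputs: (T, N, C) time-major LSTM outputs; seq_len: (N,) valid
    # lengths per batch element. Picks each sequence's output at its
    # last valid time step.
    T, N, C = outputs.shape
    return outputs[seq_len - 1, np.arange(N), :]

out = np.arange(4, dtype=np.float64).reshape(2, 2, 1)  # T=2, N=2, C=1
last = sequence_last(out, np.array([1, 2]))  # rows taken at t=0 and t=1
```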
 



[GitHub] lijuan123 commented on issue #6474: Fine-tune the mxnet ssd get mismatchfrom.shape() error

2017-09-18 Thread git
lijuan123 commented on issue #6474: Fine-tune the mxnet ssd get  
mismatchfrom.shape() error
URL: 
https://github.com/apache/incubator-mxnet/issues/6474#issuecomment-330156954
 
 
   @adrianloy Hi, I have set the num_class and class_names in train.py, but I 
still get the error. So I want to know if there are other places where I need 
to change the settings. Thank you very much!
   
 



[GitHub] ykim362 opened a new pull request #7931: MKL-DNN integration: request for reviews

2017-09-18 Thread git
ykim362 opened a new pull request #7931: MKL-DNN integration: request for 
reviews
URL: https://github.com/apache/incubator-mxnet/pull/7931
 
 
   **This PR is a beta version for the code reviews. There are several known 
issues which are being debugged.**
   
   # MKL-DNN
   A new open-source deep learning library providing DNN kernels optimized for 
Intel Architecture (IA).
   https://github.com/01org/mkl-dnn
   
   # Advantages
   ## More functionalities
   New functionality will mainly be added to MKL-DNN rather than to the MKLML 
library.
   Below are two examples.
   1. Fused RNN cell (To be added)
   2. int8 inference (To be added)
   
   ## Performance optimization
   As of Sep. 18, 2017, on a Skylake 20-core machine (6148); all units are 
img/sec:
   * AlexNet inference (BS 256): 1474 (MKLML) --> 1568 (MKL-DNN)
   * Inception-BN inference (BS 32): 454 (MKLML) --> 483 (MKL-DNN)
   * ResNet-50 inference (BS 32): 99 (MKLML) --> 116 (MKL-DNN)
   
   # Known issues
   - Convergence (resnet training)
 



[GitHub] formath opened a new pull request #7930: executor not expose its inner variables to symbol

2017-09-18 Thread git
formath opened a new pull request #7930: executor not expose its inner 
variables to symbol
URL: https://github.com/apache/incubator-mxnet/pull/7930
 
 
   
 



[GitHub] cjolivier01 commented on issue #7926: Increase the tolerance

2017-09-18 Thread git
cjolivier01 commented on issue #7926: Increase the tolerance
URL: https://github.com/apache/incubator-mxnet/pull/7926#issuecomment-330143129
 
 
   Was it failing before last week? Because if it wasn't, it's not the fault
   of the test. Not passing within 0.01 error is a problem.
   
   On Sun, Sep 17, 2017 at 10:30 PM Gautam Kumar wrote:
   
   > @cjolivier01 I have seen this failing
   > since last week >75%, we are not running CPP test :(
 



[GitHub] feiyuvl commented on issue #7871: Why library size so large

2017-09-18 Thread git
feiyuvl commented on issue #7871: Why library size so large
URL: 
https://github.com/apache/incubator-mxnet/issues/7871#issuecomment-330135336
 
 
   No, I just feel it is too large. A single obj file generated by some op is 
about 10M, which is unreasonable. Maybe the use of templates in mshadow causes 
this.
   
![default](https://user-images.githubusercontent.com/13388702/30530651-1dbacbba-9c7b-11e7-8256-77098390eaa9.JPG)
   
   
 





[GitHub] feiyuvl commented on issue #7871: Why library size so large

2017-09-18 Thread git
feiyuvl commented on issue #7871: Why library size so large
URL: 
https://github.com/apache/incubator-mxnet/issues/7871#issuecomment-330135336
 
 
   No, I just feel it is too large. A single obj file generated by some op is 
about 10M, which is unreasonable. Maybe the use of templates in mshadow causes 
this.
 
