[GitHub] szha commented on a change in pull request #7352: add Sequential compatibility to rnn layers

2017-08-05 Thread git
szha commented on a change in pull request #7352: add Sequential compatibility 
to rnn layers
URL: https://github.com/apache/incubator-mxnet/pull/7352#discussion_r131535173
 
 

 ##
 File path: python/mxnet/gluon/rnn/rnn_layer.py
 ##
 @@ -151,10 +151,52 @@ def begin_state(self, batch_size=0, func=ndarray.zeros, **kwargs):
 states.append(func(name='%sh0_%d'%(self.prefix, i), **info))
 return states
 
-def forward(self, inputs, states):
+def forward(self, inputs, states=None):
+"""Performs the RNN transformation for all time steps.
 
 Review comment:
   ok
 



[GitHub] piiswrong commented on a change in pull request #7352: add Sequential compatibility to rnn layers

2017-08-05 Thread git
piiswrong commented on a change in pull request #7352: add Sequential 
compatibility to rnn layers
URL: https://github.com/apache/incubator-mxnet/pull/7352#discussion_r131535019
 
 

 ##
 File path: python/mxnet/gluon/rnn/rnn_layer.py
 ##
 @@ -165,8 +207,10 @@ def forward(self, inputs, states):
 self.i2h_weight[i].shape = (self._gates*self._hidden_size, inputs.shape[2])
 self.i2h_weight[i]._finish_deferred_init()
 if inputs.context.device_type == 'gpu':
-return self._forward_gpu(inputs, states)
-return self._forward_cpu(inputs, states)
+out = self._forward_gpu(inputs, states) # output, state
+out = self._forward_cpu(inputs, states)
 
 Review comment:
   ???
 



[GitHub] piiswrong commented on a change in pull request #7352: add Sequential compatibility to rnn layers

2017-08-05 Thread git
piiswrong commented on a change in pull request #7352: add Sequential 
compatibility to rnn layers
URL: https://github.com/apache/incubator-mxnet/pull/7352#discussion_r131535006
 
 

 ##
 File path: python/mxnet/gluon/rnn/rnn_layer.py
 ##
 @@ -151,10 +151,52 @@ def begin_state(self, batch_size=0, func=ndarray.zeros, **kwargs):
 states.append(func(name='%sh0_%d'%(self.prefix, i), **info))
 return states
 
-def forward(self, inputs, states):
+def forward(self, inputs, states=None):
+"""Performs the RNN transformation for all time steps.
 
 Review comment:
   This is not visible. Put it in the class doc string of each layer
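   For illustration only (not part of the review), a minimal sketch of the suggestion: document the call behavior in the class docstring, which the rendered API docs surface, rather than only in forward(). The class name and wording here are hypothetical.
   ```python
   class MyRNNLayer(object):
       """Toy recurrent layer (hypothetical sketch).

       Calling the layer performs the transformation for all time steps:

           out = layer(inputs)                  # initial states created internally
           out, states = layer(inputs, states)

       A class-level docstring like this one shows up in the generated docs,
       whereas a docstring only on forward() is much less visible.
       """

       def forward(self, inputs, states=None):
           # The behavior is documented above, in the class docstring.
           return inputs, states
   ```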
 



[GitHub] szha opened a new pull request #7352: add Sequential compatibility to rnn layers

2017-08-05 Thread git
szha opened a new pull request #7352: add Sequential compatibility to rnn layers
URL: https://github.com/apache/incubator-mxnet/pull/7352
 
 
   
 



[incubator-mxnet] branch master updated: refactor gluon trainer (#7338)

2017-08-05 Thread jxie
This is an automated email from the ASF dual-hosted git repository.

jxie pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new be0579c  refactor gluon trainer (#7338)
be0579c is described below

commit be0579ce7519cd910dab8c0261df7212d155d0b1
Author: Eric Junyuan Xie 
AuthorDate: Sat Aug 5 21:28:24 2017 -0700

refactor gluon trainer (#7338)

* fix optimizer

* Update trainer.py
---
 python/mxnet/gluon/parameter.py|  2 +-
 python/mxnet/gluon/trainer.py  | 59 +-
 python/mxnet/optimizer.py  | 12 +++--
 .../python/unittest/{test_nn.py => test_gluon.py}  | 23 +
 4 files changed, 69 insertions(+), 27 deletions(-)

diff --git a/python/mxnet/gluon/parameter.py b/python/mxnet/gluon/parameter.py
index 0ae829a..bdc9674 100644
--- a/python/mxnet/gluon/parameter.py
+++ b/python/mxnet/gluon/parameter.py
@@ -361,7 +361,7 @@ class ParameterDict(object):
 """
 def __init__(self, prefix='', shared=None):
 self._prefix = prefix
-self._params = {}
+self._params = OrderedDict()
 self._shared = shared
 
 def __getitem__(self, key):
diff --git a/python/mxnet/gluon/trainer.py b/python/mxnet/gluon/trainer.py
index 5483f6b..e8aae71 100644
--- a/python/mxnet/gluon/trainer.py
+++ b/python/mxnet/gluon/trainer.py
@@ -15,14 +15,19 @@ class Trainer(object):
 params : ParameterDict
 The set of parameters to optimize.
 optimizer : str or Optimizer
-The optimizer to use.
+The optimizer to use. See
+`help 
`_
+on Optimizer for a list of available optimizers.
 optimizer_params : dict
 Key-word arguments to be passed to optimizer constructor. For example,
-`{'learning_rate': 0.1}`
+`{'learning_rate': 0.1}`. All optimizers accept learning_rate, wd (weight decay),
+clip_gradient, and lr_scheduler. See each optimizer's
+constructor for a list of additional supported arguments.
 kvstore : str or KVStore
-kvstore type for multi-gpu and distributed training.
+kvstore type for multi-gpu and distributed training. See help on
+:any:`mxnet.kvstore.create` for more information.
 """
-def __init__(self, params, optimizer, optimizer_params, kvstore='device'):
+def __init__(self, params, optimizer, optimizer_params=None, kvstore='device'):
 if isinstance(params, (dict, ParameterDict)):
 params = list(params.values())
 if not isinstance(params, (list, tuple)):
@@ -35,9 +40,9 @@ class Trainer(object):
 raise ValueError(
 "First argument must be a list or dict of Parameters, " \
 "got list of %s."%(type(param)))
-if param.grad_req != 'null':
-self._params.append(param)
+self._params.append(param)
 
+optimizer_params = optimizer_params if optimizer_params else {}
 self._scale = optimizer_params.get('rescale_grad', 1.0)
 self._contexts = self._check_contexts()
 self._init_optimizer(optimizer, optimizer_params)
@@ -56,32 +61,39 @@ class Trainer(object):
 return contexts
 
 def _init_optimizer(self, optimizer, optimizer_params):
-self._optimizer = opt.create(optimizer, **optimizer_params)
-
-lr_mult = {}
-wd_mult = {}
-for i, param in enumerate(self._params):
-lr_mult[i] = param.lr_mult
-wd_mult[i] = param.wd_mult
-self._optimizer.set_lr_mult(lr_mult)
-self._optimizer.set_wd_mult(wd_mult)
+param_dict = {i: param for i, param in enumerate(self._params)}
+if isinstance(optimizer, opt.Optimizer):
+assert not optimizer_params, \
+"optimizer_params must be None if optimizer is an instance of 
" \
+"Optimizer instead of str"
+self._optimizer = optimizer
+self._optimizer.param_dict = param_dict
+else:
+self._optimizer = opt.create(optimizer, param_dict=param_dict,
+ **optimizer_params)
 
 self._updaters = [opt.get_updater(self._optimizer) \
 for _ in self._contexts]
 
 def _init_kvstore(self):
 arg_arrays = {param.name: param.data(self._contexts[0]) for param in self._params}
-kvstore, update_on_kvstore = _create_kvstore(self._kvstore, len(self._contexts), arg_arrays)
-self._kvstore = kvstore
-self._update_on_kvstore = update_on_kvstore
+kvstore, update_on_kvstore = _create_kvstore(self._kvstore, len(self._contexts),
+ arg_arrays)
 if kvstore:
-assert 'dist
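
For context (editorial addition; the commit message above is truncated), a minimal Gluon usage sketch that reflects the refactored Trainer, where optimizer_params is now optional and hyperparameters are passed as a plain dict. The tiny network and the hyperparameter values are illustrative only.

```python
from mxnet import autograd, gluon, nd

net = gluon.nn.Dense(1)
net.initialize()

# After the refactor, optimizer_params defaults to None; hyperparameters such as
# learning_rate, wd, and clip_gradient go into the dict below.
trainer = gluon.Trainer(net.collect_params(), 'sgd', {'learning_rate': 0.1})

data = nd.random_normal(shape=(4, 2))
label = nd.random_normal(shape=(4, 1))
loss_fn = gluon.loss.L2Loss()

with autograd.record():
    loss = loss_fn(net(data), label)
loss.backward()
trainer.step(batch_size=4)
```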

[GitHub] piiswrong closed pull request #7338: refactor gluon trainer

2017-08-05 Thread git
piiswrong closed pull request #7338: refactor gluon trainer
URL: https://github.com/apache/incubator-mxnet/pull/7338
 
 
   
 



[GitHub] dbsxdbsx opened a new issue #7351: The intellisence support for mxnet with Python

2017-08-05 Thread git
dbsxdbsx opened a new issue #7351: The intellisence support for mxnet with 
Python
URL: https://github.com/apache/incubator-mxnet/issues/7351
 
 
   Operating System: Linux,
   Compiler:  Vscode,Pycharm
   Package used (Python/R/Scala/Julia): Python 2.7.13
   MXNet version:  0.10.1
   -
   After installing mxnet, I found I could get IntelliSense when typing "mx.io.", but no IntelliSense when typing "mx.sym.", say when looking for "flatten".
   
   I wonder whether there is some setting I missed?
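   As editorial context (not part of the issue): most mx.sym operators are generated dynamically from the C++ backend when the package is imported, so static completion engines often cannot index them even though they exist at runtime, e.g.:
   ```python
   import mxnet as mx

   # The symbol operators exist at runtime even if the editor cannot see them
   # statically; list the ones whose names contain "flatten".
   print([name for name in dir(mx.sym) if 'flatten' in name.lower()])
   ```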
 



[GitHub] sergeykolychev commented on issue #7334: cannot use ImageIter in Perl package

2017-08-05 Thread git
sergeykolychev commented on issue #7334: cannot use ImageIter in Perl package
URL: 
https://github.com/apache/incubator-mxnet/issues/7334#issuecomment-320481894
 
 
   @dokechin 
   Thank you. Let's move to the next level of our debugging. I've fixed your immediate problem and the data is now being loaded. Try to plug this iter into mxnet and train something; we'll see if you hit any snags.
   Here's the patch to Image.pm that fixes the bugs:
   ```
   developer@devbox:~/mxnet/perl-package/AI-MXNet$ git diff
   diff --git a/perl-package/AI-MXNet/lib/AI/MXNet/Image.pm b/perl-package/AI-MXNet/lib/AI/MXNet/Image.pm
   index 50e4a41..e61d1a4 100644
   --- a/perl-package/AI-MXNet/lib/AI/MXNet/Image.pm
   +++ b/perl-package/AI-MXNet/lib/AI/MXNet/Image.pm
   @@ -747,7 +747,7 @@ sub BUILD
{
chomp($line);
my @line = split(/\t/, $line);
   -my $label = AI::MXNet::NDArray->array([@line[1..@line-1]]);
   +my $label = AI::MXNet::NDArray->array([@line[1..@line-2]]);
my $key   = $line[0];
$imglist{$key} = [$label, $line[-1]];
push @imgkeys, $key;
   @@ -821,6 +821,10 @@ sub BUILD
{
 $self->aug_list(AI::MXNet::Image->CreateAugmenter(data_shape => $self->data_shape, %{ $self->kwargs//{} }));
}
   +else
   +{
   +$self->aug_list([]);
   +}
$self->cur(0);
$self->reset();
}
   @@ -860,7 +864,7 @@ method next_sample()
}
else
{
   -my ($label, $fname) = $self->imglist->{$idx};
   +my ($label, $fname) = @{ $self->imglist->{$idx} };
if(not defined $self->imgrec)
{
 open(F, $self->path_root . "/$fname") or confess("can't open $fname $!");
   
   ```
   And here is patched image.pl script with correct usage:
   ```
   #!/usr/bin/perl
   
   ## a bit modified
   ## added path_root
   ## $data is not an array but an object of class AI::MXNet::DataBatch
   use AI::MXNet qw('mx');
   use Data::Dumper;
   my $ite = mx->img()->ImageIter(
    {  batch_size => 1, data_shape => [3,224,224], label_width => 1, path_imglist => "custom.lst", path_root => '.' });
   for $data (@{$ite}){
 print Dumper($data);
 print $data->data->[0]->aspdl;
 print $data->label->[0]->aspdl;
   }
   ```
   The code assumes that you have image.pl and custom.lst in the top dir and 
the test.jpg file in the subdir 'data'.
 




[GitHub] idealboy opened a new issue #7350: Multi-Training-Task on the same GPU card

2017-08-05 Thread git
idealboy opened a new issue #7350: Multi-Training-Task on the same GPU card
URL: https://github.com/apache/incubator-mxnet/issues/7350
 
 
   For bugs or installation issues, please provide the following information.
   The more information you provide, the more likely people will be able to 
help you.
   
   ## Environment info
   Operating System:
   centos-7.2
   Compiler:
   gcc-4.8.5
   Package used (Python/R/Scala/Julia):
   Python
   MXNet version:
   0.9.3
   Or if installed from source:
   installed from source
   MXNet commit hash (`git rev-parse HEAD`):
   
   If you are using python package, please provide
   python-2.7
   Python version and distribution:
   python-2.7
   If you are using R package, please provide
   
   R `sessionInfo()`:
   
   ## Error Message:
   Please paste the full error message, including stack trace.
   When I run a single process on a single GPU card, I get about 56 samples/sec;
   when I run two processes on the same GPU card, I get about 27 samples/sec each.
   I think it is mainly the GPU's I/O bandwidth that causes the low training speed.
   How can I run multiple training tasks on the same single GPU card with high training performance?
   And my GPU is a K40 (12 GB).
   
   Thank you very much!
   
   ## Minimum reproducible example
   if you are using your own code, please provide a short script that 
reproduces the error.
   
   ## Steps to reproduce
   or if you are running standard examples, please provide the commands you 
have run that lead to the error.
   
   1.
   2.
   3.
   
   ## What have you tried to solve it?
   
   1.
   2.
   3.
   
 



[GitHub] dokechin commented on issue #7334: cannot use ImageIter in Perl package

2017-08-05 Thread git
dokechin commented on issue #7334: cannot use ImageIter in Perl package
URL: 
https://github.com/apache/incubator-mxnet/issues/7334#issuecomment-320478262
 
 
   @sergeykolychev Thank you very much. I upload files.
   
[image.pl.zip](https://github.com/apache/incubator-mxnet/files/1202781/image.pl.zip)
   
[custom.lst.zip](https://github.com/apache/incubator-mxnet/files/1202782/custom.lst.zip)
   
 



[GitHub] thirdwing commented on issue #7336: io.cc:54: Data and label shape in-consistent

2017-08-05 Thread git
thirdwing commented on issue #7336: io.cc:54: Data and label shape in-consistent
URL: 
https://github.com/apache/incubator-mxnet/issues/7336#issuecomment-320472863
 
 
   Are you using the prebuilt pkg?
   
   On my Linux machine with the latest code, it works well
   
   ```r
   library(readr)
   DataNeurona <- read_csv("MyData2.csv")
   DataNeurona <- as.matrix(DataNeurona)
   
   train.ind <- c(1:100)
   train.x <- DataNeurona[train.ind, -length(DataNeurona[1, ])]
   train.y <- DataNeurona[train.ind, length(DataNeurona[1, ])]
   test.x <- DataNeurona[-train.ind, length(DataNeurona[1, ])]
   test.y <- DataNeurona[-train.ind, length(DataNeurona[1, ])]
   
   library(mxnet)
   
   data <- mx.symbol.Variable("data")
   fc1 <-
 mx.symbol.FullyConnected(data, name = "fc1", num_hidden = 40) #weight=w,
   act1 <- mx.symbol.Activation(fc1, name = "sigmoid1", act_type = "sigmoid")
   fc2 <- mx.symbol.FullyConnected(act1, name = "fc2", num_hidden = 50)
   act2 <- mx.symbol.Activation(fc2, name = "tanh1", act_type = "sigmoid")
   fc4 <- mx.symbol.FullyConnected(act2, name = "fc4", num_hidden = 1)
   lro <- mx.symbol.LinearRegressionOutput(data = fc4, grad.scale = 1)
   mx.set.seed(0)
   
   model <- mx.model.FeedForward.create(
 symbol = lro,
 X = train.x,
 y = train.y,
 ctx = mx.cpu(),
 num.round = 25,
 array.batch.size = 20,
 learning.rate = 0.05,
 momentum = 0.9,
 eval.metric = mx.metric.mse
   )
   ```
   
   ```
   Start training with 1 devices
   [1] Train-mse=0.189338970604364
   [2] Train-mse=0.182918034135271
   [3] Train-mse=0.139369296805449
   [4] Train-mse=0.0981672365474814
   [5] Train-mse=0.090259424948582
   [6] Train-mse=0.091202815317992
   [7] Train-mse=0.0735728471551251
   [8] Train-mse=0.0755325079735637
   [9] Train-mse=0.0776457090260113
   [10] Train-mse=0.0703273524582357
   [11] Train-mse=0.0727759200215761
   [12] Train-mse=0.073669241397235
   [13] Train-mse=0.0705466593146721
   [14] Train-mse=0.0721189324018792
   [15] Train-mse=0.0723587683907677
   [16] Train-mse=0.0710235608978592
   [17] Train-mse=0.0718809927804486
   [18] Train-mse=0.0718809992951069
   [19] Train-mse=0.0713206787760761
   [20] Train-mse=0.0717545830533599
   [21] Train-mse=0.071694452664629
   [22] Train-mse=0.0714664637380885
   [23] Train-mse=0.0716749304881256
   [24] Train-mse=0.0716162021040038
   [25] Train-mse=0.0715267213384503
   Warning message:
   In mx.model.select.layout.train(X, y) :
 Auto detect layout of input matrix, use rowmajor..
   ```
 



[GitHub] mwbyeon commented on a change in pull request #7348: fix print format in im2rec.py

2017-08-05 Thread git
mwbyeon commented on a change in pull request #7348: fix print format in 
im2rec.py
URL: https://github.com/apache/incubator-mxnet/pull/7348#discussion_r131528453
 
 

 ##
 File path: tools/im2rec.py
 ##
 @@ -11,7 +11,6 @@
 import cv2
 import time
 import traceback
-from builtins import range
 
 Review comment:
   In Python 2.x, you need to install the "future" package to use the "builtins" module, so this line was removed.
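   As a hedged aside (not part of the review), one common way to keep a lazy range on Python 2 without the "future" dependency is a guarded alias:
   ```python
   # Fall back to xrange on Python 2; on Python 3 the built-in range is already lazy.
   try:
       range = xrange  # noqa: F821 -- only defined on Python 2
   except NameError:
       pass

   for i in range(3):
       print(i)
   ```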
 



[GitHub] szha commented on issue #7319: [RoadMap] Legacy issue resolution before 1.0 release

2017-08-05 Thread git
szha commented on issue #7319: [RoadMap] Legacy issue resolution before 1.0 
release
URL: 
https://github.com/apache/incubator-mxnet/issues/7319#issuecomment-320466333
 
 
   The default epsilon in Symbol/NDArray batch norm is too large (1e-3). Gluon now uses 1e-5, which is more commonly used.
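   For concreteness (editorial sketch, not part of the comment), the epsilon can be set explicitly in both APIs:
   ```python
   import mxnet as mx

   data = mx.sym.Variable('data')
   # Symbol API: eps defaulted to 1e-3 at the time of this thread; override it.
   bn_sym = mx.sym.BatchNorm(data=data, eps=1e-5, name='bn')

   # Gluon API: epsilon defaults to 1e-5.
   bn_blk = mx.gluon.nn.BatchNorm(epsilon=1e-5)
   ```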
 



[GitHub] kurt-o-sys opened a new issue #7349: time series delay output mxnet

2017-08-05 Thread git
kurt-o-sys opened a new issue #7349: time series delay output mxnet
URL: https://github.com/apache/incubator-mxnet/issues/7349
 
 
   For bugs or installation issues, please provide the following information.
   The more information you provide, the more likely people will be able to 
help you.
   
   ## Environment info
   Operating System:
   ```
   $ uname -ar
   Linux flipflap 4.4.0-57-generic #78-Ubuntu SMP Fri Dec 9 23:50:32 UTC 2016 
x86_64 x86_64 x86_64 GNU/Linux
   ```
   
   Compiler: ?
   
   Package used (Python/R/Scala/Julia): R
   
   MXNet version:
   ```
   > packageVersion("mxnet")
   [1] ‘0.10.1’
   > sessionInfo()
   R version 3.4.1 (2017-06-30)
   Platform: x86_64-pc-linux-gnu (64-bit)
   Running under: Linux Mint 18
   
   Matrix products: default
   BLAS: /usr/lib/openblas-base/libblas.so.3
   LAPACK: /usr/lib/libopenblasp-r0.2.18.so
   
   locale:
[1] LC_CTYPE=en_US.UTF-8   LC_NUMERIC=C   
LC_TIME=en_US.UTF-8LC_COLLATE=en_US.UTF-8 LC_MONETARY=de_BE.UTF-8   
[6] LC_MESSAGES=en_US.UTF-8LC_PAPER=de_BE.UTF-8   LC_NAME=C 
 LC_ADDRESS=C   LC_TELEPHONE=C
   [11] LC_MEASUREMENT=de_BE.UTF-8 LC_IDENTIFICATION=C   
   
   attached base packages:
   [1] stats graphics  grDevices utils datasets  methods   base 
   
   other attached packages:
   [1] mxnet_0.10.1 httr_1.2.1   jsonlite_1.5
   
   loaded via a namespace (and not attached):
[1] Rcpp_0.12.12   compiler_3.4.1 RColorBrewer_1.1-2 
influenceR_0.1.0   plyr_1.8.4 bindr_0.1  viridis_0.4.0 
[8] tools_3.4.1digest_0.6.12  tibble_1.3.3   gtable_0.2.0   
viridisLite_0.2.0  rgexf_0.15.3   pkgconfig_2.0.1   
   [15] rlang_0.1.1igraph_1.1.2   rstudioapi_0.6 curl_2.4   
bindrcpp_0.2   gridExtra_2.2.1stringr_1.2.0 
   [22] DiagrammeR_0.9.0   dplyr_0.7.2htmlwidgets_0.9grid_3.4.1 
glue_1.1.1 R6_2.2.2   Rook_1.1-1
   [29] XML_3.98-1.9   ggplot2_2.2.1  magrittr_1.5   
codetools_0.2-15   scales_0.4.1   htmltools_0.3.6assertthat_0.1
   [36] colorspace_1.3-2   brew_1.0-6 stringi_1.1.5  
visNetwork_2.0.1   lazyeval_0.2.0 munsell_0.4.3
   ```
   
   
   # Question
   
   I have an RNN that predicts a point in time based on indirect inputs. This is not the 'normal' time-series prediction in which the future is predicted from the last values, but from a bunch of other features. When training, the first S (let's say 50) outputs will not be accurate at all. This means that during training I will provide at least 51 inputs (and hence there will be 51 outputs as well, one output per input). Let's call this number T. I'd like to use only the last T-50 outputs for the parameter/weight estimation during training, omitting the first S. How can I achieve that in mxnet?
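   One possible approach, offered as an editorial sketch rather than an answer from the thread (shown with the Python symbol API, but the same idea applies in R): weight the per-step loss with a mask that zeroes the first S steps, so only the last T-S outputs contribute to the gradient. The variable names and shapes below are hypothetical.
   ```python
   import mxnet as mx

   T, S = 60, 50  # total time steps, leading steps to ignore

   outputs = mx.sym.Variable('rnn_outputs')  # shape (batch, T): per-step predictions
   labels = mx.sym.Variable('labels')        # shape (batch, T)
   mask = mx.sym.Variable('mask')            # shape (batch, T): 0 for the first S steps, 1 after

   # Squared error per step, zeroed for the first S steps, averaged over the rest.
   per_step = mx.sym.square(outputs - labels) * mask
   loss = mx.sym.MakeLoss(mx.sym.sum(per_step) / mx.sym.sum(mask))
   ```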
 



[GitHub] piiswrong commented on a change in pull request #7345: add std_rgba to normalize options

2017-08-05 Thread git
piiswrong commented on a change in pull request #7345: add std_rgba to 
normalize options
URL: https://github.com/apache/incubator-mxnet/pull/7345#discussion_r131526668
 
 

 ##
 File path: src/io/iter_normalize.h
 ##
 @@ -137,25 +137,32 @@ class ImageNormalizeIter : public IIterator<DataInst> {
   if (data.shape_[0] == 4) {
 data[3] -= param_.mean_a;
   }
-  if ((param_.rand_mirror && coin_flip(rnd_)) || param_.mirror) {
-outimg_ = mirror(data * contrast + illumination) * param_.scale;
-  } else {
-outimg_ = (data * contrast + illumination) * param_.scale;
-  }
 } else if (!meanfile_ready_ || param_.mean_img.length() == 0) {
   // do not subtract anything
-  if ((param_.rand_mirror && coin_flip(rnd_)) || param_.mirror) {
-outimg_ = mirror(data) * param_.scale;
-  } else {
-outimg_ = F(data) * param_.scale;
-  }
 } else {
   CHECK(meanfile_ready_);
-  if ((param_.rand_mirror && coin_flip(rnd_)) || param_.mirror) {
-outimg_ = mirror((data - meanimg_) * contrast + illumination) * param_.scale;
-  } else {
-outimg_ = ((data - meanimg_) * contrast + illumination) * param_.scale;
-  }
+  data -= meanimg_;
+}
+// apply contrast and illumination jitter
+data = data * contrast + illumination;
+// apply std
+if (param_.std_r > 0.0f) {
+  data[0] /= param_.std_r;
+}
+if (data.shape_[0] >= 3 && param_.std_g > 0.0f) {
 
 Review comment:
   >=2?
 



[GitHub] piiswrong commented on a change in pull request #7345: add std_rgba to normalize options

2017-08-05 Thread git
piiswrong commented on a change in pull request #7345: add std_rgba to 
normalize options
URL: https://github.com/apache/incubator-mxnet/pull/7345#discussion_r131526676
 
 

 ##
 File path: src/io/iter_normalize.h
 ##
 @@ -137,25 +137,32 @@ class ImageNormalizeIter : public IIterator<DataInst> {
   if (data.shape_[0] == 4) {
 data[3] -= param_.mean_a;
   }
-  if ((param_.rand_mirror && coin_flip(rnd_)) || param_.mirror) {
-outimg_ = mirror(data * contrast + illumination) * param_.scale;
-  } else {
-outimg_ = (data * contrast + illumination) * param_.scale;
-  }
 } else if (!meanfile_ready_ || param_.mean_img.length() == 0) {
   // do not subtract anything
-  if ((param_.rand_mirror && coin_flip(rnd_)) || param_.mirror) {
-outimg_ = mirror(data) * param_.scale;
-  } else {
-outimg_ = F(data) * param_.scale;
-  }
 } else {
   CHECK(meanfile_ready_);
-  if ((param_.rand_mirror && coin_flip(rnd_)) || param_.mirror) {
-outimg_ = mirror((data - meanimg_) * contrast + illumination) * param_.scale;
-  } else {
-outimg_ = ((data - meanimg_) * contrast + illumination) * param_.scale;
-  }
+  data -= meanimg_;
+}
+// apply contrast and illumination jitter
+data = data * contrast + illumination;
+// apply std
+if (param_.std_r > 0.0f) {
+  data[0] /= param_.std_r;
+}
+if (data.shape_[0] >= 3 && param_.std_g > 0.0f) {
 
 Review comment:
   a switch without break would look better here
 



[GitHub] zhreshold commented on a change in pull request #7345: add std_rgba to normalize options

2017-08-05 Thread git
zhreshold commented on a change in pull request #7345: add std_rgba to 
normalize options
URL: https://github.com/apache/incubator-mxnet/pull/7345#discussion_r131526564
 
 

 ##
 File path: src/io/iter_normalize.h
 ##
 @@ -137,25 +137,32 @@ class ImageNormalizeIter : public IIterator<DataInst> {
   if (data.shape_[0] == 4) {
 data[3] -= param_.mean_a;
   }
-  if ((param_.rand_mirror && coin_flip(rnd_)) || param_.mirror) {
 
 Review comment:
   Moved to the last step, right before assigning to outimg_.
 



[GitHub] piiswrong commented on a change in pull request #7345: add std_rgba to normalize options

2017-08-05 Thread git
piiswrong commented on a change in pull request #7345: add std_rgba to 
normalize options
URL: https://github.com/apache/incubator-mxnet/pull/7345#discussion_r131526450
 
 

 ##
 File path: src/io/iter_normalize.h
 ##
 @@ -137,25 +137,32 @@ class ImageNormalizeIter : public IIterator<DataInst> {
   if (data.shape_[0] == 4) {
 data[3] -= param_.mean_a;
   }
-  if ((param_.rand_mirror && coin_flip(rnd_)) || param_.mirror) {
 
 Review comment:
   why remove these?
 



[GitHub] piiswrong commented on a change in pull request #7348: fix print format in im2rec.py

2017-08-05 Thread git
piiswrong commented on a change in pull request #7348: fix print format in 
im2rec.py
URL: https://github.com/apache/incubator-mxnet/pull/7348#discussion_r131526429
 
 

 ##
 File path: tools/im2rec.py
 ##
 @@ -11,7 +11,6 @@
 import cv2
 import time
 import traceback
-from builtins import range
 
 Review comment:
   ?
 



[GitHub] sergeykolychev commented on issue #7334: cannot use ImageIter in Perl package

2017-08-05 Thread git
sergeykolychev commented on issue #7334: cannot use ImageIter in Perl package
URL: 
https://github.com/apache/incubator-mxnet/issues/7334#issuecomment-320451914
 
 
   @dokechin Ok, let's do some debugging. Can you please send me your script and custom.lst (no images needed) and I'll try to get to the bottom of this. Image.pm is not battle-tested yet, and hopefully with your help we can get it working.
 



[GitHub] chinakook commented on issue #7228: I think mxnet is lack of a execution order iterator of a symbol

2017-08-05 Thread git
chinakook commented on issue #7228: I think mxnet is lack of a execution order 
iterator of a symbol
URL: 
https://github.com/apache/incubator-mxnet/issues/7228#issuecomment-320448352
 
 
   I have written a graph iterator myself. The way to do it is the same as in the mx.viz.plot_network function. Nevertheless, I would like the official package to provide a handy way.
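   A minimal sketch of the kind of iterator described (editorial example, not the commenter's code): walk the node list in the symbol's JSON, which is the same graph representation mx.viz.plot_network consumes; the nodes are stored in topological order.
   ```python
   import json
   import mxnet as mx

   data = mx.sym.Variable('data')
   net = mx.sym.FullyConnected(data=data, num_hidden=10, name='fc1')
   net = mx.sym.Activation(data=net, act_type='relu', name='relu1')

   # Iterate over the graph nodes in execution (topological) order.
   for node in json.loads(net.tojson())['nodes']:
       print(node['op'], node['name'], [i[0] for i in node['inputs']])
   ```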
 



[GitHub] chinakook commented on issue #7288: How to visualize the mxnet network written in json format??

2017-08-05 Thread git
chinakook commented on issue #7288: How to visualize the mxnet network written 
in json format?? 
URL: 
https://github.com/apache/incubator-mxnet/issues/7288#issuecomment-320447628
 
 
   
   mx.viz.plot_network
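   A short usage sketch as editorial context ('model-symbol.json' is a placeholder path); plot_network returns a graphviz digraph that can be rendered to a file:
   ```python
   import mxnet as mx

   sym = mx.sym.load('model-symbol.json')   # network saved in JSON format
   graph = mx.viz.plot_network(sym)         # returns a graphviz.Digraph
   graph.render('network')                  # writes network.pdf (requires graphviz)
   ```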
 



[GitHub] jcftang commented on issue #6121: mx.sym.WarpCTC cuda memcpy or memset failed issue

2017-08-05 Thread git
jcftang commented on issue #6121: mx.sym.WarpCTC cuda memcpy or memset failed 
issue
URL: 
https://github.com/apache/incubator-mxnet/issues/6121#issuecomment-320445766
 
 
   @sbodenstein I have nvidia 1080Ti's (founder's edition)
 



[GitHub] dokechin commented on issue #7334: cannot use ImageIter in Perl package

2017-08-05 Thread git
dokechin commented on issue #7334: cannot use ImageIter in Perl package
URL: 
https://github.com/apache/incubator-mxnet/issues/7334#issuecomment-320444767
 
 
   I changed the source code and the custom.lst file and tried again, but the same error occurred, at line 866.
 



[GitHub] sbodenstein commented on issue #6121: mx.sym.WarpCTC cuda memcpy or memset failed issue

2017-08-05 Thread git
sbodenstein commented on issue #6121: mx.sym.WarpCTC cuda memcpy or memset 
failed issue
URL: 
https://github.com/apache/incubator-mxnet/issues/6121#issuecomment-320443165
 
 
   @jcftang: this is strange. It's fixed for some (like myself), but not for others.
   
   Could you give info about the GPU you are using?
 



[GitHub] dma100180 commented on issue #7336: io.cc:54: Data and label shape in-consistent

2017-08-05 Thread git
dma100180 commented on issue #7336: io.cc:54: Data and label shape in-consistent
URL: 
https://github.com/apache/incubator-mxnet/issues/7336#issuecomment-320434611
 
 
   Hi, sorry. DataNeurona is a list with market data from the IBEX 35 index (Spain) plus calculated and normalized fields; I attach the file exactly as it is in DataNeurona.
   
   In the next step, I convert the list DataNeurona into a matrix:
   DataNeurona <- as.matrix(DataNeurona)
   
   And then I do the steps from my first message.
   
   Many thanks
   
   
[MyData2.zip](https://github.com/apache/incubator-mxnet/files/1202274/MyData2.zip)
   
   
 



[GitHub] wielandbrendel commented on issue #6766: Fix softmax_cross_entropy list input names

2017-08-05 Thread git
wielandbrendel commented on issue #6766: Fix softmax_cross_entropy list input 
names
URL: https://github.com/apache/incubator-mxnet/pull/6766#issuecomment-320425398
 
 
   I came across this error when compiling the current master, and so I needed 
to manually pull this PR locally to fix this problem.
 
