aGiant commented on issue #17814: mxnet.gluon.data.vision.transforms.Normalize(mean=0.0, std=1.0) tuple issue within hybird_forward()
URL: https://github.com/apache/incubator-mxnet/issues/17814#issuecomment-598071179

> For the first example, I noticed that you are defining both `hybrid_forward` and `forward` at the same time, which is not supposed to work this way.
>
> For the second example, you can use `get_constant` instead of `get`:
>
> ```python
> # self.scales = self.params.get('scales', shape=scales.shape, init=mx.init.Constant(scales.asnumpy()), differentiable=False)
>
> self.scales = self.params.get_constant(value=scales)
> ```

For the first example, overriding `forward()` worked very well for reading the input shape; it is the only way I have found (from other people's code) to get `x.shape` inside the block. After switching to `self.params.get_constant("scales", value=scales)`, I get a new error:

```
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-20-a5e4b7303bc4> in <module>
     19     data = batch.data[0].as_in_context(model_ctx)
     20     with autograd.record():
---> 21         loss, y = net(data)
     22         #loss, y, kl, ll, f, s, t, v = net(data)
     23 

~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in __call__(self, *args)
    691             hook(self, args)
    692 
--> 693         out = self.forward(*args)
    694 
    695         for hook in self._forward_hooks.values():

<ipython-input-17-f9e660a6116f> in forward(self, x)
     35     def forward(self, x):
     36         self.batch_size = x.shape[0]
---> 37         return gluon.HybridBlock.forward(self, x)
     38 
     39     # https://mxnet.apache.org/api/python/docs/tutorials/extend/custom_layer.html

~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in forward(self, x, *args)
   1156                 params = {k: v.data(ctx) for k, v in self._reg_params.items()}
   1157 
-> 1158                 return self.hybrid_forward(ndarray, x, *args, **params)
   1159 
   1160         params = {i: j.var() for i, j in self._reg_params.items()}

<ipython-input-17-f9e660a6116f> in hybrid_forward(self, F, x)
     42         #x_ = x.reshape((x.shape[0], x.shape[1], 1))
     43         #x_normalized = F.broadcast_div(F.broadcast_sub(self.flatten(x), self.min_v), (F.broadcast_sub(self.max_v, self.min_v)))
---> 44         x_normalized = self.normalizer(x)
     45         h = self.encoder(x_normalized)
     46         #print(h.asnumpy()[0])

~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in __call__(self, *args)
    691             hook(self, args)
    692 
--> 693         out = self.forward(*args)
    694 
    695         for hook in self._forward_hooks.values():

~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in forward(self, x, *args)
   1149             with ctx:
   1150                 try:
-> 1151                     params = {k: v.data(ctx) for k, v in self._reg_params.items()}
   1152                 except DeferredInitializationError:
   1153                     self._deferred_infer_shape(x, *args)

~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in <dictcomp>(.0)
   1149             with ctx:
   1150                 try:
-> 1151                     params = {k: v.data(ctx) for k, v in self._reg_params.items()}
   1152                 except DeferredInitializationError:
   1153                     self._deferred_infer_shape(x, *args)

~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/parameter.py in data(self, ctx)
    563                 "because its storage type is %s. Please use row_sparse_data() " \
    564                 "instead." % (self.name, str(ctx), self._stype))
--> 565         return self._check_and_get(self._data, ctx)
    566 
    567     def list_data(self):

~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/parameter.py in _check_and_get(self, arr_list, ctx)
    240             "with Block.collect_params() instead of Block.params " \
    241             "because the later does not include Parameters of " \
--> 242             "nested child Blocks"%(self.name))
    243 
    244     def _get_row_sparse(self, arr_list, ctx, row_id):

RuntimeError: Parameter 'vae5_normalizationhybridlayer0_bias' has not been initialized. Note that you should initialize parameters and create Trainer with Block.collect_params() instead of Block.params because the later does not include Parameters of nested child Blocks
```
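For reference, here is a minimal, self-contained sketch (MXNet 1.x Gluon) of the pattern suggested above. The class names `ScaleLayer` and `Net`, and the shapes used, are made up for illustration and are not taken from the model in this issue. It registers a fixed tensor with `params.get_constant()`, implements only `hybrid_forward()`, and initializes the whole network through `collect_params().initialize()`, which also covers parameters of nested child blocks (the traceback above complains about exactly such an uninitialized nested parameter).

```python
import mxnet as mx
from mxnet import gluon, nd


class ScaleLayer(gluon.HybridBlock):
    """Multiplies its input by a fixed, non-trainable scale vector."""
    def __init__(self, scales, **kwargs):
        super(ScaleLayer, self).__init__(**kwargs)
        with self.name_scope():
            # Registers a constant parameter; it is passed to hybrid_forward
            # as the keyword argument `scales`.
            self.scales = self.params.get_constant('scales', value=scales)

    def hybrid_forward(self, F, x, scales):
        return F.broadcast_mul(x, scales)


class Net(gluon.HybridBlock):
    def __init__(self, scales, **kwargs):
        super(Net, self).__init__(**kwargs)
        with self.name_scope():
            self.normalizer = ScaleLayer(scales)   # nested child block
            self.encoder = gluon.nn.Dense(4)

    def hybrid_forward(self, F, x):
        return self.encoder(self.normalizer(x))


scales = nd.array([[0.5, 2.0, 1.0]])               # shape (1, 3) for broadcasting
net = Net(scales)
# Initialize via collect_params() (or net.initialize()), so that parameters of
# nested child blocks, such as the encoder's weight and bias, are created too.
net.collect_params().initialize(mx.init.Xavier(), ctx=mx.cpu())
out = net(nd.random.uniform(shape=(2, 3)))
print(out.shape)   # (2, 4)
```

If this pattern still raises the "has not been initialized" error in the real model, it may be because initialization happens before the nested normalizer block (and its constant) is created, or because only `Block.params` rather than `Block.collect_params()` was initialized, as the error message itself suggests.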
