QueensGambit commented on issue #15337: Current MXNet-Dev master breaks loading of certain models
URL: https://github.com/apache/incubator-mxnet/issues/15337#issuecomment-513372662
 
 
   @roywei Thank you for the reply.
   Sorry for the inconvenience; there was apparently a `/` missing in the relative path of the config files which I released. I have just updated the .zip release files, and loading should work again for MXNet 1.4.1.
   
   This is how the model is currently loaded:
   
https://github.com/QueensGambit/CrazyAra/blob/master/DeepCrazyhouse/src/domain/agent/neural_net_api.py#L66
   
   ```python
           sym = mx.sym.load(self.symbol_path)
           # https://github.com/apache/incubator-mxnet/issues/6951
           save_dict = mx.nd.load(self.params_path)
           arg_params = {}
           aux_params = {}
           for key, val in save_dict.items():
               param_type, name = key.split(":", 1)
               if param_type == "arg":
                   arg_params[name] = val
                elif param_type == "aux":
                    aux_params[name] = val
           # set the context on CPU, switch to GPU if there is one available
           if ctx == "cpu":
               self.ctx = mx.cpu()
           elif ctx == "gpu":
               self.ctx = mx.gpu()
            else:
                raise Exception("Unavailable ctx mode given %s. You must either select 'cpu' or 'gpu'" % ctx)
            # define batch_size executor objects which are used for inference
            # one executor object is used for the currently requested batch length
            # the requested batch length is variable and at most the given batch_size
           self.executors = []
           for i in range(batch_size):
               executor = sym.simple_bind(
                   ctx=self.ctx,
                   # add a new length for each size starting with 1
                   data=(i + 1, NB_CHANNELS_FULL, BOARD_HEIGHT, BOARD_WIDTH),
                   grad_req="null",
                   force_rebind=True,
               )
               executor.copy_params_from(arg_params, aux_params)
               self.executors.append(executor)
   ```
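
   The `arg:`/`aux:` key-splitting loop above can be sketched in isolation (plain Python, no MXNet required): `mx.nd.load` returns a dict whose keys carry an `arg:` or `aux:` prefix, and the loader sorts the entries into the two parameter dicts that `copy_params_from` expects. The example keys and values below are made-up placeholders, not taken from the actual model:

   ```python
   def split_save_dict(save_dict):
       """Split a {"arg:name"/"aux:name": value} dict into (arg_params, aux_params)."""
       arg_params, aux_params = {}, {}
       for key, val in save_dict.items():
           param_type, name = key.split(":", 1)
           if param_type == "arg":
               arg_params[name] = val
           elif param_type == "aux":
               aux_params[name] = val
       return arg_params, aux_params

   # Hypothetical keys mimicking the layout of a saved .params file:
   save_dict = {
       "arg:conv0_weight": "w0",
       "arg:fc1_bias": "b1",
       "aux:batchnorm0_moving_mean": "m0",
   }
   arg_params, aux_params = split_save_dict(save_dict)
   # arg_params -> {"conv0_weight": "w0", "fc1_bias": "b1"}
   # aux_params -> {"batchnorm0_moving_mean": "m0"}
   ```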

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services