threeleafzerg commented on a change in pull request #10696: [MXNET-366]Extend 
MXNet Distributed Training by MPI AllReduce
URL: https://github.com/apache/incubator-mxnet/pull/10696#discussion_r187258294
 
 

 ##########
 File path: python/mxnet/kvstore.py
 ##########
 @@ -296,13 +305,121 @@ def pull(self, key, out=None, priority=0):
         [ 2.  2.  2.]]
         """
         assert(out is not None)
-        ckeys, cvals, use_str_keys = _ctype_key_value(key, out)
-        if use_str_keys:
-            check_call(_LIB.MXKVStorePullEx(
-                self.handle, mx_uint(len(ckeys)), ckeys, cvals, ctypes.c_int(priority)))
+        if self.type != 'dist_sync_mpi':
+            ckeys, cvals, use_str_keys = _ctype_key_value(key, out)
+            if use_str_keys:
+                check_call(_LIB.MXKVStorePullEx(
+                    self.handle, mx_uint(len(ckeys)), ckeys, cvals, ctypes.c_int(priority)))
+            else:
+                check_call(_LIB.MXKVStorePull(
+                    self.handle, mx_uint(len(ckeys)), ckeys, cvals, ctypes.c_int(priority)))
         else:
-            check_call(_LIB.MXKVStorePull(
-                self.handle, mx_uint(len(ckeys)), ckeys, cvals, ctypes.c_int(priority)))
+            raise Exception("This api is not supported for kvstore with type %s. \
+                             Please use pushpull instead."%self.type)
+
+    def pushpull(self, key, ins, outs, priority=0):
 
 Review comment:
  When using the dist_sync_mpi kvstore, its gradient updater policy is the same as the local kvstore: it always uses the updater. You can check the set_optimizer API in kvstore in this patch. I hope I didn't misunderstand your meaning.
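To illustrate the policy described above, here is a simplified, hypothetical simulation (plain NumPy, not the actual MXNet/MPI code): with an allreduce-based kvstore, every rank receives the same summed gradient, then applies the optimizer through its local updater, exactly as a local kvstore would.

```python
import numpy as np

def allreduce_sum(grads_per_rank):
    # Simulate MPI_Allreduce with op=SUM: every rank ends up holding
    # the element-wise sum of all ranks' gradients.
    total = np.sum(grads_per_rank, axis=0)
    return [total.copy() for _ in grads_per_rank]

def sgd_update(weight, grad, lr=0.1):
    # Local updater, same policy as the 'local' kvstore:
    # the optimizer step runs on every rank, not on a server.
    return weight - lr * grad

# Two simulated ranks sharing the same weight but holding different
# local gradients (e.g. from different data shards).
weight = np.array([1.0, 2.0])
local_grads = [np.array([0.5, 0.5]), np.array([1.5, 1.5])]

reduced = allreduce_sum(local_grads)   # identical on every rank
weights = [sgd_update(weight, g) for g in reduced]
# All ranks stay in sync because each applies the same updater
# to the same reduced gradient.
```

The point is that no parameter-server-side updater is involved: the reduced gradient is identical everywhere, so running the updater locally on each rank keeps the replicas consistent.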

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
