CodingCat opened a new pull request #7571: [scala-package][spark] Resources running PS (role = server) should be explicit to Spark
URL: https://github.com/apache/incubator-mxnet/pull/7571

Another PR to facilitate further work on integrating with Spark.

The current implementation starts PS (role = server), releases the cluster resources as if no one were using them, and then starts PS (role = worker). To make the integration more seamless, we should make the resources used by PS (role = server) explicit to Spark.

Question to @javelinjs @terrytangyuan: why did we choose to start PS in a child process instead of wrapping it in a Spark task?

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
With regards, Apache Git Services
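[Editor's note: the question above contrasts two launch strategies for the parameter server. A minimal Scala sketch of the Spark-task alternative is shown below. `KVStoreServer.init`/`KVStoreServer.start` are MXNet's Scala entry points; the RDD plumbing and the `psEnv` parameter are illustrative assumptions, not the PR's actual implementation.]

```scala
import org.apache.spark.SparkContext
import ml.dmlc.mxnet.KVStoreServer

// Sketch: host each PS (role = server) inside a long-running Spark task,
// so the executor slots it occupies are visible to Spark's scheduler,
// instead of forking a child process outside of Spark's accounting.
def startServersAsTasks(sc: SparkContext,
                        numServer: Int,
                        psEnv: Map[String, String]): Unit = {
  // One partition per server: each task hosts exactly one PS server, and
  // Spark holds those slots for the lifetime of training.
  sc.parallelize(0 until numServer, numServer).foreachPartition { _ =>
    KVStoreServer.init(psEnv) // role = server is passed in via psEnv
    KVStoreServer.start()     // blocks until the PS scheduler signals shutdown
  }
}
```

With this approach, the action would typically be launched on a separate thread (it blocks until the servers stop), but Spark would account for the server resources rather than seeing them as free.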