WeichenXu123 commented on code in PR #39267:
URL: https://github.com/apache/spark/pull/39267#discussion_r1072300627


##########
python/pyspark/ml/torch/distributor.py:
##########
@@ -325,8 +329,15 @@ def _create_torchrun_command(
             torchrun_args = ["--standalone", "--nnodes=1"]
             processes_per_node = num_processes
         else:
-            pass
-            # TODO(SPARK-41592): Handle distributed training
+            master_addr, master_port = os.environ["MASTER_ADDR"], os.environ["MASTER_PORT"]
+            node_rank = os.environ["RANK"]
+            torchrun_args = [
+                f"--nnodes={num_processes}",
+                f"--node_rank={node_rank}",
+                f"--rdzv_endpoint={master_addr}:{master_port}",
+                "--rdzv_id=0",
+            ]  # TODO: setup random ID that is gleaned from env variables
+            processes_per_node = 1

Review Comment:
   We don't need to set `preexec_fn=sigterm_on_parent_death` when executing the `torch_run_process_wrapper` subprocess.
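   For context, a minimal sketch of what the comment suggests: launching the wrapper with a plain `subprocess.Popen` call and no `preexec_fn` hook. The module path, CLI shape, and function name below are assumptions for illustration, not the actual distributor code.

   import subprocess
   import sys

   def launch_wrapper(torchrun_args, training_args):
       # No preexec_fn=sigterm_on_parent_death here: per the comment above,
       # torch_run_process_wrapper itself monitors the parent process and
       # terminates the torchrun child when the parent dies, so the extra
       # SIGTERM-on-parent-death hook would be redundant.
       return subprocess.Popen(
           [sys.executable, "-m", "pyspark.ml.torch.torch_run_process_wrapper"]
           + torchrun_args
           + training_args,
           stdout=subprocess.PIPE,
           stderr=subprocess.STDOUT,
       )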
########## python/pyspark/ml/torch/distributor.py: ########## @@ -325,8 +329,15 @@ def _create_torchrun_command( torchrun_args = ["--standalone", "--nnodes=1"] processes_per_node = num_processes else: - pass - # TODO(SPARK-41592): Handle distributed training + master_addr, master_port = os.environ["MASTER_ADDR"], os.environ["MASTER_PORT"] + node_rank = os.environ["RANK"] + torchrun_args = [ + f"--nnodes={num_processes}", + f"--node_rank={node_rank}", + f"--rdzv_endpoint={master_addr}:{master_port}", + "--rdzv_id=0", + ] # TODO: setup random ID that is gleaned from env variables + processes_per_node = 1 Review Comment: We don't need setting `preexec_fn=sigterm_on_parent_death` when executing `torch_run_process_wrapper` subprocess -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org --------------------------------------------------------------------- To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org