ConeyLiu commented on issue #26239: [SPARK-29582][PYSPARK] Unify the behavior of pyspark.TaskContext with spark core
URL: https://github.com/apache/spark/pull/26239#issuecomment-545928895
 
 
   Yeah, we use rdd.mapPartitionsWithIndex to start up distributed DL model 
training, similar to horovod.spark. Now we want to switch to barrier mode, 
and we also want to reduce the code differences between Scala and Python. I 
think this patch would be useful for other people as well.
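   The per-partition training pattern described above can be sketched in plain 
Python. This is an illustrative sketch only, not code from the patch: the 
function shape is the one `rdd.mapPartitionsWithIndex` expects, and the name 
`train_partition` and the stand-in data are assumptions made for the example. 
In barrier mode, roughly the same body would run under 
`rdd.barrier().mapPartitions(...)`, with `BarrierTaskContext.get()` providing 
the task info and a `barrier()` synchronization point.

```python
# Illustrative sketch (not from the patch): the shape of a per-partition
# training function, as it would be passed to
# rdd.mapPartitionsWithIndex(train_partition) on a real RDD.
def train_partition(index, iterator):
    """Use the partition index as the worker rank for distributed training."""
    records = list(iterator)       # materialize this partition's data
    # ... initialize the DL worker with rank=index, train on `records` ...
    yield (index, len(records))    # e.g. report rank and record count back

# Plain-Python stand-in for an RDD with two partitions, so the sketch runs
# without a Spark cluster:
partitions = [["a", "b"], ["c", "d", "e"]]
results = [next(train_partition(i, iter(p))) for i, p in enumerate(partitions)]
```

   Migrating to barrier execution keeps the function body largely intact; the 
main change is obtaining the rank and synchronization from the task context 
rather than from the index argument, which is why aligning the Python 
TaskContext behavior with Scala reduces the migration diff.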
   
   On Oct 24, 2019, at 9:12 PM, Jiang Xingbo 
<[email protected]> wrote:
   
   
   This is useful when people switch from normal code to barrier code
   
   Do you have any use cases where people want to reuse their production code 
and migrate to barrier execution mode?
   
