xwu99 commented on pull request #33385:
URL: https://github.com/apache/spark/pull/33385#issuecomment-888382433


   @mridulm From the last failed tests, something is wrong with
   
   override val cpus: Int = SparkEnv.get.conf.get(config.CPUS_PER_TASK),
   
   [info]   java.lang.NullPointerException:
   [info]   at org.apache.spark.TaskContextImpl$.$lessinit$greater$default$10(TaskContextImpl.scala:57)
   [info]   at org.apache.spark.TaskContext$.empty(TaskContext.scala:70)
   
   It looks like we just need to fix this specific case by passing cpus=1 for empty(), since it's a fake task?
   
   Could you advise on how to set this properly? Thanks!
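
   One possible shape for that fix (a minimal sketch only, not Spark's actual code; `Env`, `Conf`, and `TaskCtx` are illustrative stand-ins for `SparkEnv`, the conf, and `TaskContextImpl`) is to make the constructor default null-safe and fall back to 1:

   ```scala
   // Illustrative sketch: guard a constructor default against a global
   // (like SparkEnv.get) that can be null in unit tests.
   case class Conf(cpusPerTask: Int)

   object Env {
     // Stands in for SparkEnv.get; null when no environment has been set up.
     var get: Conf = null
   }

   class TaskCtx(
       // Fall back to cpus = 1 when Env.get is null, e.g. for a fake empty() task.
       val cpus: Int = Option(Env.get).map(_.cpusPerTask).getOrElse(1))
   ```

   With the environment unset, `new TaskCtx().cpus` evaluates to 1 instead of throwing a NullPointerException; when the environment is present, the configured value is used. The default is re-evaluated at each construction, so a later-initialized environment is still picked up.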


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
