Ngone51 commented on PR #49350:
URL: https://github.com/apache/spark/pull/49350#issuecomment-2590063028

   On second thought, I decided to remove `Utils.isTesting` where 
`LocalSparkCluster.get` is already defined. `local-cluster` mode is [documented 
to be used in unit tests 
only](https://spark.apache.org/docs/3.5.4/submitting-applications.html#master-urls).
 And in Spark, we do not have an explicit `Utils.isTesting` check where 
`local-cluster` mode is enabled. We all tacitly acknowledge that 
`local-cluster` is equivalent to `Utils.isTesting=true`, and it has been this way 
for a while. So users take the risk on themselves when using `local-cluster` mode 
outside of unit tests. Given this, I think we can remove `Utils.isTesting` to 
simplify. cc @JoshRosen @LuciferYang 
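
   To illustrate the reasoning: the master URL itself already signals test-only usage, so a separate `Utils.isTesting` check is redundant wherever `local-cluster` is detected. Below is a minimal sketch, assuming a hypothetical helper `isLocalClusterMaster` (not Spark's actual API); the regex mirrors the documented `local-cluster[N,cores,memory]` master URL format:

   ```scala
   object LocalClusterCheck {
     // Matches the documented local-cluster master URL shape,
     // e.g. "local-cluster[2,1,1024]" (workers, cores per worker, memory MiB).
     private val LocalClusterRegex =
       """local-cluster\[\s*(\d+)\s*,\s*(\d+)\s*,\s*(\d+)\s*\]""".r

     // Hypothetical helper: true iff the master URL requests local-cluster
     // mode, which by convention implies a testing context.
     def isLocalClusterMaster(master: String): Boolean = master match {
       case LocalClusterRegex(_, _, _) => true
       case _                          => false
     }
   }
   ```

   With a check like this, any call site gated on `local-cluster` mode can drop the extra `Utils.isTesting` condition without changing behavior in tests.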


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

