Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/19760
  
    There are quite a lot of applications that use just `SparkContext`. And 
`SparkConf` is generally created before anything else (e.g. in yarn-cluster 
mode, the Spark code instantiates `SparkConf` before even calling any user 
code).
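
    To illustrate the pattern being described, here is a minimal sketch of a 
`SparkContext`-only application (the app name is hypothetical): `SparkConf` is 
constructed first and then handed to `SparkContext`, so any configuration 
handling tied to `SparkConf` runs before the rest of the application.

    ```scala
    import org.apache.spark.{SparkConf, SparkContext}

    object ExampleApp {
      def main(args: Array[String]): Unit = {
        // SparkConf is created before anything else in the application...
        val conf = new SparkConf().setAppName("ExampleApp")
        // ...and only then is the SparkContext built from it.
        val sc = new SparkContext(conf)
        try {
          val count = sc.parallelize(1 to 100).filter(_ % 2 == 0).count()
          println(s"even numbers: $count")
        } finally {
          sc.stop()
        }
      }
    }
    ```

    In yarn-cluster mode the framework itself performs the `SparkConf` 
creation step before this user `main` method is ever invoked.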
