Github user jiangxb1987 commented on the issue:
https://github.com/apache/spark/pull/20347
My major concern is that, if there is an existing `SparkContext`, some of the confs
you set may not take effect, as described in `SparkContext.getOrCreate()`. It's
hard to enumerate all the use cases, but I'm sure some of them pass specific
confs to create a new `JavaSparkContext`, so I tend to keep the current behavior
here.
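To illustrate what I mean, here is a minimal sketch (the master/app-name values are made up) of how `SparkContext.getOrCreate()` silently ignores the confs of a second caller once a context is already active:
```scala
import org.apache.spark.{SparkConf, SparkContext}

object GetOrCreateConfExample {
  def main(args: Array[String]): Unit = {
    // The first caller creates the context with its own confs.
    val first = SparkContext.getOrCreate(
      new SparkConf().setMaster("local[2]").setAppName("first"))

    // A later caller asks for different settings, but because a context is
    // already active, getOrCreate returns the existing one and these confs
    // are not applied.
    val second = SparkContext.getOrCreate(
      new SparkConf().setMaster("local[4]").setAppName("second"))

    println(second eq first)                        // true: same instance
    println(second.getConf.get("spark.app.name"))   // prints "first"

    first.stop()
  }
}
```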
On the other hand, the following comment is copied from the documentation of the
class `JavaSparkContext`:
```
 * Only one SparkContext may be active per JVM. You must `stop()` the active SparkContext before
 * creating a new one. This limitation may eventually be removed; see SPARK-2243 for more details.
```
If that is the case, there should be no active `SparkContext` before we
initiate the `JavaSparkContext`, so the change doesn't bring any advantage in
that regard.