Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19509
> The effect of this change is that now it's possible to initialize multiple,
> non-concurrent SparkContext instances in the same JVM.
@vanzin , do we support this now? As I remember, it was not supported before.
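
For reference, a minimal sketch of the pattern the quoted sentence describes: two SparkContext instances in the same JVM that never coexist, since the second is only created after the first has been fully stopped. The app names and the `local[2]` master are illustrative, not from the PR:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object NonConcurrentContexts {
  def main(args: Array[String]): Unit = {
    // First context: use it, then tear it down completely.
    val sc1 = new SparkContext(
      new SparkConf().setAppName("first").setMaster("local[2]"))
    println(sc1.parallelize(1 to 10).count())  // 10
    sc1.stop()

    // Second context in the same JVM, created only after sc1.stop()
    // has returned, so the two contexts are never active concurrently.
    val sc2 = new SparkContext(
      new SparkConf().setAppName("second").setMaster("local[2]"))
    println(sc2.parallelize(1 to 5).count())  // 5
    sc2.stop()
  }
}
```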
