My bad, I just fired up a spark-shell, created a second SparkContext, and it
worked fine. I basically did a parallelize and collect with both
SparkContexts.
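
For reference, this is roughly what I ran (just a sketch; the local master,
app name, and sample data are placeholders, and spark-shell already provides
the first context as sc):

    import org.apache.spark.{SparkConf, SparkContext}

    // second context alongside the shell's built-in sc
    val sc2 = new SparkContext(
      new SparkConf().setMaster("local[*]").setAppName("second-context"))

    // parallelize + collect against both contexts
    val a = sc.parallelize(1 to 100).collect()
    val b = sc2.parallelize(1 to 100).collect()
    println(a.length + " / " + b.length)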

Thanks
Best Regards

On Fri, Nov 7, 2014 at 3:17 PM, Tobias Pfeiffer <t...@preferred.jp> wrote:

> Hi,
>
> On Fri, Nov 7, 2014 at 4:58 PM, Akhil Das <ak...@sigmoidanalytics.com>
> wrote:
>>
>> That doc was created in the initial days (Spark 0.8.0); you can of
>> course create multiple SparkContexts in the same driver program now.
>>
>
> You sure about that? According to
> http://apache-spark-user-list.1001560.n3.nabble.com/Is-spark-context-in-local-mode-thread-safe-td7275.html
> (June 2014), "you currently can’t have multiple SparkContext objects in the
> same JVM".
>
> Tobias
>
