My bad, I just fired up a spark-shell, created a new SparkContext, and it
worked fine. I basically did a parallelize and collect with both
SparkContexts.
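For reference, the experiment described above can be sketched roughly as follows. This is a hedged sketch, not the exact commands from the thread: the master URL and app names are placeholders, and in a spark-shell the first context (`sc`) already exists, so only the second one would be created by hand.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Placeholder master URL -- point this at your own Spark master.
val master = "spark://localhost:7077"

// Two separate contexts against the same master.
// (In spark-shell, sc1 would be the pre-created `sc`.)
val sc1 = new SparkContext(new SparkConf().setAppName("ctx1").setMaster(master))
val sc2 = new SparkContext(new SparkConf().setAppName("ctx2").setMaster(master))

// Run a trivial parallelize/collect job through each context.
println(sc1.parallelize(1 to 10).collect().sum)
println(sc2.parallelize(1 to 10).collect().sum)

sc1.stop()
sc2.stop()
```

Note that this requires a running Spark master to execute; later Spark 1.x releases also gated this behavior behind the `spark.driver.allowMultipleContexts` setting, so whether two contexts in one JVM are permitted depends on the Spark version in use.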
Thanks
Best Regards
On Fri, Nov 7, 2014 at 3:17 PM, Tobias Pfeiffer t...@preferred.jp wrote:
Hi,
On Fri, Nov 7, 2014 at 4:58 PM,
Hi,
quick question: I found this:
http://docs.sigmoidanalytics.com/index.php/Problems_and_their_Solutions#Multiple_SparkContext:Failed_to_bind_to:.2F127.0.1.1:45916
My main question: is this constraint still valid? Am I not allowed to have
two SparkContexts pointing to the same Spark Master in
Hi Pawel,
That doc was created during the initial days (Spark 0.8.0); you can of
course create multiple SparkContexts in the same driver program now.
Thanks
Best Regards
On Thu, Nov 6, 2014 at 9:30 PM, Paweł Szulc paul.sz...@gmail.com wrote: