Are you trying to do this in the shell? The shell is instantiated with a
SparkContext named sc, so creating a StreamingContext from a new conf starts
a second SparkContext in the same JVM.
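You can pass the existing context to StreamingContext instead of a conf, or
stop it first if you really need a fresh one. A minimal sketch (assumes you
are in spark-shell, where sc already exists):

```scala
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Reuse the shell's existing SparkContext rather than
// constructing a new one from a SparkConf:
val ssc = new StreamingContext(sc, Seconds(1))

// Alternatively, if you need a brand-new context,
// stop the shell's context first:
// sc.stop()
```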

-Ilya Ganelin

On Sat, Dec 27, 2014 at 5:24 PM, tfrisk <tfris...@gmail.com> wrote:

>
> Hi,
>
> Doing:
>    val ssc = new StreamingContext(conf, Seconds(1))
>
> and getting:
>    Only one SparkContext may be running in this JVM (see SPARK-2243). To
> ignore this error, set spark.driver.allowMultipleContexts = true.
>
>
> But I don't think I have another SparkContext running. Is there any way
> I can check this or force-kill it?  I've tried restarting the server out of
> desperation, but I still get the same issue.  I was not getting this earlier
> today.
>
> Any help much appreciated .....
>
> Thanks,
>
> Thomas
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Problem-with-StreamingContext-getting-SPARK-2243-tp20869.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
