Re: Having multiple spark context

2017-01-30 Thread Rohit Verma
Subject: Re: Having multiple spark context. In general, in a single JVM (which is basically running in local mode) you have only one SparkContext. However, you can stop the current SparkContext with sc.stop(). HTH, Dr Mich Talebzadeh

RE: Having multiple spark context

2017-01-30 Thread jasbir.sing
Subject: Re: Having multiple spark context. In general, in a single JVM (which is basically running in local mode) you have only one SparkContext. However, you can stop the current SparkContext with sc.stop(). HTH, Dr Mich Talebzadeh

Re: Having multiple spark context

2017-01-30 Thread Mich Talebzadeh
In general, in a single JVM (which is basically running in local mode) you have only one SparkContext. However, you can stop the current SparkContext with sc.stop(). HTH, Dr Mich Talebzadeh. LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
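A minimal sketch of the restart pattern Mich describes, written against the Spark 2.x-era Scala API; the standalone master URL below is a hypothetical placeholder:

    import org.apache.spark.{SparkConf, SparkContext}

    object RestartContext {
      def main(args: Array[String]): Unit = {
        // First context: the one-per-JVM slot is now taken.
        val local = new SparkContext(
          new SparkConf().setAppName("first").setMaster("local[*]"))
        println(local.master)  // local[*]

        // Release the slot before requesting a context with a different master.
        local.stop()

        // Only now can a second context be constructed in this JVM.
        // spark://master-host:7077 is a made-up cluster URL.
        val clustered = new SparkContext(
          new SparkConf().setAppName("second").setMaster("spark://master-host:7077"))
        println(clustered.master)
        clustered.stop()
      }
    }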

Re: Having multiple spark context

2017-01-29 Thread vincent gromakowski
A clustering library is necessary to manage multiple JVMs, Akka Cluster for instance. On Jan 30, 2017 8:01 AM, "Rohit Verma" wrote: > Hi, > If I am right, you need to launch the other context from another JVM. If you > try to launch another context from the same JVM, it will return the > existing context.
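Akka Cluster itself is a larger topic, but the underlying point (one SparkContext per JVM, so a second context needs a second process) can be sketched without it. The driver class name and jar path below are hypothetical:

    import scala.sys.process._

    object LaunchSecondJvm {
      def main(args: Array[String]): Unit = {
        // Run the local-mode job in its own JVM via spark-submit, while this
        // JVM keeps its own (cluster-mode) SparkContext.
        val exitCode = Seq(
          "spark-submit",
          "--master", "local[*]",
          "--class", "com.example.LocalJob",  // hypothetical driver class
          "/path/to/local-job.jar"            // hypothetical jar
        ).!
        println(s"local-mode job exited with code $exitCode")
      }
    }

A clustering library such as Akka Cluster would then coordinate and supervise those separate JVMs; it does not create the SparkContexts itself.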

Re: Having multiple spark context

2017-01-29 Thread Rohit Verma
Hi, If I am right, you need to launch the other context from another JVM. If you try to launch another context from the same JVM, it will return the existing context. Rohit. On Jan 30, 2017, at 12:24 PM, Mark Hamstra wrote: More than one SparkContext in a single application is not supported.
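A small sketch of the behavior Rohit describes, using SparkContext.getOrCreate (available since Spark 1.4): the second request in the same JVM hands back the existing instance, and its conf is ignored:

    import org.apache.spark.{SparkConf, SparkContext}

    object SameJvmDemo {
      def main(args: Array[String]): Unit = {
        val first = SparkContext.getOrCreate(
          new SparkConf().setAppName("a").setMaster("local[2]"))
        val second = SparkContext.getOrCreate(
          new SparkConf().setAppName("b").setMaster("local[4]"))

        println(first eq second)  // true: the same instance came back
        println(second.master)    // local[2]: the second conf was ignored
        first.stop()
      }
    }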

Re: Having multiple spark context

2017-01-29 Thread Mark Hamstra
More than one SparkContext in a single application is not supported. On Sun, Jan 29, 2017 at 9:08 PM, jasbir.sing wrote: > Hi, > I have a requirement in which my application creates one SparkContext in > distributed mode and another SparkContext in local mode. > When I create them, my complete application works on only one SparkContext (the one created in distributed mode).
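What "not supported" looks like in practice, sketched against Spark 2.x-era behavior: constructing a second context directly throws a SparkException (the spark.driver.allowMultipleContexts escape hatch existed at the time but was discouraged):

    import org.apache.spark.{SparkConf, SparkContext, SparkException}

    object SecondContextFails {
      def main(args: Array[String]): Unit = {
        val first = new SparkContext(
          new SparkConf().setAppName("a").setMaster("local[2]"))
        try {
          // A second direct construction in the same JVM is rejected.
          new SparkContext(new SparkConf().setAppName("b").setMaster("local[2]"))
        } catch {
          case e: SparkException =>
            println(s"second context rejected: ${e.getMessage}")
        } finally {
          first.stop()
        }
      }
    }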

Having multiple spark context

2017-01-29 Thread jasbir.sing
Hi, I have a requirement in which my application creates one SparkContext in distributed mode and another SparkContext in local mode. When I create them, my complete application works on only one SparkContext (the one created in distributed mode). The second SparkContext is not getting created.
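A quick diagnostic sketch for this symptom, assuming the contexts are obtained via SparkContext.getOrCreate: print which context the application actually holds; if the distributed-mode context came first, the local request is silently ignored:

    import org.apache.spark.{SparkConf, SparkContext}

    // If a distributed-mode context already exists in this JVM, this call
    // returns it unchanged and the local[*] request below has no effect.
    val sc = SparkContext.getOrCreate(
      new SparkConf().setAppName("diagnose").setMaster("local[*]"))
    println(s"master = ${sc.master}, isLocal = ${sc.isLocal}")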