I suppose it doesn't work using spark-shell either? Can you confirm?
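
In the meantime I'm going to try forcing the driver host explicitly, in case
the problem is simply the driver binding to the loopback interface. A sketch of
what I have in mind (spark.driver.host is a standard Spark property; the
address is the one pyspark resolved below, so this is untested guesswork on my
side) :

command : spark-shell --conf spark.driver.host=192.168.10.100 -i test.scala

or in the script itself, before the StreamingContext is created :

import org.apache.spark._
import org.apache.spark.streaming._
val conf = new SparkConf().
             setAppName("test-scala").
             setMaster("yarn-client").
             set("spark.driver.host", "192.168.10.100") // address the AM should dial back to (sketch)
val ssc = new StreamingContext(conf, Seconds(3))

Exporting SPARK_LOCAL_IP=192.168.10.100 before launching the shell should have
a similar effect.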

Thanks
> On Apr 3, 2016, at 03:39, Mich Talebzadeh <mich.talebza...@gmail.com> wrote:
> 
> This works fine for me
> 
> val sparkConf = new SparkConf().
>              setAppName("StreamTest").
>              setMaster("yarn-client").
>              set("spark.cores.max", "12").
>              set("spark.driver.allowMultipleContexts", "true").
>              set("spark.hadoop.validateOutputSpecs", "false")
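> 
> and the streaming context on top of that conf along these lines (a minimal 
> sketch; the 55-second batch interval is only inferred from the gap between 
> the two batch times below) :
> 
> import org.apache.spark.streaming._
> val ssc = new StreamingContext(sparkConf, Seconds(55))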
> 
> -------------------------------------------
> Time: 1459669805000 ms
> -------------------------------------------
> 
> -------------------------------------------
> Time: 1459669860000 ms
> -------------------------------------------
> (Sun Apr 3 08:35:01 BST 2016  ======= Sending messages from rhes5)
> 
> 
> Dr Mich Talebzadeh
> 
> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
> 
> http://talebzadehmich.wordpress.com
> 
> 
> On 3 April 2016 at 03:34, Cyril Scetbon <cyril.scet...@free.fr> wrote:
> Nobody has any idea?
> 
> > On Mar 31, 2016, at 23:22, Cyril Scetbon <cyril.scet...@free.fr> wrote:
> >
> > Hi,
> >
> > I'm having issues creating a StreamingContext with Scala using 
> > spark-shell. The driver registers itself on the localhost interface, and 
> > the ApplicationMaster cannot reach it there :
> >
> > ERROR ApplicationMaster: Failed to connect to driver at localhost:47257, 
> > retrying ...
> >
> > I don't have this issue with Python; pyspark works fine (you can see it 
> > uses the machine's IP address) :
> >
> > ApplicationMaster: Driver now available: 192.168.10.100:43290
> >
> > I use similar code in both cases, though :
> >
> > test.scala :
> > --------------
> >
> > import org.apache.spark._
> > import org.apache.spark.streaming._
> > val app = "test-scala"
> > val conf = new SparkConf().setAppName(app).setMaster("yarn-client")
> > val ssc = new StreamingContext(conf, Seconds(3))
> >
> > command used : spark-shell -i test.scala
> >
> > test.py :
> > -----------
> >
> > from pyspark import SparkConf, SparkContext
> > from pyspark.streaming import StreamingContext
> > app = "test-python"
> > conf = SparkConf().setAppName(app).setMaster("yarn-client")
> > sc = SparkContext(conf=conf)
> > ssc = StreamingContext(sc, 3)
> >
> > command used : pyspark test.py
> >
> > Any idea why the Scala version can't instantiate it? I thought PySpark was 
> > essentially a wrapper around the Scala implementation under the hood, but 
> > it seems there are differences. Are there any parameters set by the Scala 
> > path but not by the Python one?
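> >
> > (For what it's worth, the driver address each shell actually registered 
> > can be read back from the live conf -- a small check, assuming the 
> > property is already set at that point :
> >
> > sc.getConf.get("spark.driver.host")
> >
> > The same getConf accessor exists on pyspark's SparkContext.)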
> >
> > Thanks
> > --
> > Cyril SCETBON
> >
> 
> 
