Multiple spark shell sessions

2014-10-01 Thread Sanjay Subramanian
hey guys

I am using spark 1.0.0+cdh5.1.0+41.
When two users try to run spark-shell, the first user's application shows as ACTIVE
in the web UI on port 18080, but the second user's application shows as WAITING.
The second shell prints a bunch of errors but still reaches the spark-shell prompt,
and sc.master seems to point to the correct master.

I tried controlling the number of cores in the spark-shell command with
--executor-cores 8, but that does not work.

thanks

sanjay 
   

  

Re: Multiple spark shell sessions

2014-10-01 Thread Matei Zaharia
You need to set --total-executor-cores to limit how many total cores the application
grabs on the cluster. --executor-cores only controls how many cores each individual
executor uses, and the application will still try to launch as many executors as it can.
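
For example, on a standalone cluster with 16 cores in total, something like the following should let two shells run side by side (just a sketch; the master URL and core counts here are placeholders, not values from your cluster):

  # first user: take at most 8 cores across the whole cluster
  spark-shell --master spark://<master-host>:7077 --total-executor-cores 8

  # second user: the other 8 cores are still free, so this shell gets scheduled
  # instead of sitting in WAITING
  spark-shell --master spark://<master-host>:7077 --total-executor-cores 8

Setting spark.cores.max in the application's configuration should have the same effect if you don't want to pass the flag on every launch.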

Matei

On Oct 1, 2014, at 4:29 PM, Sanjay Subramanian 
sanjaysubraman...@yahoo.com.INVALID wrote:

 hey guys
 
 I am using spark 1.0.0+cdh5.1.0+41.
 When two users try to run spark-shell, the first user's application shows as ACTIVE
 in the web UI on port 18080, but the second user's application shows as WAITING.
 The second shell prints a bunch of errors but still reaches the spark-shell prompt,
 and sc.master seems to point to the correct master.

 I tried controlling the number of cores in the spark-shell command with
 --executor-cores 8, but that does not work.
 
 thanks
 
 sanjay 
 
 



Re: Multiple spark shell sessions

2014-10-01 Thread Sanjay Subramanian
Awesome, thanks a ton. It works.
There is a clash on the UI port initially, but it looks like Spark creates a second UI
on port 4041 for the second user's spark-shell:

14/10/01 17:34:38 INFO JettyUtils: Failed to create UI at port, 4040. Trying again.
14/10/01 17:34:38 INFO JettyUtils: Error was: Failure(java.net.BindException: Address already in use)
14/10/01 17:34:38 INFO SparkUI: Started SparkUI at http://hadoop02:4041
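
It also looks like each user could pin their shell's UI to a known free port up front instead of relying on the retry; something like this sketch should do it (the port 4045 and the master URL are placeholders I am guessing at, and if your spark-shell does not accept --conf, the same spark.ui.port property can go into a properties file passed with --properties-file):

  # second user: pin the web UI to a known free port (4045 is arbitrary)
  spark-shell --master spark://<master-host>:7077 \
    --total-executor-cores 8 \
    --conf spark.ui.port=4045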
sanjay
  From: Matei Zaharia matei.zaha...@gmail.com
 To: Sanjay Subramanian sanjaysubraman...@yahoo.com 
Cc: user@spark.apache.org user@spark.apache.org 
 Sent: Wednesday, October 1, 2014 5:19 PM
 Subject: Re: Multiple spark shell sessions
   
You need to set --total-executor-cores to limit how many total cores the application
grabs on the cluster. --executor-cores only controls how many cores each individual
executor uses, and the application will still try to launch as many executors as it can.
Matei



Re: Multiple spark shell sessions

2014-09-05 Thread Andrew Ash
Hi Dhimant,

We also cleaned up these needless warnings on port failover in Spark 1.1 --
see https://issues.apache.org/jira/browse/SPARK-1902

Andrew


On Thu, Sep 4, 2014 at 7:38 AM, Dhimant dhimant84.jays...@gmail.com wrote:

 Thanks Yana,
 I am able to execute an application and commands via another session, and I
 also got a different port for the UI.

 Thanks,
 Dhimant







Multiple spark shell sessions

2014-09-04 Thread Dhimant
(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:121)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:120)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:263)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:120)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:56)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:913)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:142)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:56)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:104)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:56)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:930)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)






Re: Multiple spark shell sessions

2014-09-04 Thread Yana Kadiyska
 $.startJettyServer(JettyUtils.scala:205)
     at org.apache.spark.ui.WebUI.bind(WebUI.scala:99)
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:223)
     at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:957)
     at $line3.$read$$iwC$$iwC.<init>(<console>:8)
     at $line3.$read$$iwC.<init>(<console>:14)
     at $line3.$read.<init>(<console>:16)
     at $line3.$read$.<init>(<console>:20)
     at $line3.$read$.<clinit>(<console>)
     at $line3.$eval$.<init>(<console>:7)
     at $line3.$eval$.<clinit>(<console>)
     at $line3.$eval.$print(<console>)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:606)
     at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
     at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
     at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
     at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
     at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:121)
     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:120)
     at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:263)
     at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:120)
     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:56)
     at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:913)
     at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:142)
     at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:56)
     at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:104)
     at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:56)
     at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:930)
     at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
     at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
     at org.apache.spark.repl.Main$.main(Main.scala:31)
     at org.apache.spark.repl.Main.main(Main.scala)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:606)
     at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)







Re: Multiple spark shell sessions

2014-09-04 Thread Dhimant
Thanks Yana,
I am able to execute an application and commands via another session, and I
also got a different port for the UI.

Thanks,
Dhimant


