My colleagues and I have been working with Spark recently. We just set up a
new cluster that runs Spark on YARN. We each use IPython and write programs
in a notebook served over HTTP on a specific port (like 8888).
We each have our own notebook, and the odd thing is that if I run my
notebook first, my colleagues' notebooks get stuck as soon as they trigger
an action; and if one of them runs first, mine gets stuck instead.
Does a single PySpark deployment support only one notebook at a time? Or is
there some configuration we should figure out to make this work?
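
For example, is something like the following the right direction? The idea
would be to give each notebook's SparkContext a fixed, small slice of the
cluster so two contexts can coexist on YARN. (The property names are from
the Spark configuration docs; the app name, executor counts, and sizes are
just placeholders I made up.)

    # Sketch: cap each notebook's resource usage so the first SparkContext
    # does not grab every available YARN container.
    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
            .setMaster("yarn-client")              # Spark 1.x YARN client mode
            .setAppName("ai-notebook")             # placeholder app name
            .set("spark.executor.instances", "2")  # fixed, small executor count
            .set("spark.executor.memory", "2g")    # placeholder sizing
            .set("spark.executor.cores", "1"))

    sc = SparkContext(conf=conf)

My guess is that without caps like these, the first notebook's application
takes all the available containers in the YARN queue, so the second
notebook's jobs just wait for resources.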

@Nam reported the same problem here:
http://apache-spark-user-list.1001560.n3.nabble.com/Multiple-accesses-to-a-Spark-cluster-via-iPython-Notebook-td12162.html

Thanks
Ai


