I set up IPython Notebook to work with the PySpark shell, and now I'd like to use %run to effectively 'spark-submit' another Python Spark file and leave its objects accessible within the notebook.
I tried this, but got a "ValueError: Cannot run multiple SparkContexts at once" error. I then tried taking the 'sc = SparkContext()' line out of the .py file, but then the script couldn't access sc. How can I %run another Python Spark file within IPython Notebook?

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Run-Spark-job-from-within-iPython-Spark-tp24427.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
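One approach that is sometimes suggested for this situation is to have the script reuse an already-running context instead of unconditionally constructing a new one: recent PySpark versions expose SparkContext.getOrCreate() for exactly this. The sketch below illustrates the guard pattern with a stand-in Context class rather than the real pyspark.SparkContext, so it runs without a Spark installation; the class name and singleton attribute are assumptions for illustration only.

```python
# Sketch of the "get or create" guard pattern. Context is a hypothetical
# stand-in for pyspark.SparkContext, which likewise tracks a single
# active instance at module level and raises if a second one is created.
class Context:
    _active = None  # module-level singleton, akin to an active SparkContext

    def __init__(self):
        if Context._active is not None:
            # Mirrors PySpark's "Cannot run multiple SparkContexts at once"
            raise ValueError("Cannot run multiple contexts at once")
        Context._active = self

    @classmethod
    def get_or_create(cls):
        # Reuse the active context if one exists, otherwise create it --
        # the behavior SparkContext.getOrCreate() provides in PySpark.
        return cls._active if cls._active is not None else cls()


sc = Context.get_or_create()   # first call: creates the context
sc2 = Context.get_or_create()  # second call: reuses it, no ValueError
assert sc is sc2
```

With this pattern in the .py file (sc = SparkContext.getOrCreate() instead of sc = SparkContext()), the script should work both standalone via spark-submit and inside a notebook session where a context already exists.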