Merge pull request #256 from MLnick/master
Fix 'IPYTHON=1 ./pyspark' throwing ValueError
This fixes an annoying issue where running ```IPYTHON=1 ./pyspark``` resulted
in:
```
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /__ / .__/\_,_/_/ /_/\_\   version 0.8.0
      /_/

Using Python version 2.7.5 (default, Jun 20 2013 11:06:30)
Spark context avaiable as sc.
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
/usr/local/lib/python2.7/site-packages/IPython/utils/py3compat.pyc in execfile(fname, *where)
    202             else:
    203                 filename = fname
--> 204             __builtin__.execfile(filename, *where)

/Users/Nick/workspace/scala/spark-0.8.0-incubating-bin-hadoop1/python/pyspark/shell.py in <module>()
     30 add_files = os.environ.get("ADD_FILES").split(',') if os.environ.get("ADD_FILES") != None else None
     31
---> 32 sc = SparkContext(os.environ.get("MASTER", "local"), "PySparkShell", pyFiles=add_files)
     33
     34 print """Welcome to

/Users/Nick/workspace/scala/spark-0.8.0-incubating-bin-hadoop1/python/pyspark/context.pyc in __init__(self, master, jobName, sparkHome, pyFiles, environment, batchSize)
     70         with SparkContext._lock:
     71             if SparkContext._active_spark_context:
---> 72                 raise ValueError("Cannot run multiple SparkContexts at once")
     73             else:
     74                 SparkContext._active_spark_context = self

ValueError: Cannot run multiple SparkContexts at once
```
The issue arises because IPython previously didn't respect ```$PYTHONSTARTUP```,
but since at least version 1.0.0 it does. As a result the PySpark startup script
ran twice, and the second run attempted to create a second SparkContext.
Technically this fix might break for older versions of IPython, but most users
should be able to upgrade IPython to at least 1.0.0 (and should be encouraged
to do so :).
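The merged change itself isn't reproduced in this message (only the diffstat at
the bottom: 6 insertions and 2 deletions in the ```pyspark``` launcher), but the
mechanism is visible in the traceback above: the launcher exports
```$PYTHONSTARTUP``` pointing at ```python/pyspark/shell.py```, and under a
PYTHONSTARTUP-honoring IPython the startup script was effectively executed a
second time, tripping the "Cannot run multiple SparkContexts at once" guard.
A rough sketch of the kind of launcher change involved, not the literal patch
(the ```FWDIR``` and ```PYSPARK_PYTHON``` variable names and the exact IPython
flags are assumptions):
```
# Sketch only -- not the literal diff from this commit.
# The launcher exports PYTHONSTARTUP so the plain Python shell
# runs shell.py on startup:
export PYTHONSTARTUP="$FWDIR/python/pyspark/shell.py"

if [[ "$IPYTHON" = "1" ]]; then
  # IPython >= 1.0.0 honors PYTHONSTARTUP itself, so don't hand it the
  # startup script a second time: clear the variable and pass the script
  # once, with -i to drop into the interactive shell afterwards.
  unset PYTHONSTARTUP
  exec ipython $IPYTHON_OPTS -i "$FWDIR/python/pyspark/shell.py"
else
  # The plain Python shell still picks up shell.py via PYTHONSTARTUP.
  exec "$PYSPARK_PYTHON" "$@"
fi
```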
New behaviour:
```
Nicks-MacBook-Pro:incubator-spark-mlnick Nick$ IPYTHON=1 ./pyspark
Python 2.7.5 (default, Jun 20 2013, 11:06:30)
Type "copyright", "credits" or "license" for more information.
IPython 1.1.0 -- An enhanced Interactive Python.
?         -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help      -> Python's own help system.
object?   -> Details about 'object', use 'object??' for extra details.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/Nick/workspace/scala/incubator-spark-mlnick/tools/target/scala-2.9.3/spark-tools-assembly-0.9.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/Nick/workspace/scala/incubator-spark-mlnick/assembly/target/scala-2.9.3/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
13/12/12 13:08:15 WARN Utils: Your hostname, Nicks-MacBook-Pro.local resolves to a loopback address: 127.0.0.1; using 10.0.0.4 instead (on interface en0)
13/12/12 13:08:15 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
13/12/12 13:08:15 INFO Slf4jEventHandler: Slf4jEventHandler started
13/12/12 13:08:15 INFO SparkEnv: Registering BlockManagerMaster
13/12/12 13:08:15 INFO DiskBlockManager: Created local directory at /var/folders/_l/06wxljt13wqgm7r08jlc44_r0000gn/T/spark-local-20131212130815-0e76
13/12/12 13:08:15 INFO MemoryStore: MemoryStore started with capacity 326.7 MB.
13/12/12 13:08:15 INFO ConnectionManager: Bound socket to port 53732 with id = ConnectionManagerId(10.0.0.4,53732)
13/12/12 13:08:15 INFO BlockManagerMaster: Trying to register BlockManager
13/12/12 13:08:15 INFO BlockManagerMasterActor$BlockManagerInfo: Registering block manager 10.0.0.4:53732 with 326.7 MB RAM
13/12/12 13:08:15 INFO BlockManagerMaster: Registered BlockManager
13/12/12 13:08:15 INFO HttpBroadcast: Broadcast server started at http://10.0.0.4:53733
13/12/12 13:08:15 INFO SparkEnv: Registering MapOutputTracker
13/12/12 13:08:15 INFO HttpFileServer: HTTP File server directory is /var/folders/_l/06wxljt13wqgm7r08jlc44_r0000gn/T/spark-8f40e897-8211-4628-a7a8-755562d5244c
13/12/12 13:08:16 INFO SparkUI: Started Spark Web UI at http://10.0.0.4:4040
2013-12-12 13:08:16.337 java[56801:4003] Unable to load realm info from SCDynamicStore
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /__ / .__/\_,_/_/ /_/\_\   version 0.9.0-SNAPSHOT
      /_/

Using Python version 2.7.5 (default, Jun 20 2013 11:06:30)
Spark context avaiable as sc.
```
Project: http://git-wip-us.apache.org/repos/asf/incubator-spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-spark/commit/d2ced6d5
Tree: http://git-wip-us.apache.org/repos/asf/incubator-spark/tree/d2ced6d5
Diff: http://git-wip-us.apache.org/repos/asf/incubator-spark/diff/d2ced6d5
Branch: refs/heads/master
Commit: d2ced6d58c5e8aea23f909c2fc4ac11aa1b55607
Parents: c55e698 bb5277b
Author: Josh Rosen <[email protected]>
Authored: Sun Dec 15 14:11:34 2013 -0800
Committer: Josh Rosen <[email protected]>
Committed: Sun Dec 15 14:11:34 2013 -0800
----------------------------------------------------------------------
pyspark | 8 ++++++--
1 file changed, 6 insertions(+), 2 deletions(-)
----------------------------------------------------------------------