[
https://issues.apache.org/jira/browse/SPARK-1904?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Eric Lee updated SPARK-1904:
----------------------------
Description:
When attempting to run the bin/pyspark interactive shell against a Mesos cluster, this assignment to MASTER used to work:
{code}
MASTER=zk://1.2.3.4:2181,2.3.4.5:2181,5.6.7.8:2181/mesos
{code}
Now, with the spark-v1.0.0 tag, this yields an error:
{code}
Type "help", "copyright", "credits" or "license" for more information.
command is ['/tmp/spark-1.0.0-rc10/./bin/spark-submit', 'pyspark-shell']
Error: Master must start with yarn, mesos, spark, or local
Run with --help for usage help or --verbose for debug output
Traceback (most recent call last):
  File "/tmp/spark-1.0.0-rc10/python/pyspark/shell.py", line 41, in <module>
    SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
  File "/tmp/spark-1.0.0-rc10/python/pyspark/context.py", line 203, in setSystemProperty
    SparkContext._ensure_initialized()
  File "/tmp/spark-1.0.0-rc10/python/pyspark/context.py", line 180, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "/tmp/spark-1.0.0-rc10/python/pyspark/java_gateway.py", line 52, in launch_gateway
    gateway_port = int(proc.stdout.readline())
ValueError: invalid literal for int() with base 10: ''
>>>
{code}
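For context, the ValueError looks like a secondary symptom rather than the root cause: launch_gateway() expects spark-submit to print the Py4J gateway port on its first line of stdout, and when spark-submit rejects the master URL and exits early, that line is empty. A minimal Python sketch of the pattern, assuming the spark-submit path from the log above:
{code}
import subprocess

# Approximation of what java_gateway.py does around line 52: launch
# spark-submit and parse the gateway port from its first stdout line.
# When spark-submit errors out on the master check, readline() returns
# an empty string, and int('') raises the ValueError shown above.
proc = subprocess.Popen(
    ["/tmp/spark-1.0.0-rc10/bin/spark-submit", "pyspark-shell"],
    stdout=subprocess.PIPE,
)
gateway_port = int(proc.stdout.readline())
{code}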
Now either this assignment:
{code}
export MASTER=mesos://10.33.9.56:5050
{code}
or this one:
{code}
export MASTER=mesos://zk://1.2.3.4:2181,2.3.4.5:2181,5.6.7.8:2181/mesos
{code}
is necessary to start bin/pyspark. Is specifying a zk:// URI intentionally no longer supported, or is this a regression?
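For reference, the rejection appears to come from a simple prefix check on the master URL in spark-submit; a bare zk:// URI fails the check, while the mesos://zk://... form passes. The real check lives in Spark's Scala launcher code, so this Python sketch is only illustrative:
{code}
# Illustrative sketch of the prefix check implied by the error message;
# the actual validation is in Spark's Scala spark-submit code.
ALLOWED_PREFIXES = ("yarn", "mesos", "spark", "local")

def validate_master(master):
    if not master.startswith(ALLOWED_PREFIXES):
        raise SystemExit("Error: Master must start with yarn, mesos, spark, or local")

validate_master("mesos://zk://1.2.3.4:2181,2.3.4.5:2181,5.6.7.8:2181/mesos")  # accepted
validate_master("zk://1.2.3.4:2181,2.3.4.5:2181,5.6.7.8:2181/mesos")          # rejected
{code}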
was:
When attempting to run the bin/pyspark interactive shell against a Mesos cluster, this assignment to MASTER used to work:
{code}
MASTER=zk://10.32.43.173:2181,10.33.9.56:2181,10.116.167.95:2181/mesos
{code}
Now, with the spark-v1.0.0 tag, this yields an error:
{code}
Type "help", "copyright", "credits" or "license" for more information.
command is ['/tmp/spark-1.0.0-rc10/./bin/spark-submit', 'pyspark-shell']
Error: Master must start with yarn, mesos, spark, or local
Run with --help for usage help or --verbose for debug output
Traceback (most recent call last):
  File "/tmp/spark-1.0.0-rc10/python/pyspark/shell.py", line 41, in <module>
    SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
  File "/tmp/spark-1.0.0-rc10/python/pyspark/context.py", line 203, in setSystemProperty
    SparkContext._ensure_initialized()
  File "/tmp/spark-1.0.0-rc10/python/pyspark/context.py", line 180, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "/tmp/spark-1.0.0-rc10/python/pyspark/java_gateway.py", line 52, in launch_gateway
    gateway_port = int(proc.stdout.readline())
ValueError: invalid literal for int() with base 10: ''
>>>
{code}
Now either this assignment:
{code}
export MASTER=mesos://10.33.9.56:5050
{code}
or this one:
{code}
export MASTER=mesos://zk://10.32.43.173:2181,10.33.9.56:2181,10.116.167.95:2181/mesos
{code}
is necessary to start bin/pyspark. Is specifying a zk:// URI intentionally no longer supported, or is this a regression?
> ZooKeeper URI in spark-env.sh no longer working w/ bin/pyspark
> --------------------------------------------------------------
>
> Key: SPARK-1904
> URL: https://issues.apache.org/jira/browse/SPARK-1904
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 1.0.0
> Environment: Ubuntu AMI in EC2; build from the tags/spark-v1.0.0 tag
> {code}
> $ lsb_release -r
> Release: 12.04
> $ uname -a
> Linux ip-10-97-159-136 3.2.0-23-virtual #36-Ubuntu SMP Tue Apr 10 22:29:03 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux
> $
> {code}
> Reporter: Eric Lee
>
> When attempting to run the bin/pyspark interactive shell against a Mesos cluster, this assignment to MASTER used to work:
> {code}
> MASTER=zk://1.2.3.4:2181,2.3.4.5:2181,5.6.7.8:2181/mesos
> {code}
> Now, with the spark-v1.0.0 tag, this yields an error:
> {code}
> Type "help", "copyright", "credits" or "license" for more information.
> command is ['/tmp/spark-1.0.0-rc10/./bin/spark-submit', 'pyspark-shell']
> Error: Master must start with yarn, mesos, spark, or local
> Run with --help for usage help or --verbose for debug output
> Traceback (most recent call last):
>   File "/tmp/spark-1.0.0-rc10/python/pyspark/shell.py", line 41, in <module>
>     SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
>   File "/tmp/spark-1.0.0-rc10/python/pyspark/context.py", line 203, in setSystemProperty
>     SparkContext._ensure_initialized()
>   File "/tmp/spark-1.0.0-rc10/python/pyspark/context.py", line 180, in _ensure_initialized
>     SparkContext._gateway = gateway or launch_gateway()
>   File "/tmp/spark-1.0.0-rc10/python/pyspark/java_gateway.py", line 52, in launch_gateway
>     gateway_port = int(proc.stdout.readline())
> ValueError: invalid literal for int() with base 10: ''
> >>>
> {code}
> Now either this assignment:
> {code}
> export MASTER=mesos://10.33.9.56:5050
> {code}
> or this one:
> {code}
> export MASTER=mesos://zk://1.2.3.4:2181,2.3.4.5:2181,5.6.7.8:2181/mesos
> {code}
> is necessary to start bin/pyspark. Is specifying a zk:// URI intentionally no longer supported, or is this a regression?