Spark 2.0.1 ships with a different py4j library than Spark 1.6 (your traceback shows the client's py4j-0.9, while Spark 2.0.x bundles py4j 0.10.x), so the two sides cannot talk to each other.

You will probably run into other problems mixing versions, though - is there a 
reason you can't run Spark 1.6 on the client?
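
Alternatively, you could bring the client up to the cluster's version. A minimal sketch, assuming a Linux client with network access to the Apache archive; the download URL layout is real, but the exact py4j zip name should be checked against the actual 2.0.1 release:

```shell
# Sketch: give the notebook client the same Spark build as the cluster,
# so the bundled py4j matches. Paths and the py4j filename are assumptions.
wget https://archive.apache.org/dist/spark/spark-2.0.1/spark-2.0.1-bin-hadoop2.7.tgz
tar xzf spark-2.0.1-bin-hadoop2.7.tgz -C /usr/local

export SPARK_HOME=/usr/local/spark-2.0.1-bin-hadoop2.7
# The py4j zip name varies per release; list $SPARK_HOME/python/lib to confirm.
export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.3-src.zip:$PYTHONPATH"
```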


_____________________________
From: Klaus Schaefers <klaus.schaef...@philips.com>
Sent: Wednesday, November 30, 2016 2:44 AM
Subject: PySpark to remote cluster
To: <user@spark.apache.org>


Hi,

I want to connect a local Jupyter Notebook to a remote Spark cluster.
The cluster is running Spark 2.0.1, while the Jupyter notebook is based on
Spark 1.6 and runs in a Docker image (Link). I try to initialize the
SparkContext like this:

import pyspark
sc = pyspark.SparkContext('spark://<IP>:7077')
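
Before connecting, a quick client-side sanity check is to compare the two version strings on major.minor. A small sketch; the helper below is hypothetical, not part of PySpark:

```python
def spark_compatible(client_version: str, cluster_version: str) -> bool:
    """Hypothetical helper: treat two Spark versions as compatible only
    when their major.minor components match (e.g. "2.0.1" vs "2.0.2")."""
    return client_version.split(".")[:2] == cluster_version.split(".")[:2]

# The setup from this thread: a 1.6 client against a 2.0.1 cluster.
print(spark_compatible("1.6.0", "2.0.1"))   # False
print(spark_compatible("2.0.0", "2.0.1"))   # True
```

(On the client you can read the local version from pyspark.__version__ or sc.version.)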

However, this gives me the following exception:


ERROR:py4j.java_gateway:Error while sending or receiving.
Traceback (most recent call last):
  File "/usr/local/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 746, in send_command
    raise Py4JError("Answer from Java side is empty")
py4j.protocol.Py4JError: Answer from Java side is empty

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 626, in send_command
    response = connection.send_command(command)
  File "/usr/local/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 750, in send_command
    raise Py4JNetworkError("Error while sending or receiving", e)
py4j.protocol.Py4JNetworkError: Error while sending or receiving

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 740, in send_command
    answer = smart_decode(self.stream.readline()[:-1])
  File "/opt/conda/lib/python3.5/socket.py", line 575, in readinto
    return self._sock.recv_into(b)
ConnectionResetError: [Errno 104] Connection reset by peer
ERROR:py4j.java_gateway:An error occurred while trying to connect to the Java server
Traceback (most recent call last):
  File "/usr/local/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 746, in send_command
    raise Py4JError("Answer from Java side is empty")
py4j.protocol.Py4JError: Answer from Java side is empty

...

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.5/site-packages/IPython/utils/PyColorize.py", line 262, in format2
    for atoken in generate_tokens(text.readline):
  File "/opt/conda/lib/python3.5/tokenize.py", line 597, in _tokenize
    raise TokenError("EOF in multi-line statement", (lnum, 0))
tokenize.TokenError: ('EOF in multi-line statement', (2, 0))


Is this error caused by the different Spark versions?

Best,

Klaus




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/PySpark-to-remote-cluster-tp28147.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.



