[ https://issues.apache.org/jira/browse/SPARK-13767?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15877251#comment-15877251 ]
yb commented on SPARK-13767:
----------------------------

I am using Spark 1.6.2 with Python 2.7, and I am seeing the same error that Venkata reported. If anyone has any thoughts on why this occurs, I would really appreciate it. The error is as follows:

Traceback (most recent call last):
  File "E:/scrapy_workspace/testddd/dds/SparkTest.py", line 10, in <module>
    conf = SparkConf().setAppName("pySparkDemo").setMaster("local")
  File "D:\Python27\lib\site-packages\pyspark\conf.py", line 104, in __init__
    SparkContext._ensure_initialized()
  File "D:\Python27\lib\site-packages\pyspark\context.py", line 245, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "D:\Python27\lib\site-packages\pyspark\java_gateway.py", line 116, in launch_gateway
    java_import(gateway.jvm, "org.apache.spark.SparkConf")
  File "D:\Python27\lib\site-packages\py4j\java_gateway.py", line 79, in java_import
    answer = gateway_client.send_command(command)
  File "D:\Python27\lib\site-packages\py4j\java_gateway.py", line 624, in send_command
    connection = self._get_connection()
  File "D:\Python27\lib\site-packages\py4j\java_gateway.py", line 579, in _get_connection
    connection = self._create_connection()
  File "D:\Python27\lib\site-packages\py4j\java_gateway.py", line 585, in _create_connection
    connection.start()
  File "D:\Python27\lib\site-packages\py4j\java_gateway.py", line 697, in start
    raise Py4JNetworkError(msg, e)
py4j.protocol.Py4JNetworkError: An error occurred while trying to connect to the Java server

> py4j.protocol.Py4JNetworkError: An error occurred while trying to connect to
> the Java server
> --------------------------------------------------------------------------------------------
>
>                 Key: SPARK-13767
>                 URL: https://issues.apache.org/jira/browse/SPARK-13767
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>            Reporter: Poonam Agrawal
>
> I am trying to create a Spark context object with the following commands in
> pyspark:
>
> from pyspark import SparkContext, SparkConf
> conf = SparkConf().setAppName('App_name') \
>     .setMaster("spark://local-or-remote-ip:7077") \
>     .set('spark.cassandra.connection.host', 'cassandra-machine-ip') \
>     .set('spark.storage.memoryFraction', '0.2') \
>     .set('spark.rdd.compress', 'true') \
>     .set('spark.streaming.blockInterval', 500) \
>     .set('spark.serializer', 'org.apache.spark.serializer.KryoSerializer') \
>     .set('spark.scheduler.mode', 'FAIR') \
>     .set('spark.mesos.coarse', 'true')
> sc = SparkContext(conf=conf)
>
> but I am getting the following error:
>
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "/usr/local/lib/spark-1.4.1/python/pyspark/conf.py", line 106, in __init__
>     self._jconf = _jvm.SparkConf(loadDefaults)
>   File "/usr/local/lib/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 766, in __getattr__
>   File "/usr/local/lib/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 362, in send_command
>   File "/usr/local/lib/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 318, in _get_connection
>   File "/usr/local/lib/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 325, in _create_connection
>   File "/usr/local/lib/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 432, in start
> py4j.protocol.Py4JNetworkError: An error occurred while trying to connect to
> the Java server
>
> I am getting the same error when executing:
> conf = SparkConf().setAppName("App_name").setMaster("spark://127.0.0.1:7077")

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
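A Py4JNetworkError at gateway launch generally means the driver-side JVM either never started (e.g. java is not on the PATH) or its port is not accepting connections. As a rough diagnostic sketch (not part of Spark or py4j; the host and port values are placeholders you would substitute for your own setup), one can first confirm java is resolvable and then probe a TCP port directly:

```python
import shutil
import socket


def java_on_path():
    """Return the path to the java executable, or None if it is not on PATH."""
    return shutil.which("java")


def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to (host, port) succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    # Self-contained demo: bind a throwaway listener so the probe has a
    # known-open port to hit, then probe it. Against a real cluster you
    # would probe the master instead, e.g. port_open("127.0.0.1", 7077).
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))
    server.listen(1)
    port = server.getsockname()[1]
    print("java found:", java_on_path())
    print("listener reachable:", port_open("127.0.0.1", port))
    server.close()
```

If the probe against the master port fails while a local listener succeeds, the problem is connectivity (firewall, wrong host/port, master not running) rather than the PySpark code itself.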