[
https://issues.apache.org/jira/browse/SPARK-5004?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14375312#comment-14375312
]
Eric O. LEBIGOT (EOL) commented on SPARK-5004:
----------------------------------------------
I did some more investigation: the error originates from the connection opened
around PythonWorkerFactory.scala:75, which contains
{{val socket = new Socket(daemonHost, daemonPort)}}
I do not know much about Java, Scala, or proxy handling, but here is the best
information I found:
- Someone who hit the exact same error on the exact same call posted a solution
at http://stackoverflow.com/a/7085409/42973 (see the sketch after this list)
- Official documentation about how to handle proxies:
http://docs.oracle.com/javase/8/docs/technotes/guides/net/proxies.html
- Maybe relevant: use {{ProxySelector.getDefault()}}
(http://docs.oracle.com/javase/7/docs/api/java/net/ProxySelector.html,
http://blog.itpub.net/9844649/viewspace-1021778/)
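For concreteness, here is a minimal, untested sketch (in the spirit of the Stack
Overflow answer above) of what the change might look like around
PythonWorkerFactory.scala:75; {{daemonHost}} and {{daemonPort}} stand for the
existing values in that file, and I am assuming the connection to the local
Python daemon can simply bypass the JVM-wide proxy settings:
{code}
import java.net.{InetSocketAddress, Proxy, Socket}

// Untested sketch: pass Proxy.NO_PROXY so that this local connection to the
// Python daemon is made directly, ignoring the JVM-wide SOCKS proxy settings.
// daemonHost and daemonPort are the existing values in PythonWorkerFactory.
val socket = new Socket(Proxy.NO_PROXY)
socket.connect(new InetSocketAddress(daemonHost, daemonPort))
{code}
An alternative might be to consult {{ProxySelector.getDefault()}} and only skip
the proxy for loopback addresses, but I have not tried either approach.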
Fixing this problem would mean that people can run PySpark while using a SOCKS
proxy, which would be very convenient (it would, for example, allow me to use
PySpark _and_ Google at the same time :) ).
> PySpark does not handle SOCKS proxy
> -----------------------------------
>
> Key: SPARK-5004
> URL: https://issues.apache.org/jira/browse/SPARK-5004
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 1.2.0, 1.3.0
> Reporter: Eric O. LEBIGOT (EOL)
>
> PySpark cannot run even the quick start examples when a SOCKS proxy is used.
> Turning off the SOCKS proxy makes PySpark work.
> The Scala-shell version is not affected and works even when a SOCKS proxy is
> used.
> Is there a quick workaround, while waiting for this to be fixed?
> Here is the error message (printed, e.g., when .count() is called):
> {code}
> >>> 14/12/30 17:13:44 WARN PythonWorkerFactory: Failed to open socket to Python daemon:
> java.net.SocketException: Malformed reply from SOCKS server
>     at java.net.SocksSocketImpl.readSocksReply(SocksSocketImpl.java:129)
>     at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:503)
>     at java.net.Socket.connect(Socket.java:579)
>     at java.net.Socket.connect(Socket.java:528)
>     at java.net.Socket.<init>(Socket.java:425)
>     at java.net.Socket.<init>(Socket.java:241)
>     at org.apache.spark.api.python.PythonWorkerFactory.createSocket$1(PythonWorkerFactory.scala:75)
>     at org.apache.spark.api.python.PythonWorkerFactory.liftedTree1$1(PythonWorkerFactory.scala:90)
>     at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:89)
>     at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:62)
>     at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:102)
>     at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
>     at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:263)
>     at org.apache.spark.rdd.RDD.iterator(RDD.scala:230)
>     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
>     at org.apache.spark.scheduler.Task.run(Task.scala:56)
>     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>     at java.lang.Thread.run(Thread.java:724)
> {code}