RE: PySpark SPARK_CLASSPATH doesn't distribute jars to executors
Neither of those helped. I'm still getting a ClassNotFoundException on the workers.

From: Denny Lee [denny.g@gmail.com]
Sent: Tuesday, February 24, 2015 7:21 PM
To: Michael Nazario; dev@spark.apache.org
Subject: Re: PySpark SPARK_CLASSPATH doesn't distribute jars to executors

Can you try extraClassPath or driver-class-path and see if that helps with the distribution?

On Tue, Feb 24, 2015 at 14:54 Michael Nazario <mnaza...@palantir.com> wrote:

Has anyone experienced a problem with SPARK_CLASSPATH not distributing jars for PySpark? I have a detailed description of what I tried in the ticket below, and this does not appear to be a configuration problem. The only other explanation I can think of is that the configuration for distributing jars for PySpark changed between Spark 1.1.1 and Spark 1.2.1.

https://issues.apache.org/jira/browse/SPARK-5977

Thanks,
Michael
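For readers landing on this thread: unlike the extraClassPath options, spark.jars (the programmatic form of spark-submit --jars) actually ships the listed jars to the executors rather than referencing paths that must already exist on each node. A minimal sketch of that route, with a hypothetical jar path, might look like:

    from pyspark import SparkConf, SparkContext

    # spark.jars uploads the listed jars to the executors and adds them
    # to their classpaths, instead of merely pointing at a path that must
    # already be present on every node. Jar path is hypothetical.
    conf = SparkConf().set("spark.jars", "/opt/jars/my-udfs.jar")
    sc = SparkContext(conf=conf)

    # Equivalent at launch time:
    #   spark-submit --jars /opt/jars/my-udfs.jar my_app.py

Whether this works around the 1.2.1 behavior described here is exactly what SPARK-5977 tracks.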
PySpark SPARK_CLASSPATH doesn't distribute jars to executors
Has anyone experienced a problem with SPARK_CLASSPATH not distributing jars for PySpark? I have a detailed description of what I tried in the ticket below, and this does not appear to be a configuration problem. The only other explanation I can think of is that the configuration for distributing jars for PySpark changed between Spark 1.1.1 and Spark 1.2.1.

https://issues.apache.org/jira/browse/SPARK-5977

Thanks,
Michael
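For context, a minimal sketch of the Spark 1.1.x-era pattern the question refers to (jar path hypothetical): SPARK_CLASSPATH is exported before the JVM launches, e.g. in conf/spark-env.sh on each node, or from the driver script itself:

    import os

    # Hypothetical jar path. SPARK_CLASSPATH must be in the environment
    # before the SparkContext starts the JVM gateway; workers read their
    # own value from conf/spark-env.sh on each machine.
    os.environ["SPARK_CLASSPATH"] = "/opt/jars/my-udfs.jar"

    from pyspark import SparkContext
    sc = SparkContext(appName="spark-classpath-repro")

The report here is that in 1.2.1 this no longer results in the jar being available on the executors.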
Re: PySpark SPARK_CLASSPATH doesn't distribute jars to executors
Can you try extraClassPath or driver-class-path and see if that helps with the distribution?

On Tue, Feb 24, 2015 at 14:54 Michael Nazario <mnaza...@palantir.com> wrote:

Has anyone experienced a problem with SPARK_CLASSPATH not distributing jars for PySpark? I have a detailed description of what I tried in the ticket below, and this does not appear to be a configuration problem. The only other explanation I can think of is that the configuration for distributing jars for PySpark changed between Spark 1.1.1 and Spark 1.2.1.

https://issues.apache.org/jira/browse/SPARK-5977

Thanks,
Michael
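For reference, a minimal sketch of what this suggestion might look like from PySpark (jar path hypothetical). Note that the extraClassPath options prepend an existing path to the classpath but do not copy the jar anywhere, so the file must already be present on every node:

    from pyspark import SparkConf, SparkContext

    # Hypothetical jar path. extraClassPath entries are NOT shipped to
    # the cluster; the file must already exist at this path on every
    # worker for the executors to load it.
    conf = SparkConf().set("spark.executor.extraClassPath",
                           "/opt/jars/my-udfs.jar")
    sc = SparkContext(conf=conf)

    # The driver-side equivalent generally has to be set before the
    # driver JVM starts, e.g. on the command line:
    #   spark-submit --driver-class-path /opt/jars/my-udfs.jar app.py
    # or via spark.driver.extraClassPath in conf/spark-defaults.conf.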