Neither of those helped. I'm still getting a ClassNotFoundException on the 
workers.
________________________________
From: Denny Lee [denny.g....@gmail.com]
Sent: Tuesday, February 24, 2015 7:21 PM
To: Michael Nazario; dev@spark.apache.org
Subject: Re: PySpark SPARK_CLASSPATH doesn't distribute jars to executors

Can you try extraClassPath or driver-class-path and see if that helps with the 
distribution?
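For reference, a sketch of what setting both would look like (the jar path and script name are placeholders, not tested commands):

    spark-submit \
      --driver-class-path /path/to/deps.jar \
      --conf spark.executor.extraClassPath=/path/to/deps.jar \
      my_script.py

One caveat: spark.executor.extraClassPath only prepends entries to the executor classpath; it does not ship the jar, so the file would need to already exist at that path on each worker.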
On Tue, Feb 24, 2015 at 14:54 Michael Nazario 
<mnaza...@palantir.com> wrote:
Has anyone experienced a problem with SPARK_CLASSPATH not distributing jars 
for PySpark? I have a detailed description of what I tried in the ticket below, 
and this does not appear to be a configuration problem on my end. The only 
other explanation I can think of is that the way jars are distributed for 
PySpark changed between Spark 1.1.1 and Spark 1.2.1.
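
For concreteness, this is roughly the shape of the setup in question (paths 
are placeholders; the exact commands I tried are in the ticket):

    # The kind of invocation that worked on 1.1.1 but now raises
    # ClassNotFoundException on the workers:
    SPARK_CLASSPATH=/path/to/deps.jar bin/pyspark

    # One alternative, letting Spark ship the jar to executors itself:
    bin/spark-submit --jars /path/to/deps.jar my_script.py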

https://issues.apache.org/jira/browse/SPARK-5977

Thanks,
Michael
