Can you try extraClassPath or driver-class-path and see if that helps with the distribution?

On Tue, Feb 24, 2015 at 14:54 Michael Nazario <mnaza...@palantir.com> wrote:
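For reference, a minimal sketch of what that suggestion looks like in practice. The jar path and application name here are placeholders; `spark.executor.extraClassPath` only adds the jar to the classpath on each executor, so the jar must already exist at that path on every node (or be shipped with `--jars`):

```shell
# Option 1: pass the jar on the driver classpath and ship it to executors.
# /path/to/mylib.jar and my_app.py are hypothetical placeholders.
spark-submit \
  --driver-class-path /path/to/mylib.jar \
  --jars /path/to/mylib.jar \
  my_app.py

# Option 2: set the classpath options explicitly via --conf.
# With extraClassPath alone, the jar must be present at the same
# path on every worker node; it is not distributed for you.
spark-submit \
  --conf spark.driver.extraClassPath=/path/to/mylib.jar \
  --conf spark.executor.extraClassPath=/path/to/mylib.jar \
  my_app.py
```

These options replaced the deprecated SPARK_CLASSPATH environment variable, which may explain a behavior change between releases.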
> Has anyone experienced a problem with the SPARK_CLASSPATH not distributing
> jars for PySpark? I have a detailed description of what I tried in the
> ticket below, and this seems like a problem that is not a configuration
> problem. The only other case I can think of is that configuration changed
> between Spark 1.1.1 and Spark 1.2.1 about distributing jars for PySpark.
>
> https://issues.apache.org/jira/browse/SPARK-5977
>
> Thanks,
> Michael
>