Has anyone run into a problem with SPARK_CLASSPATH not distributing jars for PySpark? I have a detailed description of what I tried in the ticket below, and it does not appear to be a configuration problem on my end. The only other explanation I can think of is that the configuration for distributing jars to PySpark changed between Spark 1.1.1 and Spark 1.2.1.
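For anyone who would rather not open the ticket, the general shape of what I am doing is roughly the sketch below. The jar path and app name are placeholders rather than the exact values from the ticket, and setting spark.jars is just the alternative I would expect to keep working:

    # Sketch only: "/path/to/extra.jar" and the app name are placeholders,
    # not the exact setup described in the ticket.
    #
    # The environment-variable approach the ticket is about (set in the
    # shell before launching pyspark):
    #   export SPARK_CLASSPATH=/path/to/extra.jar
    #
    # Explicitly shipping the jar with the application instead:
    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
            .setAppName("jar-distribution-check")
            .set("spark.jars", "/path/to/extra.jar"))
    sc = SparkContext(conf=conf)

    # If the jar really is distributed, executor-side code that loads a
    # class from it succeeds; otherwise it fails with ClassNotFoundException.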
https://issues.apache.org/jira/browse/SPARK-5977

Thanks,
Michael