[ 
https://issues.apache.org/jira/browse/SPARK-5977?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michael Nazario closed SPARK-5977.
----------------------------------
    Resolution: Not a Problem

This was my misunderstanding of how settings in Spark are supposed to be used. 
Setting "spark.jars" distributes the jars to the executors for you.

> PySpark SPARK_CLASSPATH doesn't distribute jars to executors
> ------------------------------------------------------------
>
>                 Key: SPARK-5977
>                 URL: https://issues.apache.org/jira/browse/SPARK-5977
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.2.1
>            Reporter: Michael Nazario
>              Labels: jars
>
> In PySpark 1.2.1, I added a jar for Avro support similar to the one in 
> spark-examples; I need this jar to convert Avro files into rows. However, in 
> the worker logs I kept getting a ClassNotFoundException for my 
> AvroToPythonConverter class (see the sketch after this report for how such a 
> converter is wired in).
> I double-checked the jar to confirm the class was in there, which it was. I 
> used the SPARK_CLASSPATH environment variable to place the jar on the 
> executor and driver classpaths, and the application web UI showed the jar on 
> both classpaths.
> The final thing I tried was explicitly copying the jar onto each worker at 
> the same location as on my driver machine. That made the 
> ClassNotFoundException go away.
> This makes me think that the jars which were shipped to the workers in 1.1.1 
> are no longer being sent over.
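
For context, a converter like the one described in the report is typically 
hooked in through PySpark's Hadoop input-format API, along the lines of the 
Avro example in spark-examples. The sketch below assumes a SparkContext sc 
(for instance the one from the sketch above), uses an illustrative input path, 
and assumes a package name for the reporter's converter class, which the 
ticket does not give:

    # Read Avro records through a JVM-side converter. The converter class must
    # be on the executor classpath, which is why the missing jar surfaced as a
    # ClassNotFoundException in the worker logs rather than on the driver.
    avro_rdd = sc.newAPIHadoopFile(
        "hdfs:///data/events.avro",                        # illustrative path
        "org.apache.avro.mapreduce.AvroKeyInputFormat",
        "org.apache.avro.mapred.AvroKey",
        "org.apache.hadoop.io.NullWritable",
        keyConverter="com.example.AvroToPythonConverter")  # package name assumed
    rows = avro_rdd.map(lambda kv: kv[0]).collect()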



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
