AFAIK, Spark only supports adding local files to the executor class path, not
files on HDFS. If you are using spark-ec2, you can rsync the jar across the
machines with something like '/root/spark-ec2/copy-dir <path_to_jar>'.
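For example, a minimal sketch (the jar location, application class, and
application jar below are placeholders, assuming a cluster launched with
spark-ec2 as root):

    # rsync the jar from the master to the same path on every slave
    /root/spark-ec2/copy-dir /root/jars/commons-math3-3.1.1.jar

    # then point the executors at the local copy, e.g. via spark-submit
    spark-submit \
      --conf spark.executor.extraClassPath=/root/jars/commons-math3-3.1.1.jar \
      --class com.example.MyApp /root/jars/my-app.jar

The same setting can also go into conf/spark-defaults.conf instead of being
passed on the command line.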

Shivaram

On Sun, Nov 9, 2014 at 9:08 PM, lev <kat...@gmail.com> wrote:

> I set spark.executor.extraClassPath to the path of commons-math3-3.1.1.jar
> and it worked.
> Thanks a lot!
>
> It only worked for me when the jar was on the machine's local filesystem.
> Is there a way to make it work when the jar is on HDFS?
>
> I tried pointing it at the file on HDFS (with or without the "hdfs://"
> prefix) and it didn't work.
>
> Thanks,
> Lev.
>
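Regarding the question in the quoted message: since
spark.executor.extraClassPath only accepts local filesystem paths, one
possible workaround (not from this thread, just a sketch with placeholder
paths, assuming each node has an HDFS client configured) is to stage the jar
from HDFS onto local disk on every node before starting the application:

    # run on each node; both paths are placeholders
    hdfs dfs -get /libs/commons-math3-3.1.1.jar /opt/spark-extra/commons-math3-3.1.1.jar

and then set spark.executor.extraClassPath to the local
/opt/spark-extra/commons-math3-3.1.1.jar path as above.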
