Hmmm, you might be suffering from SPARK-1719. Not sure what the proper workaround is, but it sounds like your native libs are not in any of the "standard" lib directories; one workaround might be to copy them there, or add their location to /etc/ld.so.conf and run ldconfig (I'm assuming Linux).
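Keep in mind the load has to succeed on every executor, not just the driver. Here's a minimal sketch of what I mean (the library name "mynative" and the app name are placeholders for your own):

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch: force the native library to load in every executor JVM.
    // "mynative" means libmynative.so on Linux -- substitute your JNI lib.
    object JniOnYarn {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("jni-on-yarn"))
        val n = sc.parallelize(1 to 100, 4).mapPartitions { iter =>
          // Runs inside the executor JVM: loadLibrary searches
          // java.library.path, and the dynamic linker then resolves the
          // library's own dependencies via the standard dirs / ld.so.conf.
          System.loadLibrary("mynative")
          iter
        }.count()
        println("processed " + n + " records")
        sc.stop()
      }
    }

Once the .so is in a standard directory (or listed in ld.so.conf, after ldconfig) on each NodeManager host, that loadLibrary call should succeed. Normally spark.executor.extraLibraryPath would be the cleaner way to point executors at a custom directory, but that's exactly what SPARK-1719 is about not being applied on YARN, hence the workaround above.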
On Thu, Sep 25, 2014 at 8:34 AM, taqilabon <g945...@gmail.com> wrote:
> Hi all,
>
> I tried to run my Spark job on YARN.
> In my application, I need to call third-party JNI libraries in a Spark job.
> However, I can't find a way to make the Spark job load my native libraries.
> Does anyone know how to solve this problem?
> Thanks.
>
> Ziv Huang

--
Marcelo