You're putting those into spark-env.sh? Try setting LD_LIBRARY_PATH as
well; that might help.
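
For example, in conf/spark-env.sh (assuming /path is the directory that
contains the compiled code):

  # conf/spark-env.sh -- prepend the directory holding the native code
  export LD_LIBRARY_PATH=/path:$LD_LIBRARY_PATH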

Also, where is the exception coming from? You have to set this for both
the cluster and the driver, since they are configured independently.
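
If it helps, something like this in conf/spark-defaults.conf should cover
both sides (I believe the extraLibraryPath properties exist in Spark 1.x,
but double-check the configuration docs for your version):

  spark.driver.extraLibraryPath    /path
  spark.executor.extraLibraryPath  /path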

Cheers!
Andrew

On Sun, Oct 5, 2014 at 1:06 PM, Tom <thubregt...@gmail.com> wrote:

> Hi,
>
> I am trying to call some C code; let's say the compiled file is /path/code,
> and it has chmod +x. When I call it directly, it works. Now I want to call
> it from Spark 1.1. My problem is not building it into Spark, but making
> sure Spark can find it.
>
> I have tried:
> SPARK_DAEMON_JAVA_OPTS="-Djava.library.path=/path"
> SPARK_DAEMON_JAVA_OPTS="-Djava.library.path=/path/code"
> SPARK_CLASSPATH="-Djava.library.path=/path"
> SPARK_JAVA_OPTS="-Djava.library.path=/path"
> SPARK_LIBRARY_PATH="/path"
>
> All with and without the quotes. Every single time I get:
> Exception in thread "main" java.lang.UnsatisfiedLinkError: code (Not found
> in java.library.path)
>
> Any advice? Thanks!
>
