Hi all,

I'm hoping someone can help. I'm hitting an UnsatisfiedLinkError when
calling some native libraries after submitting the application in "cluster"
deploy mode; in local mode the same application runs fine. Here's what my
setup looks like.

The .so files are stored at the same location on every cluster node. The
node that submits the application has its LD_LIBRARY_PATH pointing at the
native libraries. Within the application, I set the executor environment
variable like so:

import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf()
  .setAppName("SparkDriver")
  .setExecutorEnv("LD_LIBRARY_PATH", sys.env("LD_LIBRARY_PATH")))

I thought .setExecutorEnv would give each executor the correct path to the
native libraries, but apparently not (maybe because, in cluster mode,
sys.env("LD_LIBRARY_PATH") is read on whichever node runs the driver rather
than on the machine I submit from?).
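
In case it helps to see it spelled out, here's a sketch of the config-based
route I could try instead, using spark.executor.extraLibraryPath; the
"/opt/native/libs" path below is just a placeholder for the actual shared
location of the .so files:

import org.apache.spark.{SparkConf, SparkContext}

// Sketch: let Spark prepend the shared .so directory to each executor
// JVM's native library search path, instead of forwarding the
// environment variable from the submitting machine.
// "/opt/native/libs" is a placeholder for the real directory.
val sc = new SparkContext(new SparkConf()
  .setAppName("SparkDriver")
  .set("spark.executor.extraLibraryPath", "/opt/native/libs"))

From what I understand, the driver-side equivalent
(spark.driver.extraLibraryPath, or --driver-library-path on spark-submit)
would need to be passed at submit time in cluster mode, since the driver JVM
is already running by the time this SparkConf is built.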

I'm not sure what I'm missing in my setup; any help is greatly appreciated.

Thanks!




