Hello,
I'm trying to use a native library in Spark.
I am using a simple standalone cluster with one master and one worker.
Following the documentation, I edited spark-defaults.conf and set:
spark.driver.extraClassPath=/opt/eTOX_spark/lib/org.RDKit.jar
spark.driver.extraLibraryPath=/opt/eTOX_spark/lib/
spark.executor.extraLibraryPath=/opt/eTOX_spark/lib/
In /opt/eTOX_spark/lib/ there are three .so files, which are wrapped in
org.RDKit.jar.
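For context, here is a minimal sketch (not the RDKit code itself) of why a SWIG/JNI wrapper can throw UnsatisfiedLinkError: the generated Java class resolves its native methods only after the matching shared object (here libGraphMolWrap.so, per the log below) has been loaded into the same JVM, and extraLibraryPath only extends the search path; it does not load anything by itself:

```java
// Sketch: how a JNI wrapper resolves native code, and where an
// UnsatisfiedLinkError comes from. "GraphMolWrap" is the base name of
// libGraphMolWrap.so from the lib directory above.
public class NativeLoadCheck {
    public static void main(String[] args) {
        // spark.driver/executor.extraLibraryPath is meant to end up in
        // java.library.path, the directory list searched by loadLibrary.
        System.out.println(System.getProperty("java.library.path"));
        try {
            // Succeeds only if the .so is on java.library.path of *this*
            // JVM (driver or executor, depending on where the code runs).
            System.loadLibrary("GraphMolWrap");
            System.out.println("native library loaded");
        } catch (UnsatisfiedLinkError e) {
            // Any later call to a native method fails the same way.
            System.out.println("not loadable: " + e.getMessage());
        }
    }
}
```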
But when I try to submit a job that uses the native library, I get:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.RDKit.RDKFuncsJNI.RWMol_MolFromSmiles__SWIG_3(Ljava/lang/String;)J
at org.RDKit.RDKFuncsJNI.RWMol_MolFromSmiles__SWIG_3(Native Method)
at org.RDKit.RWMol.MolFromSmiles(RWMol.java:426)
at models.spark.sources.eTOX_DB$.main(eTOX.scala:54)
at models.spark.sources.eTOX_DB.main(eTOX.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:727)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I run submit.sh with the following parameters:
/opt/spark/bin/spark-submit --verbose \
  --class "models.spark.sources.eTOX_DB" \
  --master spark://localhost.localdomain:7077 \
  target/scala-2.10/etox_spark_2.10-1.0.jar
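For completeness, the same settings from spark-defaults.conf can also be passed directly on the command line (a sketch using spark-submit's standard --driver-class-path, --driver-library-path, and --conf flags, with the same paths as above):

```shell
# Equivalent invocation with the class/library paths passed explicitly
# instead of being read from spark-defaults.conf.
/opt/spark/bin/spark-submit --verbose \
  --class "models.spark.sources.eTOX_DB" \
  --master spark://localhost.localdomain:7077 \
  --driver-class-path /opt/eTOX_spark/lib/org.RDKit.jar \
  --driver-library-path /opt/eTOX_spark/lib/ \
  --conf spark.executor.extraLibraryPath=/opt/eTOX_spark/lib/ \
  target/scala-2.10/etox_spark_2.10-1.0.jar
```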
The full output is:
Using properties file: /opt/spark/conf/spark-defaults.conf
Adding default property: spark.driver.extraLibraryPath=/opt/eTOX_spark/lib/
Adding default property: spark.driver.extraClassPath=/opt/eTOX_spark/lib/org.RDKit.jar
Adding default property: spark.executor.extraLibraryPath=/opt/eTOX_spark/lib/
Parsed arguments:
master spark://localhost.localdomain:7077
deployMode null
executorMemory null
executorCores null
totalExecutorCores null
propertiesFile /opt/spark/conf/spark-defaults.conf
driverMemory null
driverCores null
driverExtraClassPath /opt/eTOX_spark/lib/org.RDKit.jar
driverExtraLibraryPath /opt/eTOX_spark/lib/
driverExtraJavaOptions null
supervise false
queue null
numExecutors null
files null
pyFiles null
archives null
mainClass models.spark.sources.eTOX_DB
primaryResource file:/opt/eTOX_spark/target/scala-2.10/etox_spark_2.10-1.0.jar
name models.spark.sources.eTOX_DB
childArgs []
jars null
packages null
packagesExclusions null
repositories null
verbose true
Spark properties used, including those specified through --conf and those from the properties file /opt/spark/conf/spark-defaults.conf:
spark.executor.extraLibraryPath -> /opt/eTOX_spark/lib/
spark.driver.extraLibraryPath -> /opt/eTOX_spark/lib/
spark.driver.extraClassPath -> /opt/eTOX_spark/lib/org.RDKit.jar
Main class:
models.spark.sources.eTOX_DB
Arguments:
System properties:
spark.executor.extraLibraryPath -> /opt/eTOX_spark/lib/
spark.driver.extraLibraryPath -> /opt/eTOX_spark/lib/
SPARK_SUBMIT -> true
spark.app.name -> models.spark.sources.eTOX_DB
spark.jars -> file:/opt/eTOX_spark/target/scala-2.10/etox_spark_2.10-1.0.jar
spark.submit.deployMode -> client
spark.master -> spark://localhost.localdomain:7077
spark.driver.extraClassPath -> /opt/eTOX_spark/lib/org.RDKit.jar
Classpath elements:
file:/opt/eTOX_spark/target/scala-2.10/etox_spark_2.10-1.0.jar
Buffer(/opt/jdk1.8.0_45/jre/lib/amd64/libzip.so)
Loading libraries
Buffer(/opt/jdk1.8.0_45/jre/lib/amd64/libzip.so, /opt/eTOX_spark/lib/libboost_thread.1.48.0.so, /opt/eTOX_spark/lib/libboost_system.1.48.0.so, /opt/eTOX_spark/lib/libGraphMolWrap.so)
Loading libraries
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/11/25 16:27:32 INFO SparkContext: Running Spark version 1.6.0-SNAPSHOT
15/11/25 16:27:33 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/11/25 16:27:33 WARN Utils: Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 10.0.2.15 instead (on interface enp0s3)
15/11/25 16:27:33 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/11/25 16:27:33 INFO SecurityManager: Changing view acls to: user
15/11/25 16:27:33 INFO SecurityManager: Changing modify acls to: user