Dear all,

I have set up two Spark standalone test clusters, both of which suffer from the same problem. I have a workaround, but it's a bad one. I would appreciate some help and input. I'm too much of a beginner to conclude that it's a bug, but I found someone else on Stack Overflow with the exact same issue (where I also posted my own question).

My own question on Stack Overflow:
<https://stackoverflow.com/questions/64029599/cant-run-example-pyspark-code-on-standalone-cluster>

Another person with the same question on Stack Overflow:
<https://stackoverflow.com/questions/63143795/spark-3-error-java-lang-unsatisfiedlinkerror-no-zstd-jni-in-java-library-path>
*Error*
When I run the following Python test example from either of the two clusters, I get the same error.

--- command line ---
/usr/spark/bin/spark-submit --master "spark://:7077" /usr/spark/examples/src/main/python/wordcount.py '/usr/spark/LICENSE'
--- command line ---

--- error ---
Exception in thread "map-output-dispatcher-0" java.lang.UnsatisfiedLinkError: no zstd-jni in java.library.path
Unsupported OS/arch, cannot find /linux/amd64/libzstd-jni.so or load zstd-jni from system libraries. Please try building from source the jar or providing libzstd-jni in your system.
--- error ---

If I run the example on a single node with local[x] as the master, it works without an issue. I assume that mode doesn't trigger the use of libzstd-jni.
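For comparison, the single-node run that works looks like this (same script and input; local[2] is just an example value for x):

--- command line ---
/usr/spark/bin/spark-submit --master local[2] /usr/spark/examples/src/main/python/wordcount.py '/usr/spark/LICENSE'
--- command line ---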
*Cluster setup*

First cluster:
- Three RHEL7 VMs.
- Spark 3.0.0 downloaded from the Apache site, prebuilt for Hadoop 3.2.
- JDK 8.
- JAVA_HOME configured.
- Cluster set up in standalone mode, one master and two slaves.

Second cluster:
- Three RHEL8 VMs.
- Spark 3.0.1 downloaded from the Apache site, prebuilt for Hadoop 3.2.
- OpenJDK 11.
- JAVA_HOME configured.
- Cluster set up in standalone mode, one master and two slaves.
- Hadoop 3.2.1 downloaded and installed.
- LD_LIBRARY_PATH configured to point to the Hadoop native library directory.
- SPARK_HOME configured.
- CLASSPATH set to point to $SPARK_HOME/jars (see the sketch after this list).
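For reference, those variables are set roughly like this on the second cluster. This is a sketch: SPARK_HOME matches the path in the command line above, but the JDK and Hadoop locations are illustrative rather than my exact paths, and I read "point to $SPARK_HOME/jars" as the usual /* wildcard form.

--- environment (sketch) ---
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk    # illustrative OpenJDK 11 location
export SPARK_HOME=/usr/spark
export HADOOP_HOME=/usr/hadoop                   # illustrative Hadoop 3.2.1 location
# Hadoop native library directory:
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH
export CLASSPATH="$SPARK_HOME/jars/*"
--- environment (sketch) ---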
*Workaround*
Looking at the issue, I determined that the appropriate libzstd-jni.so was available inside $SPARK_HOME/jars/zstd-jni-1.4.4-3.jar and that it was possible to extract it from there with 'jar xf'. So I extracted the .so file and put it into /usr/lib64, and then the example ran without an issue.
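Concretely, the steps were roughly the following (a sketch: the jar name is the one shipped with my Spark distribution, and /tmp is just a scratch directory):

--- workaround commands ---
cd /tmp
# Extract just the bundled Linux/amd64 native library from the zstd-jni jar.
jar xf $SPARK_HOME/jars/zstd-jni-1.4.4-3.jar linux/amd64/libzstd-jni.so
# Copy it to a directory on the JVM's default java.library.path.
sudo cp linux/amd64/libzstd-jni.so /usr/lib64/
--- workaround commands ---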
But that is not a good way to go, and I'm wondering if I have made a mistake somewhere. I would expect the Java code to pick up the correct .so file from the jar automatically, based on the environment it executes on?
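In case it sharpens the question: I could presumably make the workaround less invasive by pointing Spark at a directory containing the extracted library through its documented spark.driver.extraLibraryPath / spark.executor.extraLibraryPath settings, instead of copying into /usr/lib64. An untested sketch, where /opt/zstd-native is a hypothetical directory holding the extracted libzstd-jni.so:

--- command line (sketch) ---
/usr/spark/bin/spark-submit --master "spark://:7077" \
  --conf spark.driver.extraLibraryPath=/opt/zstd-native \
  --conf spark.executor.extraLibraryPath=/opt/zstd-native \
  /usr/spark/examples/src/main/python/wordcount.py '/usr/spark/LICENSE'
--- command line (sketch) ---

But my expectation remains that none of this should be needed at all.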



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
