Hi Team,

We are trying to build a Docker image based on CentOS and connect to S3 from Spark. The same setup works with Hadoop 3.2.0 and Spark 3.1.2.

# Install Spark binaries
ENV SPARK_HOME /opt/spark
ENV SPARK_VERSION 3.2.1
ENV HADOOP_VERSION 3.2.0
ARG HADOOP_VERSION_SHORT=3.2
ARG HADOOP_AWS_VERSION=3.3.0
ARG AWS_SDK_VERSION=1.11.563

RUN set -xe \
  && cd /tmp \
  && wget http://mirrors.gigenet.com/apache/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION_SHORT}.tgz \
  && tar -zxvf spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION_SHORT}.tgz \
  && rm *.tgz \
  && mv spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION_SHORT} ${SPARK_HOME} \
  && cp ${SPARK_HOME}/kubernetes/dockerfiles/spark/entrypoint.sh ${SPARK_HOME} \
  && wget https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/${HADOOP_AWS_VERSION}/hadoop-aws-${HADOOP_AWS_VERSION}.jar \
  && wget https://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk-bundle/${AWS_SDK_VERSION}/aws-java-sdk-bundle-${AWS_SDK_VERSION}.jar \
  && wget https://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk/${AWS_SDK_VERSION}/aws-java-sdk-${AWS_SDK_VERSION}.jar \
  && mv *.jar /opt/spark/jars/

Any help on this is appreciated. The error is:

java.lang.NoSuchMethodError: com/google/common/base/Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;Ljava/lang/Object;)V (loaded from file:/opt/spark/jars/guava-14.0.1.jar by jdk.internal.loader.ClassLoaders$AppClassLoader@1e4553e) called from class org.apache.hadoop.fs.s3a.S3AUtils (loaded from file:/opt/spark/jars/hadoop-aws-3.3.0.jar by jdk.internal.loader.ClassLoaders$AppClassLoader@1e4553e).
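Our reading of the trace is that hadoop-aws-3.3.0.jar calls a Preconditions.checkArgument overload that the guava-14.0.1.jar shipped in the Spark distribution does not have (that overload was added in a later Guava release, 20.0 as far as we can tell). A quick sanity-check sketch of that version gap, parsing the jar name from the trace (the 20.0 threshold is our assumption, not something we have confirmed):

# Guava jar named in the stack trace
guava_jar="guava-14.0.1.jar"
# Strip the "guava-" prefix, then keep only the major version number
version="${guava_jar#guava-}"     # -> 14.0.1.jar
major="${version%%.*}"            # -> 14
# checkArgument(boolean, String, Object, Object) appears to require
# Guava 20.0 or newer (our assumption), so 14 cannot satisfy hadoop-aws 3.3.x
if [ "$major" -lt 20 ]; then
  echo "Guava $major looks older than what hadoop-aws 3.3.x expects"
fi

If that reading is right, pointers on the correct hadoop-aws / Guava combination for a Spark 3.2.1 + Hadoop 3.2 build would be appreciated.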
