Should we remove the existing Guava jar and replace it with a more recent version?
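
If you go the jar-replacement route, a minimal sketch (an assumption, not a verified fix) is to drop the Guava 14 jar that Spark bundles and pull a release new enough to carry the four-argument checkArgument overload (added in Guava 20). The GUAVA_VERSION below is illustrative; pick one compatible with your Hadoop build, and verify Spark itself still runs, since it was tested against Guava 14:

```dockerfile
# Sketch: swap Spark's bundled Guava for a newer release.
# GUAVA_VERSION is an assumption -- match it to what your Hadoop jars expect.
ARG GUAVA_VERSION=27.0-jre
RUN rm ${SPARK_HOME}/jars/guava-14.0.1.jar \
 && wget https://repo1.maven.org/maven2/com/google/guava/guava/${GUAVA_VERSION}/guava-${GUAVA_VERSION}.jar \
      -P ${SPARK_HOME}/jars/
```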

On Tue, Feb 15, 2022, 01:08 Mich Talebzadeh <mich.talebza...@gmail.com>
wrote:

> I recall I had similar issues running Spark on Google Dataproc.
>
> It sounds like Hadoop's jars end up on the classpath, and they include an
> older version of Guava. The solution is to shade/relocate Guava in your
> distribution
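
For example, if the application is built with sbt-assembly (an assumption; the same idea works with the Maven shade plugin), a shade rule along these lines relocates Guava classes inside the fat jar so the copy on Spark's classpath can no longer clash:

```scala
// build.sbt -- sketch, assumes the sbt-assembly plugin is enabled.
// "shaded.guava" is an arbitrary target package chosen for illustration.
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("com.google.common.**" -> "shaded.guava.@1").inAll
)
```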
>
>
> HTH
>
>
>    view my Linkedin profile
> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>
>
>  https://en.everybodywiki.com/Mich_Talebzadeh
>
>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>
>
> On Mon, 14 Feb 2022 at 19:10, Raj ks <rajabhupati....@gmail.com> wrote:
>
>> Hi Team,
>>
>> We are trying to build a Docker image based on CentOS and to connect to
>> S3 from Spark. The same setup works with Hadoop 3.2.0 and Spark 3.1.2.
>>
>> #Installing spark binaries
>> ENV SPARK_HOME /opt/spark
>> ENV SPARK_VERSION 3.2.1
>> ENV HADOOP_VERSION 3.2.0
>> ARG HADOOP_VERSION_SHORT=3.2
>> ARG HADOOP_AWS_VERSION=3.3.0
>> ARG AWS_SDK_VERSION=1.11.563
>>
>>
>> RUN set -xe \
>>   && cd /tmp \
>>   && wget http://mirrors.gigenet.com/apache/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION_SHORT}.tgz \
>>   && tar -zxvf spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION_SHORT}.tgz \
>>   && rm *.tgz \
>>   && mv spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION_SHORT} ${SPARK_HOME} \
>>   && cp ${SPARK_HOME}/kubernetes/dockerfiles/spark/entrypoint.sh ${SPARK_HOME} \
>>   && wget https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/${HADOOP_AWS_VERSION}/hadoop-aws-${HADOOP_AWS_VERSION}.jar \
>>   && wget https://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk-bundle/${AWS_SDK_VERSION}/aws-java-sdk-bundle-${AWS_SDK_VERSION}.jar \
>>   && wget https://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk/${AWS_SDK_VERSION}/aws-java-sdk-${AWS_SDK_VERSION}.jar \
>>   && mv *.jar /opt/spark/jars/
>>
>> Error:
>>
>> java.lang.NoSuchMethodError: com/google/common/base/Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;Ljava/lang/Object;)V (loaded from file:/opt/spark/jars/guava-14.0.1.jar by jdk.internal.loader.ClassLoaders$AppClassLoader@1e4553e) called from class org.apache.hadoop.fs.s3a.S3AUtils (loaded from file:/opt/spark/jars/hadoop-aws-3.3.0.jar by jdk.internal.loader.ClassLoaders$AppClassLoader@1e4553e).
>>
>> Any help on this is appreciated.
>>
>>
