Hi,

I am running a Spark Streaming app on YARN and using Apache HttpAsyncClient 4.1.
This client jar internally depends on httpcore-4.4.1.jar.

An older version of this jar, httpcore-4.2.5.jar, is also present on the
classpath and takes precedence because it appears earlier. It is located at
/apps/cloudera/parcels/CDH/jars/httpcore-4.2.5.jar.

This conflict is causing the job to be killed.

I have packaged my job's jar using Maven. When I ran the job, the executors
were killed with the exception below:

Exception in thread "main" org.apache.spark.SparkException: Job aborted due
to stage failure: Task 0 in stage 3.0 failed 4 times, most recent failure:
Lost task 0.3 in stage 3.0 (TID 76, ip): java.lang.NoSuchFieldError: INSTANCE
        at org.apache.http.impl.nio.codecs.DefaultHttpRequestWriterFactory.<init>(DefaultHttpRequestWriterFactory.java:52)
        at org.apache.http.impl.nio.codecs.DefaultHttpRequestWriterFactory.<init>(DefaultHttpRequestWriterFactory.java:56)



Even when I specify the latest version of httpcore in the --jars argument,
the old jar is still picked up.
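For reference, this is roughly the invocation I used (the class name, app jar,
and jar paths below are illustrative placeholders, not my exact ones):

```shell
# Submit on YARN, shipping the newer httpcore alongside the app jar.
# Class name and all paths are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyStreamingJob \
  --jars /path/to/httpcore-4.4.1.jar,/path/to/httpasyncclient-4.1.jar \
  my-streaming-job.jar
```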

Is there any way to make the job use my jar instead of the one on Spark's
classpath while executing?
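One direction I plan to try next (untested, and both flags are documented as
experimental) is the spark.driver.userClassPathFirst /
spark.executor.userClassPathFirst settings, which are supposed to give
user-supplied jars precedence over Spark's own classpath. Paths and class
name below are placeholders:

```shell
# Untested idea: ask Spark to prefer user-supplied jars over its own
# classpath. Both userClassPathFirst flags are marked experimental.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --jars /path/to/httpcore-4.4.1.jar \
  --class com.example.MyStreamingJob \
  my-streaming-job.jar
```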

Thanks
