Yes, I'm using spark-submit, and I'm giving it a shaded jar.

What do you mean by "aligning the dependencies"?
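For context, a hedged sketch of what "aligning the dependencies" typically means in a Maven build: pin every Jackson artifact to a single version in `<dependencyManagement>`, so that the conflicting transitive requests from spark-core_2.10 and beam-runners-spark all resolve to the same release. The 2.8.9 version below matches the jars listed later in this thread; the exact set of coordinates is an assumption, not JB's actual pom.

```xml
<!-- Sketch only: pin every Jackson module to one version so Maven's
     "nearest wins" resolution cannot mix an older Jackson (pulled in by
     spark-core) with a newer one (pulled in by beam-runners-spark). -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
      <version>2.8.9</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.8.9</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-annotations</artifactId>
      <version>2.8.9</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.module</groupId>
      <artifactId>jackson-module-scala_2.10</artifactId>
      <version>2.8.9</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Note that alignment only controls which version lands in your own jar; it does not by itself stop Spark from serving its own older Jackson copies first at runtime.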

Jacob

On Mon, Oct 2, 2017 at 11:06 AM, Jean-Baptiste Onofré <[email protected]>
wrote:

> Hi
>
> Do you start your pipeline with spark-submit? If so, you can provide the
> packages. You can also create a shaded jar.
>
> I have a similar issue in the spark 2 runner that I worked around by
> aligning the dependencies.
>
> Regards
> JB
>
> On Oct 2, 2017, at 20:04, Jacob Marble <[email protected]> wrote:
> >My Beam pipeline runs fine with DirectRunner and DataflowRunner, but
> >fails with SparkRunner. That stack trace is after this message.
> >
> >The exception indicates that
> >com.fasterxml.jackson.databind.ObjectMapper.enable doesn't exist.
> >ObjectMapper.enable() didn't exist until Jackson 2.5. `mvn
> >dependency:tree -Dverbose` shows that spark-core_2.10 (1.6.3) and
> >beam-runners-spark (2.1.0) both request versions of Jackson before 2.5.
> >
> >Since I'm using a local, standalone Spark cluster for development, I
> >have to include spark-core_2.10 version 1.6.3 in dependencies.
> >
> >I have added explicit dependencies to my pom.xml, so that I can be
> >certain that the more recent version of Jackson is included in my
> >shaded jar. `mvn clean package` confirms this:
> >
> >[INFO] Including com.fasterxml.jackson.core:jackson-core:jar:2.8.9 in the shaded jar.
> >[INFO] Including com.fasterxml.jackson.core:jackson-annotations:jar:2.8.9 in the shaded jar.
> >[INFO] Including com.fasterxml.jackson.core:jackson-databind:jar:2.8.9 in the shaded jar.
> >[INFO] Including com.fasterxml.jackson.module:jackson-module-scala_2.10:jar:2.8.9 in the shaded jar.
> >[INFO] Including com.fasterxml.jackson.module:jackson-module-paranamer:jar:2.8.9 in the shaded jar.
> >[INFO] Including com.fasterxml.jackson.dataformat:jackson-dataformat-cbor:jar:2.8.9 in the shaded jar.
> >
> >Beyond jar creation, is there anything I can do to ensure that my
> >chosen version of a dependency is used when Spark runs my pipeline? I
> >can't be the first to encounter this problem.
> >
> >Thanks!
> >
> >Jacob
> >
> >--------
> >
> >Exception in thread "main" java.lang.RuntimeException:
> >java.lang.NoSuchMethodError:
> >com.fasterxml.jackson.databind.ObjectMapper.enable([Lcom/fasterxml/jackson/core/JsonParser$Feature;)Lcom/fasterxml/jackson/databind/ObjectMapper;
> >at org.apache.beam.runners.spark.SparkPipelineResult.runtimeExceptionFrom(SparkPipelineResult.java:55)
> >at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:72)
> >at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:99)
> >at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:87)
> >at com.kochava.beam.jobs.ExampleS3.main(ExampleS3.java:46)
> >at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >at java.lang.reflect.Method.invoke(Method.java:498)
> >at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> >at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> >at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> >at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> >at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> >Caused by: java.lang.NoSuchMethodError:
> >com.fasterxml.jackson.databind.ObjectMapper.enable([Lcom/fasterxml/jackson/core/JsonParser$Feature;)Lcom/fasterxml/jackson/databind/ObjectMapper;
> >at com.amazonaws.partitions.PartitionsLoader.<clinit>(PartitionsLoader.java:54)
> >at com.amazonaws.regions.RegionMetadataFactory.create(RegionMetadataFactory.java:30)
> >at com.amazonaws.regions.RegionUtils.initialize(RegionUtils.java:64)
> >at com.amazonaws.regions.RegionUtils.getRegionMetadata(RegionUtils.java:52)
> >at com.amazonaws.regions.RegionUtils.getRegion(RegionUtils.java:105)
> >at com.amazonaws.client.builder.AwsClientBuilder.withRegion(AwsClientBuilder.java:239)
> >at com.kochava.beam.s3.S3Util.<init>(S3Util.java:103)
> >at com.kochava.beam.s3.S3Util.<init>(S3Util.java:53)
> >at com.kochava.beam.s3.S3Util$S3UtilFactory.create(S3Util.java:81)
> >at com.kochava.beam.s3.S3Util$S3UtilFactory.create(S3Util.java:55)
>
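
On the runtime question quoted above ("is there anything I can do to ensure that my chosen version of a dependency is used when Spark runs my pipeline?"), the usual answer is to relocate the conflicting packages inside the shaded jar, so they cannot collide with the copies Spark ships on its own classpath. A sketch with maven-shade-plugin; the `repackaged` prefix is an arbitrary choice for illustration, not something from this thread:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <!-- Rewrite Jackson's package names, and every reference to
               them, inside the shaded jar. The pipeline then loads its
               own Jackson 2.8.9 classes under the new names, even though
               Spark's classpath carries an older Jackson first. -->
          <relocation>
            <pattern>com.fasterxml.jackson</pattern>
            <shadedPattern>repackaged.com.fasterxml.jackson</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

When relocation is not an option, Spark also has `--conf spark.driver.userClassPathFirst=true` and `--conf spark.executor.userClassPathFirst=true`, which ask Spark to prefer classes from the user jar; those flags are documented as experimental and can break Spark itself, so relocation is generally the safer route.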
