[
https://issues.apache.org/jira/browse/BEAM-10430?focusedWorklogId=628718&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-628718
]
ASF GitHub Bot logged work on BEAM-10430:
-----------------------------------------
Author: ASF GitHub Bot
Created on: 27/Jul/21 21:15
Start Date: 27/Jul/21 21:15
Worklog Time Spent: 10m
Work Description: anguillanneuf commented on pull request #14953:
URL: https://github.com/apache/beam/pull/14953#issuecomment-887840112
I can optionally install a Flink component when creating a Dataproc cluster,
just like the other optional Dataproc
[components](https://cloud.google.com/dataproc/docs/concepts/components/overview#available_optional_components).
But I don't have many details on the differences between Dataproc 1.5 and 2.0.
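For reference, a minimal sketch of the cluster-creation step (the cluster name, region, and image version below are placeholders, not the exact values used in the tests):

```shell
# Create a Dataproc cluster with the optional Flink component enabled.
# Cluster name and region are hypothetical; choose the image version
# whose bundled Flink release matches your Beam SDK.
gcloud dataproc clusters create beam-flink-test \
  --region=us-central1 \
  --image-version=2.0 \
  --optional-components=FLINK
```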
You may be onto something. I tried the following combinations, using the [Beam Flink
compatibility table](https://beam.apache.org/documentation/runners/flink/#flink-version-compatibility)
as a guide. The issues seem to be concentrated in Dataproc 2.0.
| Dataproc | Beam | Flink | Tried? | Worked? |
|---|---|---|---|---|
|2.0|2.31|1.12|Yes|No - missing dep|
|2.0|2.30|1.12|Yes|No - missing dep|
|1.5|2.29|1.9|Yes|Yes|
|1.5|2.28|1.9|Yes|Yes|
|1.5|2.27|1.9|Yes|Yes|
|1.5|2.26|1.9|Yes|Yes|
Issue Time Tracking
-------------------
Worklog Id: (was: 628718)
Time Spent: 3h 20m (was: 3h 10m)
> Can't run WordCount on EMR With Flink Runner via YARN
> -----------------------------------------------------
>
> Key: BEAM-10430
> URL: https://issues.apache.org/jira/browse/BEAM-10430
> Project: Beam
> Issue Type: Improvement
> Components: examples-java, runner-flink
> Affects Versions: 2.22.0
> Environment: AWS EMR 5.30.0 running Spark 2.4.5, Flink 1.10.0
> Reporter: Shashi
> Priority: P3
> Labels: Clarified
> Time Spent: 3h 20m
> Remaining Estimate: 0h
>
> 1) I set up the WordCount project as described on the Beam website:
> {{mvn archetype:generate \
> -DarchetypeGroupId=org.apache.beam \
> -DarchetypeArtifactId=beam-sdks-java-maven-archetypes-examples \
> -DarchetypeVersion=2.22.0 \
> -DgroupId=org.example \
> -DartifactId=word-count-beam \
> -Dversion="0.1" \
> -Dpackage=org.apache.beam.examples \
> -DinteractiveMode=false}}
> 2) mvn clean package -Pflink-runner
> 3) Ran the application on AWS EMR 5.30.0 with Flink 1.10.0:
> flink run -m yarn-cluster -yid <yarn_application_id> -p 4 -c
> org.apache.beam.examples.WordCount word-count-beam-bundled-0.1.jar
> --runner=FlinkRunner --inputFile <path_in_s3_of_input_file> --output
> <path_in_s3_of_output_dir>
> 4) The launch failed with the following stack trace:
> java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
> at java.util.ServiceLoader.fail(ServiceLoader.java:239)
> at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
> at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
> at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
> at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
> at com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:1054)
> at org.apache.beam.sdk.options.PipelineOptionsFactory.<clinit>(PipelineOptionsFactory.java:471)
> at org.apache.beam.examples.WordCount.main(WordCount.java:190)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:321)
> at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:205)
> at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:138)
> at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:664)
> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
> at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:895)
> at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:968)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1844)
> at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
> at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:968)
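A `not a subtype` error from `ServiceLoader` usually means two different class loaders supplied the Jackson classes (one from the cluster's lib directory, one from the bundled fat jar). One avenue worth experimenting with, purely as a sketch and not a confirmed fix, is Flink's class-loading order at job submission time:

```shell
# Hypothetical workaround sketch: set Flink's classloader resolution order
# explicitly so the Jackson classes bundled in the fat jar and those on the
# cluster classpath are resolved consistently. Paths and the YARN application
# ID are placeholders carried over from the original report.
flink run -m yarn-cluster -yid <yarn_application_id> -p 4 \
  -yD classloader.resolve-order=child-first \
  -c org.apache.beam.examples.WordCount word-count-beam-bundled-0.1.jar \
  --runner=FlinkRunner --inputFile <path_in_s3_of_input_file> \
  --output <path_in_s3_of_output_dir>
```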
--
This message was sent by Atlassian Jira
(v8.3.4#803005)