[ 
https://issues.apache.org/jira/browse/BEAM-10430?focusedWorklogId=745247&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-745247
 ]

ASF GitHub Bot logged work on BEAM-10430:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 21/Mar/22 16:52
            Start Date: 21/Mar/22 16:52
    Worklog Time Spent: 10m 
      Work Description: VictorPlusC commented on pull request #14953:
URL: https://github.com/apache/beam/pull/14953#issuecomment-1074152397


   I believe so, though the versioning does not seem to have much of an effect
in this case: I was able to successfully execute a job on Dataproc with both
versions.
   
   As this dependency is necessary for me to fully enable an automated process
for submitting Flink pipelines to Dataproc, without requiring users to locally
build the shadowJar with it included, would it be possible to include only the
`jackson-module-jaxb-annotations` dependency and file a Jira ticket with a
to-do to remove it once the issue has been resolved on the Dataproc side? That
way we can guarantee that this dependency issue does not resurface in a future
version of Beam. Doing so would also let users follow use cases such as those
covered in the [Dataproc Flink component
documentation](https://cloud.google.com/dataproc/docs/concepts/components/flink)
with a version of Flink that has not been deprecated on the Beam side (the
working example uses a Dataproc 1.5 image and Flink 1.9, but we no longer
support that Flink version). Additionally, the `jackson-datatype-jsr310`
dependency does not appear to be needed to run Flink pipelines on Dataproc, so
adding only the jaxb-annotations module should suffice.
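
   For reference, a minimal sketch of the manual workaround that the proposal
above would make unnecessary: adding the module to the quickstart pom so it
ends up in the bundled jar. The coordinates are the standard Jackson ones; the
version shown is only an illustrative placeholder and should match whatever
Jackson version the Beam release in use depends on.

```xml
<!-- Hypothetical snippet for the word-count-beam pom.xml: shades the JAXB
     annotations module into the bundled jar. Version 2.10.2 is an assumed
     placeholder, not a confirmed Beam-pinned version. -->
<dependency>
  <groupId>com.fasterxml.jackson.module</groupId>
  <artifactId>jackson-module-jaxb-annotations</artifactId>
  <version>2.10.2</version>
</dependency>
```

   Rebuilding with `mvn clean package -Pflink-runner` then produces a bundled
jar that contains the module, which is exactly the per-user step the change
discussed here is meant to avoid.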
   
   +Dagang Wei (@functicons), who helped me investigate the dependencies on the 
Dataproc side.



Issue Time Tracking
-------------------

    Worklog Id:     (was: 745247)
    Time Spent: 6h 40m  (was: 6.5h)

> Can't run WordCount on EMR With Flink Runner via YARN
> -----------------------------------------------------
>
>                 Key: BEAM-10430
>                 URL: https://issues.apache.org/jira/browse/BEAM-10430
>             Project: Beam
>          Issue Type: Bug
>          Components: examples-java, runner-flink
>    Affects Versions: 2.22.0
>         Environment: AWS EMR 5.30.0 running Spark 2.4.5, Flink 1.10.0
>            Reporter: Shashi
>            Assignee: Etienne Chauchot
>            Priority: P3
>              Labels: Clarified
>             Fix For: Missing
>
>          Time Spent: 6h 40m
>  Remaining Estimate: 0h
>
> 1) I set up the WordCount project as detailed on the Beam website:
>  {{mvn archetype:generate \
>       -DarchetypeGroupId=org.apache.beam \
>       -DarchetypeArtifactId=beam-sdks-java-maven-archetypes-examples \
>       -DarchetypeVersion=2.22.0 \
>       -DgroupId=org.example \
>       -DartifactId=word-count-beam \
>       -Dversion="0.1" \
>       -Dpackage=org.apache.beam.examples \
>       -DinteractiveMode=false}}
> 2) mvn clean package -Pflink-runner
> 3) Ran the application on AWS EMR 5.30.0 with Flink 1.10.0
> flink run -m yarn-cluster -yid <yarn_application_id> -p 4  -c 
> org.apache.beam.examples.WordCount word-count-beam-bundled-0.1.jar 
> --runner=FlinkRunner --inputFile <path_in_s3_of_input_file> --output 
> <path_in_s3_of_output_dir>
> 4) Launch failed with the following exception stack trace:
> java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: 
> Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
>  at java.util.ServiceLoader.fail(ServiceLoader.java:239)
>  at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
>  at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
>  at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
>  at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
>  at 
> com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:1054)
>  at 
> org.apache.beam.sdk.options.PipelineOptionsFactory.<clinit>(PipelineOptionsFactory.java:471)
>  at org.apache.beam.examples.WordCount.main(WordCount.java:190)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498)
>  at 
> org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:321)
>  at 
> org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:205)
>  at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:138)
>  at 
> org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:664)
>  at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
>  at 
> org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:895)
>  at 
> org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:968)
>  at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:422)
>  at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1844)
>  at 
> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>  at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:968)
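
For readers tracing the failure above: the stack trace shows that
`PipelineOptionsFactory` hits the error through `ObjectMapper.findModules()`,
which is a plain `java.util.ServiceLoader` lookup of Jackson modules on the
classpath. A small standalone check along the same discovery path can help
show which jar each module is loaded from when the job jar and the cluster's
Flink `lib/` directory carry clashing Jackson versions. This is an illustrative
sketch, assuming only that jackson-databind and at least one Jackson module
(such as jackson-module-jaxb-annotations) are on the classpath; the class name
is made up for the example.

```java
import com.fasterxml.jackson.databind.Module;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.security.CodeSource;
import java.util.List;

// Diagnostic sketch (hypothetical helper, not part of Beam): runs the same
// ServiceLoader-based discovery that PipelineOptionsFactory triggers via
// ObjectMapper.findModules(). With two incompatible Jackson versions on the
// classpath, this throws the same "Provider ... not a subtype"
// ServiceConfigurationError seen in the stack trace above.
public class JacksonModuleCheck {
  public static void main(String[] args) {
    List<Module> modules = ObjectMapper.findModules();
    for (Module m : modules) {
      // Report each discovered module and the jar it was loaded from.
      CodeSource src = m.getClass().getProtectionDomain().getCodeSource();
      System.out.println(m.getModuleName() + " -> "
          + (src == null ? "<unknown source>" : src.getLocation()));
    }
  }
}
```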



