Dear Sir/Madam,

I followed the official documentation to download the WordCount example
(https://beam.apache.org/get-started/quickstart-java/), built the project with
Maven, deployed spark-2.4.8-bin-hadoop2.7 locally, and configured the Spark
Runner as described in the official documentation
(https://beam.apache.org/documentation/runners/spark/). When I submitted the
job, the following error was reported:

Exception in thread "main" java.lang.NoSuchMethodError:
com.fasterxml.jackson.databind.type.TypeBindings.emptyBindings()Lcom/fasterxml/jackson/databind/type/TypeBindings;
        at org.apache.beam.sdk.options.PipelineOptionsFactory.createBeanProperty(PipelineOptionsFactory.java:1706)
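
To help localize the problem, I put together a small diagnostic (my own
sketch; the class name JacksonVersionCheck is mine) that prints which
jackson-databind actually gets loaded. My working assumption is that
spark-2.4.8-bin-hadoop2.7 bundles the Jackson 2.6.7 line in its jars/
directory, which predates TypeBindings.emptyBindings() (added in Jackson 2.7),
while Beam 2.37.0 builds against Jackson 2.13.x:

import com.fasterxml.jackson.databind.cfg.PackageVersion;
import com.fasterxml.jackson.databind.type.TypeBindings;

public class JacksonVersionCheck {
    public static void main(String[] args) {
        // Version of jackson-databind visible on this classpath.
        System.out.println("jackson-databind version: " + PackageVersion.VERSION);
        // Jar that supplied the TypeBindings class named in the error.
        System.out.println("TypeBindings loaded from: "
                + TypeBindings.class.getProtectionDomain().getCodeSource().getLocation());
    }
}

When this is submitted through spark-submit with the same classpath, I would
expect it to report Spark's bundled Jackson rather than the 2.13.0 that Maven
shades into the fat jar, though I have not been able to confirm this.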



The spark-submit command for the cluster is:

./bin/spark-submit \
  --class org.apache.beam.examples.WordCount \
  --master spark://yuanshu-2288H-V5:7077 \
  /share/apache_beam_code/word-count-beam/target/word-count-beam-bundled-0.1.jar \
  --runner=SparkRunner \
  --inputFile=/share/k_means_data_python/10000_point_4_center_2_feature.txt \
  --output=/share/apache_beam_code/word-count-beam/src/main/java/BeamKmeans/4core6G1worker/output

The pom.xml file and the full error output are included below.
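
From the stack trace below, the failure occurs while Beam serializes the
PipelineOptions to JSON (SerializablePipelineOptions ->
ObjectMapper.writeValueAsString). As a minimal reduction (my own sketch; the
class names are taken from the stack trace), the following should reach the
same code path without involving Spark at all:

import org.apache.beam.runners.core.construction.SerializablePipelineOptions;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class OptionsSerializationCheck {
    public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.create();
        // The constructor serializes the options to JSON internally
        // (SerializablePipelineOptions.java:44 in the stack trace), so an old
        // jackson-databind on the classpath should trigger the same
        // NoSuchMethodError here.
        new SerializablePipelineOptions(options);
        System.out.println("PipelineOptions serialized without error");
    }
}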

I have also asked this question on Stack Overflow:
https://stackoverflow.com/questions/71791853/apache-beam-2-37-0-sparkrunner-program-executing-in-spark-2-4-8-standalone-clust

Looking forward to your reply.

-- 
Kind regards,
Jie Liu
----------------
Mr. Jie Liu
Computer School, University of South China, Hengyang 421001
E-Mail: [email protected]
(base) root@yuanshu-2288H-V5:/usr/local/spark-2.4.8-bin-hadoop2.7# ./bin/spark-submit --class org.apache.beam.examples.WordCount --master spark://yuanshu-2288H-V5:7077 /share/apache_beam_test/word-count-beam/target/word-count-beam-bundled-0.1.jar --runner=SparkRunner \
> --inputFile=/share/k_means_data_python/10000_point_4_center_2_feature.txt \
> --output=/share/apache_beam_code/word-count-beam/src/main/java/BeamKmeans/4core6G1worker/output
22/04/08 12:23:46 WARN Utils: Your hostname, yuanshu-2288H-V5 resolves to a 
loopback address: 127.0.1.1; using 192.168.190.3 instead (on interface eno3)
22/04/08 12:23:46 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another 
address
22/04/08 12:23:47 WARN NativeCodeLoader: Unable to load native-hadoop library 
for your platform... using builtin-java classes where applicable
log4j:WARN No appenders could be found for logger 
(org.apache.beam.sdk.options.PipelineOptionsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more 
info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
22/04/08 12:23:49 INFO SparkContext: Running Spark version 2.4.8
22/04/08 12:23:49 INFO SparkContext: Submitted application: WordCount
22/04/08 12:23:50 INFO SecurityManager: Changing view acls to: root
22/04/08 12:23:50 INFO SecurityManager: Changing modify acls to: root
22/04/08 12:23:50 INFO SecurityManager: Changing view acls groups to: 
22/04/08 12:23:50 INFO SecurityManager: Changing modify acls groups to: 
22/04/08 12:23:50 INFO SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users  with view permissions: Set(root); groups 
with view permissions: Set(); users  with modify permissions: Set(root); groups 
with modify permissions: Set()
22/04/08 12:23:50 INFO Utils: Successfully started service 'sparkDriver' on 
port 41475.
22/04/08 12:23:50 INFO SparkEnv: Registering MapOutputTracker
22/04/08 12:23:50 INFO SparkEnv: Registering BlockManagerMaster
22/04/08 12:23:50 INFO BlockManagerMasterEndpoint: Using 
org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/04/08 12:23:50 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
22/04/08 12:23:50 INFO DiskBlockManager: Created local directory at 
/tmp/blockmgr-671a50b9-65f8-4a24-945c-6188a715b520
22/04/08 12:23:50 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
22/04/08 12:23:50 INFO SparkEnv: Registering OutputCommitCoordinator
22/04/08 12:23:50 INFO Utils: Successfully started service 'SparkUI' on port 
4040.
22/04/08 12:23:50 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at 
http://192.168.190.3:4040
22/04/08 12:23:50 INFO SparkContext: Added JAR 
file:/share/apache_beam_test/word-count-beam/target/word-count-beam-bundled-0.1.jar
 at spark://192.168.190.3:41475/jars/word-count-beam-bundled-0.1.jar with 
timestamp 1649391830697
22/04/08 12:23:50 INFO StandaloneAppClient$ClientEndpoint: Connecting to master 
spark://yuanshu-2288H-V5:7077...
22/04/08 12:23:50 INFO TransportClientFactory: Successfully created connection 
to yuanshu-2288H-V5/127.0.1.1:7077 after 58 ms (0 ms spent in bootstraps)
22/04/08 12:23:50 INFO StandaloneSchedulerBackend: Connected to Spark cluster 
with app ID app-20220408122350-0013
22/04/08 12:23:50 INFO StandaloneAppClient$ClientEndpoint: Executor added: 
app-20220408122350-0013/0 on worker-20220408082455-192.168.190.3-40777 
(192.168.190.3:40777) with 4 core(s)
22/04/08 12:23:50 INFO StandaloneSchedulerBackend: Granted executor ID 
app-20220408122350-0013/0 on hostPort 192.168.190.3:40777 with 4 core(s), 
1024.0 MB RAM
22/04/08 12:23:51 INFO Utils: Successfully started service 
'org.apache.spark.network.netty.NettyBlockTransferService' on port 46337.
22/04/08 12:23:51 INFO NettyBlockTransferService: Server created on 
192.168.190.3:46337
22/04/08 12:23:51 INFO BlockManager: Using 
org.apache.spark.storage.RandomBlockReplicationPolicy for block replication 
policy
22/04/08 12:23:51 INFO StandaloneAppClient$ClientEndpoint: Executor updated: 
app-20220408122350-0013/0 is now RUNNING
22/04/08 12:23:51 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.190.3, 46337, None)
22/04/08 12:23:51 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.190.3:46337 with 366.3 MB RAM, BlockManagerId(driver, 192.168.190.3, 46337, None)
22/04/08 12:23:51 INFO BlockManagerMaster: Registered BlockManager 
BlockManagerId(driver, 192.168.190.3, 46337, None)
22/04/08 12:23:51 INFO BlockManager: Initialized BlockManager: 
BlockManagerId(driver, 192.168.190.3, 46337, None)
22/04/08 12:23:51 INFO StandaloneSchedulerBackend: SchedulerBackend is ready 
for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
Exception in thread "main" java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.type.TypeBindings.emptyBindings()Lcom/fasterxml/jackson/databind/type/TypeBindings;
        at org.apache.beam.sdk.options.PipelineOptionsFactory.createBeanProperty(PipelineOptionsFactory.java:1706)
        at org.apache.beam.sdk.options.PipelineOptionsFactory.computeCustomSerializerForMethod(PipelineOptionsFactory.java:1756)
        at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
        at org.apache.beam.sdk.options.PipelineOptionsFactory.getCustomSerializerForMethod(PipelineOptionsFactory.java:1790)
        at org.apache.beam.sdk.options.ProxyInvocationHandler$Serializer.getSerializerMap(ProxyInvocationHandler.java:727)
        at org.apache.beam.sdk.options.ProxyInvocationHandler$Serializer.serialize(ProxyInvocationHandler.java:681)
        at org.apache.beam.sdk.options.ProxyInvocationHandler$Serializer.serialize(ProxyInvocationHandler.java:654)
        at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:130)
        at com.fasterxml.jackson.databind.ObjectMapper._configAndWriteValue(ObjectMapper.java:3559)
        at com.fasterxml.jackson.databind.ObjectMapper.writeValueAsString(ObjectMapper.java:2927)
        at org.apache.beam.runners.core.construction.SerializablePipelineOptions.serializeToJson(SerializablePipelineOptions.java:68)
        at org.apache.beam.runners.core.construction.SerializablePipelineOptions.<init>(SerializablePipelineOptions.java:44)
        at org.apache.beam.runners.spark.translation.EvaluationContext.<init>(EvaluationContext.java:75)
        at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:222)
        at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:96)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:323)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
        at org.apache.beam.examples.WordCount.runWordCount(WordCount.java:196)
        at org.apache.beam.examples.WordCount.main(WordCount.java:203)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:855)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:930)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:939)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
22/04/08 12:23:51 INFO SparkContext: Invoking stop() from shutdown hook
22/04/08 12:23:51 INFO SparkUI: Stopped Spark web UI at 
http://192.168.190.3:4040
22/04/08 12:23:51 INFO StandaloneSchedulerBackend: Shutting down all executors
22/04/08 12:23:51 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking 
each executor to shut down
22/04/08 12:23:51 INFO MapOutputTrackerMasterEndpoint: 
MapOutputTrackerMasterEndpoint stopped!
22/04/08 12:23:51 INFO MemoryStore: MemoryStore cleared
22/04/08 12:23:51 INFO BlockManager: BlockManager stopped
22/04/08 12:23:51 INFO BlockManagerMaster: BlockManagerMaster stopped
22/04/08 12:23:51 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: 
OutputCommitCoordinator stopped!
22/04/08 12:23:51 INFO SparkContext: Successfully stopped SparkContext
22/04/08 12:23:51 INFO ShutdownHookManager: Shutdown hook called
22/04/08 12:23:51 INFO ShutdownHookManager: Deleting directory 
/tmp/spark-349a00b4-47c1-4b85-b47f-63cb35298b85
22/04/08 12:23:51 INFO ShutdownHookManager: Deleting directory 
/tmp/spark-135fd9d7-1867-4d25-8009-3530efdd5a89

<?xml version="1.0" encoding="UTF-8"?>
<!--
    Licensed to the Apache Software Foundation (ASF) under one or more
    contributor license agreements.  See the NOTICE file distributed with
    this work for additional information regarding copyright ownership.
    The ASF licenses this file to You under the Apache License, Version 2.0
    (the "License"); you may not use this file except in compliance with
    the License.  You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

    Unless required by applicable law or agreed to in writing, software
    distributed under the License is distributed on an "AS IS" BASIS,
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    See the License for the specific language governing permissions and
    limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>word-count-beam</artifactId>
    <version>0.1</version>

    <packaging>jar</packaging>

    <properties>
        <beam.version>2.37.0</beam.version>

        <bigquery.version>v2-rev20211129-1.32.1</bigquery.version>
        <google-api-client.version>1.32.1</google-api-client.version>
        <guava.version>31.0.1-jre</guava.version>
        <hamcrest.version>2.1</hamcrest.version>
        <jackson.version>2.13.0</jackson.version>
        <joda.version>2.10.10</joda.version>
        <junit.version>4.13.1</junit.version>
        <libraries-bom.version>24.2.0</libraries-bom.version>
        <maven-compiler-plugin.version>3.7.0</maven-compiler-plugin.version>
        <maven-exec-plugin.version>1.6.0</maven-exec-plugin.version>
        <maven-jar-plugin.version>3.0.2</maven-jar-plugin.version>
        <maven-shade-plugin.version>3.1.0</maven-shade-plugin.version>
        <mockito.version>3.7.7</mockito.version>
        <pubsub.version>v1-rev20211130-1.32.1</pubsub.version>
        <slf4j.version>1.7.30</slf4j.version>
        <spark.version>2.4.8</spark.version>
        <hadoop.version>2.10.1</hadoop.version>
        <maven-surefire-plugin.version>3.0.0-M5</maven-surefire-plugin.version>
        <nemo.version>0.1</nemo.version>
        <flink.artifact.name>beam-runners-flink-1.13</flink.artifact.name>
    </properties>

    <repositories>
        <repository>
            <id>apache.snapshots</id>
            <name>Apache Development Snapshot Repository</name>
            <url>https://repository.apache.org/content/repositories/snapshots/</url>
            <releases>
                <enabled>false</enabled>
            </releases>
            <snapshots>
                <enabled>true</enabled>
            </snapshots>
        </repository>
    </repositories>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>${maven-compiler-plugin.version}</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>

            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>${maven-surefire-plugin.version}</version>
                <configuration>
                    <parallel>all</parallel>
                    <threadCount>4</threadCount>
                    <redirectTestOutputToFile>true</redirectTestOutputToFile>
                </configuration>
                <dependencies>
                    <dependency>
                        <groupId>org.apache.maven.surefire</groupId>
                        <artifactId>surefire-junit47</artifactId>
                        <version>${maven-surefire-plugin.version}</version>
                    </dependency>
                </dependencies>
            </plugin>

            <!-- Ensure that the Maven jar plugin runs before the Maven
              shade plugin by listing the plugin higher within the file. -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-jar-plugin</artifactId>
                <version>${maven-jar-plugin.version}</version>
            </plugin>

            <!--
              Configures `mvn package` to produce a bundled jar ("fat jar")
              for runners that require this for job submission to a cluster.
            -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>${maven-shade-plugin.version}</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <finalName>${project.artifactId}-bundled-${project.version}</finalName>
                            <filters>
                                <filter>
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/LICENSE</exclude>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                    <resource>reference.conf</resource>
                                </transformer>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>

        <pluginManagement>
            <plugins>
                <plugin>
                    <groupId>org.codehaus.mojo</groupId>
                    <artifactId>exec-maven-plugin</artifactId>
                    <version>${maven-exec-plugin.version}</version>
                    <configuration>
                        <cleanupDaemonThreads>false</cleanupDaemonThreads>
                    </configuration>
                </plugin>
            </plugins>
        </pluginManagement>
    </build>

    <profiles>
        <profile>
            <id>direct-runner</id>
            <activation>
                <activeByDefault>true</activeByDefault>
            </activation>
            <!-- Makes the DirectRunner available when running a pipeline. -->
            <dependencies>
                <dependency>
                    <groupId>org.apache.beam</groupId>
                    <artifactId>beam-runners-direct-java</artifactId>
                    <version>${beam.version}</version>
                    <scope>runtime</scope>
                </dependency>
            </dependencies>
        </profile>

        <profile>
            <id>portable-runner</id>
            <activation>
                <activeByDefault>true</activeByDefault>
            </activation>
            <!-- Makes the PortableRunner available when running a pipeline. -->
            <dependencies>
                <dependency>
                    <groupId>org.apache.beam</groupId>
                    <artifactId>beam-runners-portability-java</artifactId>
                    <version>${beam.version}</version>
                    <scope>runtime</scope>
                </dependency>
            </dependencies>
        </profile>

        <profile>
            <id>dataflow-runner</id>
            <!-- Makes the DataflowRunner available when running a pipeline. -->
            <dependencies>
                <dependency>
                    <groupId>org.apache.beam</groupId>
                    <artifactId>beam-runners-google-cloud-dataflow-java</artifactId>
                    <version>${beam.version}</version>
                    <scope>runtime</scope>
                </dependency>
            </dependencies>
        </profile>

        <profile>
            <id>flink-runner</id>
            <!-- Makes the FlinkRunner available when running a pipeline. -->
            <dependencies>
                <dependency>
                    <groupId>org.apache.beam</groupId>
                    <!-- Please see the Flink Runner page for an up-to-date list
                         of supported Flink versions and their artifact names:
                         https://beam.apache.org/documentation/runners/flink/ -->
                    <artifactId>${flink.artifact.name}</artifactId>
                    <version>${beam.version}</version>
                    <scope>runtime</scope>
                </dependency>
            </dependencies>
        </profile>

        <profile>
            <id>spark-runner</id>
            <!-- Makes the SparkRunner available when running a pipeline.
                 Additionally, overrides some Spark dependencies to
                 Beam-compatible versions. -->
            <properties>
                <netty.version>4.1.17.Final</netty.version>
            </properties>
            <dependencies>
                <dependency>
                    <groupId>org.apache.beam</groupId>
                    <artifactId>beam-runners-spark</artifactId>
                    <version>${beam.version}</version>
                    <scope>runtime</scope>
                    <exclusions>
                        <exclusion>
                            <groupId>org.slf4j</groupId>
                            <artifactId>jul-to-slf4j</artifactId>
                        </exclusion>
                    </exclusions>
                </dependency>
                <dependency>
                    <groupId>org.apache.beam</groupId>
                    <artifactId>beam-sdks-java-io-hadoop-file-system</artifactId>
                    <version>${beam.version}</version>
                    <scope>runtime</scope>
                </dependency>
                <!-- https://beam.apache.org/documentation/runners/spark/ -->
                <dependency>
                    <groupId>org.apache.spark</groupId>
                    <artifactId>spark-core_2.11</artifactId>
                    <version>${spark.version}</version>
                    <scope>runtime</scope>
                    <exclusions>
                        <exclusion>
                            <groupId>org.slf4j</groupId>
                            <artifactId>jul-to-slf4j</artifactId>
                        </exclusion>
                    </exclusions>
                </dependency>
                <dependency>
                    <groupId>org.apache.spark</groupId>
                    <artifactId>spark-streaming_2.11</artifactId>
                    <version>${spark.version}</version>
                    <scope>runtime</scope>
                    <exclusions>
                        <exclusion>
                            <groupId>org.slf4j</groupId>
                            <artifactId>jul-to-slf4j</artifactId>
                        </exclusion>
                    </exclusions>
                </dependency>
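                <!-- NOTE (my assumption): jackson-module-scala below is pinned
                     to ${jackson.version} = 2.13.0, but the spark-2.4.8-bin-hadoop2.7
                     distribution bundles the Jackson 2.6.7 line in its jars/
                     directory, which by default takes precedence over the
                     classes shaded into the submitted fat jar. -->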
                <dependency>
                    <groupId>com.fasterxml.jackson.module</groupId>
                    <artifactId>jackson-module-scala_2.11</artifactId>
                    <version>${jackson.version}</version>
                    <scope>runtime</scope>
                </dependency>
                <!-- [BEAM-3519] GCP IO exposes netty on its API surface,
                     causing conflicts with runners -->
                <dependency>
                    <groupId>org.apache.beam</groupId>
                    <artifactId>beam-sdks-java-io-google-cloud-platform</artifactId>
                    <version>${beam.version}</version>
                    <exclusions>
                        <exclusion>
                            <groupId>io.grpc</groupId>
                            <artifactId>grpc-netty</artifactId>
                        </exclusion>
                        <exclusion>
                            <groupId>io.netty</groupId>
                            <artifactId>netty-handler</artifactId>
                        </exclusion>
                    </exclusions>
                </dependency>
            </dependencies>
        </profile>
        <profile>
            <id>samza-runner</id>
            <dependencies>
                <dependency>
                    <groupId>org.apache.beam</groupId>
                    <artifactId>beam-runners-samza</artifactId>
                    <version>${beam.version}</version>
                    <scope>runtime</scope>
                </dependency>
            </dependencies>
        </profile>
        <profile>
            <id>twister2-runner</id>
            <dependencies>
                <dependency>
                    <groupId>org.apache.beam</groupId>
                    <artifactId>beam-runners-twister2</artifactId>
                    <version>${beam.version}</version>
                    <scope>runtime</scope>
                </dependency>
            </dependencies>
        </profile>
        <profile>
            <id>nemo-runner</id>
            <dependencies>
                <dependency>
                    <groupId>org.apache.nemo</groupId>
                    <artifactId>nemo-compiler-frontend-beam</artifactId>
                    <version>${nemo.version}</version>
                </dependency>
                <dependency>
                    <groupId>org.apache.hadoop</groupId>
                    <artifactId>hadoop-common</artifactId>
                    <version>${hadoop.version}</version>
                    <exclusions>
                        <exclusion>
                            <groupId>org.slf4j</groupId>
                            <artifactId>slf4j-api</artifactId>
                        </exclusion>
                        <exclusion>
                            <groupId>org.slf4j</groupId>
                            <artifactId>slf4j-log4j12</artifactId>
                        </exclusion>
                    </exclusions>
                </dependency>
            </dependencies>
        </profile>

        <profile>
            <id>jet-runner</id>
            <dependencies>
                <dependency>
                    <groupId>org.apache.beam</groupId>
                    <artifactId>beam-runners-jet</artifactId>
                    <version>${beam.version}</version>
                    <scope>runtime</scope>
                </dependency>
            </dependencies>
        </profile>

    </profiles>

    <dependencies>
        <!-- Adds a dependency on the Beam SDK. -->
        <dependency>
            <groupId>org.apache.beam</groupId>
            <artifactId>beam-sdks-java-core</artifactId>
            <version>${beam.version}</version>
        </dependency>

        <!-- Adds a dependency on the Beam Google Cloud Platform IO module. -->
        <dependency>
            <groupId>org.apache.beam</groupId>
            <artifactId>beam-sdks-java-io-google-cloud-platform</artifactId>
            <version>${beam.version}</version>
        </dependency>

        <!-- Dependencies below this line are specific dependencies needed
             by the examples code. -->
        <dependency>
            <groupId>com.google.api-client</groupId>
            <artifactId>google-api-client</artifactId>
            <version>${google-api-client.version}</version>
            <exclusions>
                <!-- Exclude an old version of guava that is being pulled
                     in by a transitive dependency of google-api-client -->
                <exclusion>
                    <groupId>com.google.guava</groupId>
                    <artifactId>guava-jdk5</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>com.google.apis</groupId>
            <artifactId>google-api-services-bigquery</artifactId>
            <version>${bigquery.version}</version>
            <exclusions>
                <!-- Exclude an old version of guava that is being pulled
                     in by a transitive dependency of google-api-client -->
                <exclusion>
                    <groupId>com.google.guava</groupId>
                    <artifactId>guava-jdk5</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>com.google.http-client</groupId>
            <artifactId>google-http-client</artifactId>
            <exclusions>
                <!-- Exclude an old version of guava that is being pulled
                     in by a transitive dependency of google-api-client -->
                <exclusion>
                    <groupId>com.google.guava</groupId>
                    <artifactId>guava-jdk5</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>com.google.apis</groupId>
            <artifactId>google-api-services-pubsub</artifactId>
            <version>${pubsub.version}</version>
            <exclusions>
                <!-- Exclude an old version of guava that is being pulled
                     in by a transitive dependency of google-api-client -->
                <exclusion>
                    <groupId>com.google.guava</groupId>
                    <artifactId>guava-jdk5</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>joda-time</groupId>
            <artifactId>joda-time</artifactId>
            <version>${joda.version}</version>
        </dependency>

        <!-- Add slf4j API frontend binding with JUL backend -->
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>${slf4j.version}</version>
        </dependency>

        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-jdk14</artifactId>
            <version>${slf4j.version}</version>
            <!-- When loaded at runtime this will wire up slf4j to the JUL backend -->
            <scope>runtime</scope>
        </dependency>

        <!-- Hamcrest and JUnit are required dependencies of PAssert,
             which is used in the main code of DebuggingWordCount example. -->
        <dependency>
            <groupId>org.hamcrest</groupId>
            <artifactId>hamcrest-core</artifactId>
            <version>${hamcrest.version}</version>
        </dependency>

        <dependency>
            <groupId>org.hamcrest</groupId>
            <artifactId>hamcrest-library</artifactId>
            <version>${hamcrest.version}</version>
        </dependency>

        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>${junit.version}</version>
        </dependency>

        <!-- The DirectRunner is needed for unit tests. -->
        <dependency>
            <groupId>org.apache.beam</groupId>
            <artifactId>beam-runners-direct-java</artifactId>
            <version>${beam.version}</version>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>org.mockito</groupId>
            <artifactId>mockito-core</artifactId>
            <version>${mockito.version}</version>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>com.google.guava</groupId>
                <artifactId>guava</artifactId>
                <version>${guava.version}</version>  <!-- "-jre" for Java 8 or higher -->
            </dependency>
            <!-- GCP libraries BOM sets the version for google http client -->
            <dependency>
                <groupId>com.google.cloud</groupId>
                <artifactId>libraries-bom</artifactId>
                <version>${libraries-bom.version}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>
</project>
