Hi, Mich

> [WARNING] Expected all dependencies to require Scala version: 2.10.4
> [WARNING] spark:scala:1.0 requires scala version: 2.11.7
> [WARNING] Multiple versions of scala libraries detected!
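[Editor's note: the warning above means the build's Scala tooling defaults to 2.10.4 while a dependency pulls in 2.11.7. A minimal, illustrative pom fragment for aligning everything on one Scala line follows; the property names (`scala.version`, `scala.binary.version`) are the conventional quickstart ones, not taken from the build in this thread:]

```xml
<properties>
    <!-- pin one Scala line for the whole build; 2.11.7 matches the warning above -->
    <scala.version>2.11.7</scala.version>
    <scala.binary.version>2.11</scala.binary.version>
</properties>

<dependencies>
    <!-- the Scala library itself at the pinned version -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <!-- every Scala-dependent artifact carries the same _2.11 suffix -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
        <version>1.5.0</version>
    </dependency>
</dependencies>
```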
I think you should set your Scala version to 2.11 first and try again.

Cheers
Minglei

> On 1 Jul 2018, at 9:24 PM, Mich Talebzadeh <[email protected]> wrote:
>
> Hi Minglei,
>
> Many thanks
>
> My flink version is 1.5
>
> This is the pom.xml from GitHub as suggested
>
> <!--
> Licensed to the Apache Software Foundation (ASF) under one
> or more contributor license agreements. See the NOTICE file
> distributed with this work for additional information
> regarding copyright ownership. The ASF licenses this file
> to you under the Apache License, Version 2.0 (the
> "License"); you may not use this file except in compliance
> with the License. You may obtain a copy of the License at
>
>     http://www.apache.org/licenses/LICENSE-2.0
>
> Unless required by applicable law or agreed to in writing,
> software distributed under the License is distributed on an
> "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
> KIND, either express or implied. See the License for the
> specific language governing permissions and limitations
> under the License.
> -->
> <project xmlns="http://maven.apache.org/POM/4.0.0"
>          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
>          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
>     <modelVersion>4.0.0</modelVersion>
>
>     <groupId>${groupId}</groupId>
>     <artifactId>${artifactId}</artifactId>
>     <version>1.5</version>
>     <packaging>jar</packaging>
>
>     <name>Flink Quickstart Job</name>
>     <url>http://www.myorganization.org</url>
>
>     <properties>
>         <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>         <flink.version>@project.version@</flink.version>
>         <java.version>1.8</java.version>
>         <scala.binary.version>2.11</scala.binary.version>
>         <maven.compiler.source>2.11</maven.compiler.source>
>         <maven.compiler.target>2.11</maven.compiler.target>
>     </properties>
>
>     <repositories>
>         <repository>
>             <id>apache.snapshots</id>
>             <name>Apache Development Snapshot Repository</name>
>             <url>https://repository.apache.org/content/repositories/snapshots/</url>
>             <releases>
>                 <enabled>false</enabled>
>             </releases>
>             <snapshots>
>                 <enabled>true</enabled>
>             </snapshots>
>         </repository>
>     </repositories>
>
>     <dependencies>
>         <!-- Apache Flink dependencies -->
>         <!-- These dependencies are provided, because they should not be packaged into the JAR file. -->
>         <dependency>
>             <groupId>org.apache.flink</groupId>
>             <artifactId>flink-java</artifactId>
>             <version>1.5</version>
>             <scope>provided</scope>
>         </dependency>
>         <dependency>
>             <groupId>org.apache.flink</groupId>
>             <artifactId>flink-streaming-java_2.11</artifactId>
>             <version>1.5</version>
>             <scope>provided</scope>
>         </dependency>
>
>         <!-- Add connector dependencies here. They must be in the default scope (compile). -->
>         <!-- Example:
>         <dependency>
>             <groupId>org.apache.flink</groupId>
>             <artifactId>flink-connector-kafka-0.10_2.11</artifactId>
>             <version>1.5</version>
>         </dependency>
>         -->
>
>         <!-- Add logging framework, to produce console output when running in the IDE. -->
>         <!-- These dependencies are excluded from the application JAR by default. -->
>         <dependency>
>             <groupId>org.slf4j</groupId>
>             <artifactId>slf4j-log4j12</artifactId>
>             <version>1.7.7</version>
>             <scope>runtime</scope>
>         </dependency>
>         <dependency>
>             <groupId>log4j</groupId>
>             <artifactId>log4j</artifactId>
>             <version>1.2.17</version>
>             <scope>runtime</scope>
>         </dependency>
>     </dependencies>
>
>     <build>
>         <plugins>
>             <!-- Java Compiler -->
>             <plugin>
>                 <groupId>org.apache.maven.plugins</groupId>
>                 <artifactId>maven-compiler-plugin</artifactId>
>                 <version>3.1</version>
>                 <configuration>
>                     <source>2.11</source>
>                     <target>2.11</target>
>                 </configuration>
>             </plugin>
>
>             <!-- We use the maven-shade plugin to create a fat jar that contains all necessary dependencies. -->
>             <!-- Change the value of <mainClass>...</mainClass> if your program entry point changes. -->
>             <plugin>
>                 <groupId>org.apache.maven.plugins</groupId>
>                 <artifactId>maven-shade-plugin</artifactId>
>                 <version>3.0.0</version>
>                 <executions>
>                     <!-- Run shade goal on package phase -->
>                     <execution>
>                         <phase>package</phase>
>                         <goals>
>                             <goal>shade</goal>
>                         </goals>
>                         <configuration>
>                             <artifactSet>
>                                 <excludes>
>                                     <exclude>org.apache.flink:force-shading</exclude>
>                                     <exclude>com.google.code.findbugs:jsr305</exclude>
>                                     <exclude>org.slf4j:*</exclude>
>                                     <exclude>log4j:*</exclude>
>                                 </excludes>
>                             </artifactSet>
>                             <filters>
>                                 <filter>
>                                     <!-- Do not copy the signatures in the META-INF folder.
>                                          Otherwise, this might cause SecurityExceptions when using the JAR. -->
>                                     <artifact>*:*</artifact>
>                                     <excludes>
>                                         <exclude>META-INF/*.SF</exclude>
>                                         <exclude>META-INF/*.DSA</exclude>
>                                         <exclude>META-INF/*.RSA</exclude>
>                                     </excludes>
>                                 </filter>
>                             </filters>
>                             <transformers>
>                                 <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
>                                     <mainClass>${package}.StreamingJob</mainClass>
>                                 </transformer>
>                             </transformers>
>                         </configuration>
>                     </execution>
>                 </executions>
>             </plugin>
>         </plugins>
>
>         <pluginManagement>
>             <plugins>
>                 <!-- If you want to use Java 8 Lambda Expressions uncomment the following lines -->
>                 <!--
>                 <plugin>
>                     <artifactId>maven-compiler-plugin</artifactId>
>                     <configuration>
>                         <source>2.11</source>
>                         <target>2.11</target>
>                         <compilerId>jdt</compilerId>
>                     </configuration>
>                     <dependencies>
>                         <dependency>
>                             <groupId>org.eclipse.tycho</groupId>
>                             <artifactId>tycho-compiler-jdt</artifactId>
>                             <version>0.21.0</version>
>                         </dependency>
>                     </dependencies>
>                 </plugin>
>                 -->
>
>                 <!-- This improves the out-of-the-box experience in Eclipse by resolving some warnings. -->
>                 <plugin>
>                     <groupId>org.eclipse.m2e</groupId>
>                     <artifactId>lifecycle-mapping</artifactId>
>                     <version>1.0.0</version>
>                     <configuration>
>                         <lifecycleMappingMetadata>
>                             <pluginExecutions>
>                                 <pluginExecution>
>                                     <pluginExecutionFilter>
>                                         <groupId>org.apache.maven.plugins</groupId>
>                                         <artifactId>maven-shade-plugin</artifactId>
>                                         <versionRange>[3.0.0,)</versionRange>
>                                         <goals>
>                                             <goal>shade</goal>
>                                         </goals>
>                                     </pluginExecutionFilter>
>                                     <action>
>                                         <ignore/>
>                                     </action>
>                                 </pluginExecution>
>                                 <pluginExecution>
>                                     <pluginExecutionFilter>
>                                         <groupId>org.apache.maven.plugins</groupId>
>                                         <artifactId>maven-compiler-plugin</artifactId>
>                                         <versionRange>[3.1,)</versionRange>
>                                         <goals>
>                                             <goal>testCompile</goal>
>                                             <goal>compile</goal>
>                                         </goals>
>                                     </pluginExecutionFilter>
>                                     <action>
>                                         <ignore/>
>                                     </action>
>                                 </pluginExecution>
>                             </pluginExecutions>
>                         </lifecycleMappingMetadata>
>                     </configuration>
>                 </plugin>
>             </plugins>
>         </pluginManagement>
>     </build>
>
>     <!-- This profile helps to make things run out of the box in IntelliJ -->
>     <!-- It adds Flink's core classes to the runtime class path. -->
>     <!-- Otherwise they are missing in IntelliJ, because the dependency is 'provided' -->
>     <profiles>
>         <profile>
>             <id>add-dependencies-for-IDEA</id>
>             <activation>
>                 <property>
>                     <name>idea.version</name>
>                 </property>
>             </activation>
>             <dependencies>
>                 <dependency>
>                     <groupId>org.apache.flink</groupId>
>                     <artifactId>flink-java</artifactId>
>                     <version>1.5</version>
>                     <scope>compile</scope>
>                 </dependency>
>                 <dependency>
>                     <groupId>org.apache.flink</groupId>
>                     <artifactId>flink-streaming-java_2.11</artifactId>
>                     <version>1.5</version>
>                     <scope>compile</scope>
>                 </dependency>
>             </dependencies>
>         </profile>
>     </profiles>
> </project>
>
> But I am still getting the same errors for input
>
> [INFO] Scanning for projects...
> [INFO]
> [INFO] ------------------------------------------------------------------------
> [INFO] Building scala 1.0
> [INFO] ------------------------------------------------------------------------
> [INFO]
> [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ scala ---
> [INFO] Using 'UTF-8' encoding to copy filtered resources.
> [INFO] skip non existing resourceDirectory /home/hduser/dba/bin/flink/md_streaming/src/main/resources
> [INFO]
> [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ scala ---
> [INFO] Nothing to compile - all classes are up to date
> [INFO]
> [INFO] --- maven-scala-plugin:2.15.2:compile (default) @ scala ---
> [INFO] Checking for multiple versions of scala
> [WARNING] Expected all dependencies to require Scala version: 2.10.4
> [WARNING] spark:scala:1.0 requires scala version: 2.11.7
> [WARNING] Multiple versions of scala libraries detected!
> [INFO] includes = [**/*.java,**/*.scala,]
> [INFO] excludes = []
> [INFO] /home/hduser/dba/bin/flink/md_streaming/src/main/scala:-1: info: compiling
> [INFO] Compiling 1 source files to /home/hduser/dba/bin/flink/md_streaming/target/classes at 1530451461171
> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:3: error: object flink is not a member of package org.apache
> [INFO] import org.apache.flink.api.common.functions.MapFunction
> [INFO]                   ^
> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:4: error: object flink is not a member of package org.apache
> [INFO] import org.apache.flink.api.java.utils.ParameterTool
> [INFO]                   ^
> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:5: error: object flink is not a member of package org.apache
> [INFO] import org.apache.flink.streaming.api.datastream.DataStream
> [INFO]                   ^
> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:6: error: object flink is not a member of package org.apache
> [INFO] import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment
> [INFO]                   ^
> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:7: error: object flink is not a member of package org.apache
> [INFO] import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09
> [INFO]                   ^
> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:8: error: object flink is not a member of package org.apache
> [INFO] import org.apache.flink.streaming.util.serialization.SimpleStringSchema
> [INFO]                   ^
> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:9: error: object flink is not a member of package org.apache
> [INFO] import org.apache.flink.streaming.util.serialization.DeserializationSchema
> [INFO]                   ^
> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:10: error: object flink is not a member of package org.apache
> [INFO] import org.apache.flink.streaming.util.serialization.SimpleStringSchema
> [INFO]                   ^
> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:18: error: not found: value StreamExecutionEnvironment
> [INFO] val env = StreamExecutionEnvironment.getExecutionEnvironment
> [INFO]           ^
> [ERROR] 9 errors found
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
>
>
> Dr Mich Talebzadeh
>
> LinkedIn https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
> http://talebzadehmich.wordpress.com
>
> Disclaimer: Use it at your own risk.
> Any and all responsibility for any loss, damage or destruction of data or any other property which may arise from relying on this email's technical content is explicitly disclaimed. The author will in no case be liable for any monetary damages arising from such loss, damage or destruction.
>
>
> On Sun, 1 Jul 2018 at 14:07, zhangminglei <[email protected]> wrote:
> Hi, Mich.
>
>> Is there a basic MVN pom file for flink? The default one from GitHub does
>> not seem to be working!
>
> Please take a look at
> https://github.com/apache/flink/blob/master/flink-quickstart/flink-quickstart-java/src/main/resources/archetype-resources/pom.xml
>
> Cheers
> Minglei
>
>
>> On 1 Jul 2018, at 7:44 PM, Mich Talebzadeh <[email protected]> wrote:
>>
>> I have done this many times with sbt or maven for spark streaming.
>>
>> Trying to compile a simple program that compiles ok in flink-scala.sh
>>
>> The imports are as follows:
>>
>> import java.util.Properties
>> import java.util.Arrays
>> import org.apache.flink.api.common.functions.MapFunction
>> import org.apache.flink.api.java.utils.ParameterTool
>> import org.apache.flink.streaming.api.datastream.DataStream
>> import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment
>> import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09
>> import org.apache.flink.streaming.util.serialization.SimpleStringSchema
>> import org.apache.flink.streaming.util.serialization.DeserializationSchema
>> import org.apache.flink.streaming.util.serialization.SimpleStringSchema
>>
>> With the following jars added to the classpath it compiles:
>>
>> flink-connector-kafka-0.9_2.11-1.5.0.jar
>> flink-connector-kafka-base_2.11-1.5.0.jar
>>
>> I guess my pom.xml is incorrect.
>>
>> I have added these two dependencies to the pom.xml file:
>>
>> <dependency>
>>     <groupId>org.apache.flink</groupId>
>>     <artifactId>flink-streaming-java_2.11</artifactId>
>>     <version>1.4.2</version>
>>     <scope>provided</scope>
>> </dependency>
>> <dependency>
>>     <groupId>org.apache.flink</groupId>
>>     <artifactId>flink-core</artifactId>
>>     <version>1.5.0</version>
>> </dependency>
>>
>> However, I am getting these basic errors!
>>
>> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:4: error: object flink is not a member of package org.apache
>> [INFO] import org.apache.flink.api.java.utils.ParameterTool
>> [INFO]                   ^
>> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:5: error: object flink is not a member of package org.apache
>> [INFO] import org.apache.flink.streaming.api.datastream.DataStream
>> [INFO]                   ^
>> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:6: error: object flink is not a member of package org.apache
>> [INFO] import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment
>> [INFO]                   ^
>> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:7: error: object flink is not a member of package org.apache
>> [INFO] import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09
>> [INFO]                   ^
>> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:8: error: object flink is not a member of package org.apache
>> [INFO] import org.apache.flink.streaming.util.serialization.SimpleStringSchema
>> [INFO]                   ^
>> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:9: error: object flink is not a member of package org.apache
>> [INFO] import org.apache.flink.streaming.util.serialization.DeserializationSchema
>> [INFO]                   ^
>> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:10: error: object flink is not a member of package org.apache
>> [INFO] import org.apache.flink.streaming.util.serialization.SimpleStringSchema
>> [INFO]                   ^
>> [ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:18: error: not found: value StreamExecutionEnvironment
>> [INFO] val env = StreamExecutionEnvironment.getExecutionEnvironment
>>
>> Is there a basic MVN pom file for flink? The default one from GitHub does
>> not seem to be working!
>>
>> Thanks
>>
>> Dr Mich Talebzadeh
>
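[Editor's note: the thread's logs show two separate mismatches: Flink artifacts at mixed versions (flink-streaming-java_2.11 at 1.4.2 next to flink-core at 1.5.0), and the old maven-scala-plugin defaulting to Scala 2.10.4 while the project declares 2.11.7. A consolidated sketch follows; it is illustrative only, assuming the Flink 1.5.0 / Scala 2.11 combination discussed above, and substitutes the actively maintained `net.alchim31.maven` scala-maven-plugin for the retired maven-scala-plugin:]

```xml
<properties>
    <!-- one Flink version and one Scala line for the whole build -->
    <flink.version>1.5.0</flink.version>
    <scala.version>2.11.7</scala.version>
    <scala.binary.version>2.11</scala.binary.version>
</properties>

<dependencies>
    <!-- same ${flink.version} everywhere; mixing 1.4.2 and 1.5.0 is one of the mismatches above -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-scala_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka-0.9_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
</dependencies>

<build>
    <plugins>
        <!-- scala-maven-plugin compiles against ${scala.version} instead of the
             maven-scala-plugin's 2.10.4 default seen in the log above -->
        <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.2.2</version>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
```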
