Re: Flink build issue

2021-09-26, post by Martin MA
BTW, I forgot to mention the environment:
Java 1.8.0_281
Scala 2.12.15
Maven 3.6.3

Martin MA wrote on Sun, Sep 26, 2021 at 11:30 PM:

> Hi team,
>
> I cloned the latest Flink code from the master branch and tried to build it on CentOS 7, but hit the following error:
>
> [image: image.png]
>
> Build command:
> mvn clean install -DskipTests -Dscala-2.12 -rf :flink-avro-confluent-registry
>
> Before that, building with mvn clean install -DskipTests -Dscala-2.12 failed with the following error:
> [image: image.png]
>
> So I manually downloaded the jar from the Maven repository and installed it locally with the following command, then rebuilt:
> mvn install:install-file -DgroupId=io.confluent -DartifactId=kafka-schema-registry-client -Dversion=5.5.2 -Dpackaging=jar -Dfile=kafka-schema-registry-client-5.5.2.jar
>
> How can I resolve this? Thanks
>


Flink build issue

2021-09-26, post by Martin MA
Hi team,

I cloned the latest Flink code from the master branch and tried to build it on CentOS 7, but hit the following error:

[image: image.png]

Build command:
mvn clean install -DskipTests -Dscala-2.12 -rf :flink-avro-confluent-registry

Before that, building with mvn clean install -DskipTests -Dscala-2.12 failed with the following error:
[image: image.png]

So I manually downloaded the jar from the Maven repository and installed it locally with the following command, then rebuilt:
mvn install:install-file -DgroupId=io.confluent -DartifactId=kafka-schema-registry-client -Dversion=5.5.2 -Dpackaging=jar -Dfile=kafka-schema-registry-client-5.5.2.jar

How can I resolve this? Thanks
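A note on the manual install step above: installing a jar with only -Dfile makes Maven generate a minimal POM with no dependency information, so any transitive dependencies of kafka-schema-registry-client are still unresolved. A sketch of an alternative, assuming the 5.5.2 artifact is published under the standard layout on Confluent's Maven repository (packages.confluent.io/maven), is to fetch the artifact's real POM as well and install both together:

```shell
# Sketch, assuming this path exists on Confluent's repository.
BASE=https://packages.confluent.io/maven/io/confluent/kafka-schema-registry-client/5.5.2
curl -fLO "$BASE/kafka-schema-registry-client-5.5.2.jar"
curl -fLO "$BASE/kafka-schema-registry-client-5.5.2.pom"

# -DpomFile installs the real dependency metadata alongside the jar;
# groupId/artifactId/version are read from the POM, so they need not be passed.
mvn install:install-file \
  -Dfile=kafka-schema-registry-client-5.5.2.jar \
  -DpomFile=kafka-schema-registry-client-5.5.2.pom
```

Alternatively, adding that repository URL as a &lt;repository&gt; entry in settings.xml or the build's POM would let Maven resolve the artifact directly, with no manual installation at all.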


Flink build issue

2019-08-20, post by 戴嘉诚

Hi all,

I'm using CDH 6.3.0 to manage Hadoop here. Following the official documentation, I rebuilt and repackaged the Flink source against CDH 6.3.0, but the build seems to hit a problem:

[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-compiler-plugin:3.8.0:testCompile 
(default-testCompile) on project flink-yarn_2.11: Compilation failure
[ERROR] 
/data1/flink/flink/flink-yarn/src/test/java/org/apache/flink/yarn/AbstractYarnClusterTest.java:[89,41]
 no suitable method found for 
newInstance(org.apache.hadoop.yarn.api.records.ApplicationId,org.apache.hadoop.yarn.api.records.ApplicationAttemptId,java.lang.String,java.lang.String,java.lang.String,java.lang.String,int,,org.apache.hadoop.yarn.api.records.YarnApplicationState,,,long,long,org.apache.hadoop.yarn.api.records.FinalApplicationStatus,,,float,,)
[ERROR] method 
org.apache.hadoop.yarn.api.records.ApplicationReport.newInstance(org.apache.hadoop.yarn.api.records.ApplicationId,org.apache.hadoop.yarn.api.records.ApplicationAttemptId,java.lang.String,java.lang.String,java.lang.String,java.lang.String,int,org.apache.hadoop.yarn.api.records.Token,org.apache.hadoop.yarn.api.records.YarnApplicationState,java.lang.String,java.lang.String,long,long,long,org.apache.hadoop.yarn.api.records.FinalApplicationStatus,org.apache.hadoop.yarn.api.records.ApplicationResourceUsageReport,java.lang.String,float,java.lang.String,org.apache.hadoop.yarn.api.records.Token)
 is not applicable
[ERROR]   (actual and formal argument lists differ in length)
[ERROR] method 
org.apache.hadoop.yarn.api.records.ApplicationReport.newInstance(org.apache.hadoop.yarn.api.records.ApplicationId,org.apache.hadoop.yarn.api.records.ApplicationAttemptId,java.lang.String,java.lang.String,java.lang.String,java.lang.String,int,org.apache.hadoop.yarn.api.records.Token,org.apache.hadoop.yarn.api.records.YarnApplicationState,java.lang.String,java.lang.String,long,long,org.apache.hadoop.yarn.api.records.FinalApplicationStatus,org.apache.hadoop.yarn.api.records.ApplicationResourceUsageReport,java.lang.String,float,java.lang.String,org.apache.hadoop.yarn.api.records.Token,java.util.Set,boolean,org.apache.hadoop.yarn.api.records.Priority,java.lang.String,java.lang.String)
 is not applicable
[ERROR]   (actual and formal argument lists differ in length)
[ERROR] method 
org.apache.hadoop.yarn.api.records.ApplicationReport.newInstance(org.apache.hadoop.yarn.api.records.ApplicationId,org.apache.hadoop.yarn.api.records.ApplicationAttemptId,java.lang.String,java.lang.String,java.lang.String,java.lang.String,int,org.apache.hadoop.yarn.api.records.Token,org.apache.hadoop.yarn.api.records.YarnApplicationState,java.lang.String,java.lang.String,long,long,long,org.apache.hadoop.yarn.api.records.FinalApplicationStatus,org.apache.hadoop.yarn.api.records.ApplicationResourceUsageReport,java.lang.String,float,java.lang.String,org.apache.hadoop.yarn.api.records.Token,java.util.Set,boolean,org.apache.hadoop.yarn.api.records.Priority,java.lang.String,java.lang.String)
 is not applicable
[ERROR]   (actual and formal argument lists differ in length)

PS: The command I ran was: mvn clean install -DskipTests -Pvendor-repos -Dhadoop.version=3.0.0-cdh6.3.0
(this Hadoop is version 3.0)
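One detail worth noting about the command above: -DskipTests only skips test *execution*; Maven still compiles test sources, and the failure here occurs in the testCompile phase of flink-yarn, where AbstractYarnClusterTest calls an ApplicationReport.newInstance overload that does not match the CDH 6.3.0 Hadoop API. A workaround sketch (it sidesteps rather than fixes the API mismatch) is to skip test compilation entirely:

```shell
# -Dmaven.test.skip=true skips both compiling and running tests,
# so the incompatible test source is never compiled.
mvn clean install -Dmaven.test.skip=true \
    -Pvendor-repos -Dhadoop.version=3.0.0-cdh6.3.0
```

The resulting artifacts are the same as with -DskipTests; the trade-off is that no test code is built at all, so genuine test-scope problems would go unnoticed.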