What command did you run? Which directory did you run it from, and how does that directory relate to the directory where you downloaded and extracted Flink?

Best,
tison.


air23 <[email protected]> 于2020年5月29日周五 下午2:35写道:

> The code is just the example that ships with Flink.
>
> import org.apache.flink.api.common.functions.FlatMapFunction;
> import org.apache.flink.streaming.api.datastream.DataStream;
> import org.apache.flink.streaming.api.datastream.DataStreamSource;
> import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
> import org.apache.flink.streaming.api.windowing.time.Time;
> import org.apache.flink.util.Collector;
>
> public class WordCountStreamingByJava {
>
>     public static void main(String[] args) throws Exception {
>
>         // Create the execution environment
>         StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
>
>         // Set up the socket data source
>         DataStreamSource<String> source = env.socketTextStream("zongteng75", 9001, "\n");
>
>         // Transform the data
>         DataStream<WordWithCount> dataStream = source
>                 .flatMap(new FlatMapFunction<String, WordWithCount>() {
>                     @Override
>                     public void flatMap(String line, Collector<WordWithCount> collector) throws Exception {
>                         System.out.println(line);
>                         for (String word : line.split(" ")) {
>                             collector.collect(new WordWithCount(word, 1));
>                         }
>                     }
>                 })
>                 .keyBy("word")                                 // group by word
>                 .timeWindow(Time.seconds(2), Time.seconds(2))  // window over the stream to simulate flowing data
>                 .sum("count");                                 // count the words within each time window
>
>         // Print the results
>         dataStream.print();
>
>         // Execute the job
>         env.execute("Flink Streaming Word Count By Java");
>     }
> }
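>
> The WordWithCount POJO is not included in the snippet above; for keyBy("word") and
> sum("count") to resolve the fields it has to be a Flink POJO with a public no-arg
> constructor and public word/count fields. A minimal sketch along the lines of the
> bundled example, declared as a static nested class of WordCountStreamingByJava:
>
>     public static class WordWithCount {
>         public String word;
>         public long count;
>
>         public WordWithCount() {}
>
>         public WordWithCount(String word, long count) {
>             this.word = word;
>             this.count = count;
>         }
>
>         @Override
>         public String toString() {
>             return word + " : " + count;
>         }
>     }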
>
> Now that I have added the Flink environment variables, this example passes. Which is quite strange.
>
> On 2020-05-29 14:22:39, "tison" <[email protected]> wrote:
> >Also, if possible, please paste the code snippets around execute, or even the whole main, via a gist (x)
> >
> >Best,
> >tison.
> >
> >
> >tison <[email protected]> 于2020年5月29日周五 下午2:21写道:
> >
> >> This problem is really odd. Normally this would be caught when the job graph is compiled
> >> at env.execute, and the job shouldn't actually get scheduled at all. Could you describe in
> >> detail how you submit the job and where this error is reported (client? cluster?)?
> >>
> >> Best,
> >> tison.
> >>
> >>
> >> air23 <[email protected]> 于2020年5月29日周五 下午1:38写道:
> >>
> >>> Running Flink 1.10 on CDH YARN fails with the error below, while version 1.7.2 works fine.
> >>> flink-shaded-hadoop-2-uber-2.6.5-10.0.jar has also been added,
> >>> and the Hadoop environment variable is set: export HADOOP_CONF_DIR=/etc/hadoop/conf
> >>> Any help would be appreciated.
> >>>
> >>> org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: e358699c1be6be1472078771e1fd027f)
> >>>     at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:335)
> >>>     at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:205)
> >>>     at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:138)
> >>>     at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:662)
> >>>     at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
> >>>     at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:893)
> >>>     at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:966)
> >>>     at java.security.AccessController.doPrivileged(Native Method)
> >>>     at javax.security.auth.Subject.doAs(Subject.java:422)
> >>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
> >>>     at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
> >>>     at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:966)
> >>> Caused by: java.util.concurrent.ExecutionException: org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: e358699c1be6be1472078771e1fd027f)
> >>>     at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
> >>>     at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
> >>>     at org.apache.flink.streaming.api.environment.StreamContextEnvironment.execute(StreamContextEnvironment.java:83)
> >>>     at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1620)
> >>>     at tt.WordCountStreamingByJava.main(WordCountStreamingByJava.java:36)
> >>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >>>     at java.lang.reflect.Method.invoke(Method.java:498)
> >>>     at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:321)
> >>>     ... 11 more
> >>> Caused by: org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: e358699c1be6be1472078771e1fd027f)
> >>>     at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:112)
> >>>     at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:602)
> >>>     at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:577)
> >>>     at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
> >>>     at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1962)
> >>>     at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$21(RestClusterClient.java:565)
> >>>     at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:760)
> >>>     at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:736)
> >>>     at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
> >>>     at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1962)
> >>>     at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$8(FutureUtils.java:291)
> >>>     at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:760)
> >>>     at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:736)
> >>>     at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
> >>>     at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:561)
> >>>     at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:929)
> >>>     at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:442)
> >>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> >>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> >>>     at java.lang.Thread.run(Thread.java:748)
> >>> Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
> >>>     at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:147)
> >>>     at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:110)
> >>>     ... 19 more
> >>> Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
> >>>     at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:110)
> >>>     at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:76)
> >>>     at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:192)
> >>>     at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:186)
> >>>     at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:180)
> >>>     at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:496)
> >>>     at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:380)
> >>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >>>     at java.lang.reflect.Method.invoke(Method.java:498)
> >>>     at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:284)
> >>>     at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:199)
> >>>     at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
> >>>     at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
> >>>     at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
> >>>     at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
> >>>     at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
> >>>     at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
> >>>     at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
> >>>     at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
> >>>     at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
> >>>     at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
> >>>     at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
> >>>     at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
> >>>     at akka.actor.ActorCell.invoke(ActorCell.scala:561)
> >>>     at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
> >>>     at akka.dispatch.Mailbox.run(Mailbox.scala:225)
> >>>     at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
> >>>     at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
> >>>     at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
> >>>     at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
> >>>     at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> >>> Caused by: java.net.ConnectException: Connection refused (Connection refused)
> >>>     at java.net.PlainSocketImpl.socketConnect(Native Method)
> >>>     at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
> >>>     at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
> >>>     at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
> >>>     at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
> >>>     at java.net.Socket.connect(Socket.java:606)
> >>>     at org.apache.flink.streaming.api.functions.source.SocketTextStreamFunction.run(SocketTextStreamFunction.java:97)
> >>>     at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:100)
> >>>     at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:63)
> >>>     at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:200)
> >>
> >>
>
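
The innermost cause at the bottom of the quoted trace is a plain TCP "Connection refused"
thrown from SocketTextStreamFunction, i.e. nothing was accepting connections on
zongteng75:9001 when the source task started on the cluster. A minimal, Flink-independent
way to check that from the host the task runs on; hostname and port are copied from the
snippet above, and this little class is only a diagnostic sketch, not part of the job:

import java.net.Socket;

public class SocketCheck {
    public static void main(String[] args) throws Exception {
        // Throws java.net.ConnectException ("Connection refused") if nothing
        // is listening on the port, i.e. the same condition the Flink source hit.
        try (Socket socket = new Socket("zongteng75", 9001)) {
            System.out.println("connected to " + socket.getRemoteSocketAddress());
        }
    }
}

If this fails with the same ConnectException, the problem is that the socket source's
endpoint is not reachable from the TaskManager host, rather than the CDH/YARN setup itself.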
