Hi,

We wrote a Python script following the DataStream API user guide for the Python API on the Flink website: https://ci.apache.org/projects/flink/flink-docs-release-1.12/zh/dev/python/datastream-api-users-guide/operators.html

Flink runs on YARN, and the script is specified with the -py option. The job is submitted to YARN successfully, but then fails with the following error:

Job has been submitted with JobID ee9e3a89eae69f457b81d1ebf4a45264
Traceback (most recent call last):
  File "official_example_2blk.py", line 44, in <module>
    env.execute("tutorial_job")
  File "/usr/local/service/flink-1.12.0/opt/python/pyflink.zip/pyflink/datastream/stream_execution_environment.py", line 623, in execute
  File "/usr/local/service/flink-1.12.0/opt/python/py4j-0.10.8.1-src.zip/py4j/java_gateway.py", line 1286, in __call__
  File "/usr/local/service/flink-1.12.0/opt/python/pyflink.zip/pyflink/util/exceptions.py", line 147, in deco
  File "/usr/local/service/flink-1.12.0/opt/python/py4j-0.10.8.1-src.zip/py4j/protocol.py", line 328, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o2.execute.
: java.util.concurrent.ExecutionException: org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: ee9e3a89eae69f457b81d1ebf4a45264)

The complete stack trace is in the attached log. Could you please help us look into the exact cause?
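For context, the failure happens at env.execute("tutorial_job") on line 44 of our script. The script follows the structure of the Python DataStream examples in the linked guide; below is a simplified sketch of that structure, not the exact official_example_2blk.py (the /tmp/output path, the identity map, and the submission command in the first comment are placeholders based on the Flink 1.12 documentation):

# Submitted roughly as: flink run -m yarn-cluster -py official_example_2blk.py
from pyflink.common.serialization import SimpleStringEncoder
from pyflink.common.typeinfo import Types
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.datastream.connectors import StreamingFileSink


def tutorial():
    env = StreamExecutionEnvironment.get_execution_environment()
    env.set_parallelism(1)

    # Bounded stream built from an in-memory collection, as in the guide.
    ds = env.from_collection(
        collection=[(1, 'aaa'), (2, 'bbb')],
        type_info=Types.ROW([Types.INT(), Types.STRING()]))

    # A simple Python map operator, following the operators page.
    ds = ds.map(lambda row: row,
                output_type=Types.ROW([Types.INT(), Types.STRING()]))

    # Write the rows out with the StreamingFileSink connector.
    ds.add_sink(StreamingFileSink
                .for_row_format('/tmp/output', SimpleStringEncoder())
                .build())

    # The failing call in our script is the equivalent of this line.
    env.execute("tutorial_job")


if __name__ == '__main__':
    tutorial()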
Attachment: flink.log
