Just noticed that you are using YARN application mode. PyFlink only supports YARN application mode as of 1.14.0, which mainly added the command-line option "-pyclientexec" and the configuration "python.client.executable": https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/dev/python/python_config/#python-client-executable
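Based on the submission command quoted later in this thread, the upgraded invocation would look roughly like the sketch below. This is only an illustration: the paths, queue name, and entry class are taken from the reporter's setup, and the exact flink distribution path is an assumption.

```shell
# Sketch of a 1.14.0 submission with -pyclientexec added, so that the
# client side (which compiles the job and starts the Python callback
# server) also uses the Python interpreter from the shipped archive.
# Paths and names below come from the original report, not verified here.
./flink-1.14.0/bin/flink run-application -t yarn-application \
  -Dyarn.provided.lib.dirs="hdfs://nameservice1/user/flink/flinklib" \
  -Dyarn.application.queue=d \
  -p 1 \
  -pyarch /opt/venv.zip \
  -pyexec venv.zip/venv/bin/python \
  -pyclientexec venv.zip/venv/bin/python \
  -pyfs /opt/test.py \
  -c test.PyUDFTest \
  /opt/flink-python-test-1.0-SNAPSHOT.jar
```

Note that -pyexec selects the interpreter for UDF execution on the cluster, while -pyclientexec selects the interpreter used during job compilation; both must point at an environment that has pyflink installed.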
For this job of yours, you need to use version 1.14.0 and add the command-line option: -pyclientexec venv.zip/venv/bin/python

On Fri, Nov 19, 2021 at 10:48 AM Asahi Lee <[email protected]> wrote:

> I tried both `source my_env/bin/activate` and setting the environment variable PYFLINK_CLIENT_EXECUTABLE, and neither made the submission succeed; the error is unchanged.
> In my case, the jobmanager log shows "No module named pyflink", and the jobmanager runs on the YARN cluster.
> What else do I still need to configure?
>
> LogType:jobmanager.out
> Log Upload Time: Thu Nov 18 20:48:45 +0800 2021
> LogLength:37
> Log Contents:
> /bin/python: No module named pyflink
>
>
> ------------------ Original message ------------------
> From: "user-zh" <[email protected]>;
> Sent: Friday, November 19, 2021, 9:38 AM
> To: "user-zh" <[email protected]>;
> Subject: Re: Where can I download the downloads/setup-pyflink-virtual-env.sh script
>
> -pyexec specifies the Python environment used on the cluster side; the client also needs a Python environment to compile the Flink job. Have a look at this document:
> https://nightlies.apache.org/flink/flink-docs-release-1.13/docs/dev/python/dependency_management/#python-interpreter-of-client
>
> On Thu, Nov 18, 2021 at 9:00 PM Asahi Lee <[email protected]> wrote:
>
> > Hi!
> > I am using a Python UDF in the Java Table API and submitting the application with the command below, but it fails with a "Python callback server start failed" error. Is my way of submitting correct? The jm log shows /bin/python: No module named pyflink.
> >
> > ./flink-1.13.2/bin/flink
> > run-application -t yarn-application
> > -Dyarn.provided.lib.dirs="hdfs://nameservice1/user/flink/flinklib"
> > -Dyarn.application.queue=d
> > -p 1
> > -pyarch /opt/venv.zip
> > -pyexec venv.zip/venv/bin/python
> > -pyfs /opt/test.py
> > -c test.PyUDFTest
> > /opt/flink-python-test-1.0-SNAPSHOT.jar
> >
> > Error:
> > Caused by: java.lang.RuntimeException: Python callback server start failed!
> >         at org.apache.flink.client.python.PythonFunctionFactory.createPythonFunctionFactory(PythonFunctionFactory.java:167) ~[flink-python_2.11-1.13.2.jar:1.13.2]
> >         at org.apache.flink.client.python.PythonFunctionFactory$1.load(PythonFunctionFactory.java:88) ~[flink-python_2.11-1.13.2.jar:1.13.2]
> >         at org.apache.flink.client.python.PythonFunctionFactory$1.load(PythonFunctionFactory.java:84) ~[flink-python_2.11-1.13.2.jar:1.13.2]
> >         at org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache.get(LocalCache.java:3937) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.client.python.PythonFunctionFactory.getPythonFunction(PythonFunctionFactory.java:129) ~[flink-python_2.11-1.13.2.jar:1.13.2]
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_111]
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_111]
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_111]
> >         at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_111]
> >         at org.apache.flink.table.functions.python.utils.PythonFunctionUtils.getPythonFunction(PythonFunctionUtils.java:45) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.table.functions.UserDefinedFunctionHelper.instantiateFunction(UserDefinedFunctionHelper.java:206) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.table.catalog.FunctionCatalog.getFunctionDefinition(FunctionCatalog.java:659) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.table.catalog.FunctionCatalog.resolveAmbiguousFunctionReference(FunctionCatalog.java:606) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.table.catalog.FunctionCatalog.lookupFunction(FunctionCatalog.java:362) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.table.planner.catalog.FunctionCatalogOperatorTable.lookupOperatorOverloads(FunctionCatalogOperatorTable.java:97) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.calcite.sql.util.ChainedSqlOperatorTable.lookupOperatorOverloads(ChainedSqlOperatorTable.java:67) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.calcite.sql.validate.SqlValidatorImpl.performUnconditionalRewrites(SqlValidatorImpl.java:1183) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.calcite.sql.validate.SqlValidatorImpl.performUnconditionalRewrites(SqlValidatorImpl.java:1169) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.calcite.sql.validate.SqlValidatorImpl.performUnconditionalRewrites(SqlValidatorImpl.java:1200) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.calcite.sql.validate.SqlValidatorImpl.performUnconditionalRewrites(SqlValidatorImpl.java:1169) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.calcite.sql.validate.SqlValidatorImpl.validateScopedExpression(SqlValidatorImpl.java:945) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.calcite.sql.validate.SqlValidatorImpl.validate(SqlValidatorImpl.java:704) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$validate(FlinkPlannerImpl.scala:152) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.validate(FlinkPlannerImpl.scala:110) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:201) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:101) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlQuery(TableEnvironmentImpl.java:704) ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at test.PyUDFTest.main(PyUDFTest.java:22) ~[flink-python-test-1.0-SNAPSHOT.jar:?]
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_111]
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_111]
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_111]
> >         at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_111]
> >         at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at org.apache.flink.client.deployment.application.ApplicationDispatcherBootstrap.runApplicationEntryPoint(ApplicationDispatcherBootstrap.java:242) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         ... 10 more
> >
> > 2021-11-18 20:48:44,458 INFO  org.apache.flink.runtime.blob.BlobServer                     [] - Stopped BLOB server at 0.0.0.0:45070
> > 2021-11-18 20:48:44,458 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - Shutting YarnApplicationClusterEntryPoint down with application status UNKNOWN. Diagnostics Cluster entrypoint has been closed externally..
> > 2021-11-18 20:48:44,458 INFO  org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint [] - Shutting down rest endpoint.
> > 2021-11-18 20:48:44,474 INFO  org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint [] - Removing cache directory /tmp/flink-web-dd82c3c0-f457-492d-8e64-5ae74fe9abbd/flink-web-ui
> > 2021-11-18 20:48:44,474 INFO  org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint [] - http://cdh5node3:40216 lost leadership
> > 2021-11-18 20:48:44,474 INFO  org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint [] - Shut down complete.
> > 2021-11-18 20:48:44,474 INFO  org.apache.flink.runtime.entrypoint.component.DispatcherResourceManagerComponent [] - Closing components.
> > 2021-11-18 20:48:44,475 INFO  org.apache.flink.runtime.dispatcher.runner.SessionDispatcherLeaderProcess [] - Stopping SessionDispatcherLeaderProcess.
> > 2021-11-18 20:48:44,475 INFO  org.apache.flink.runtime.dispatcher.StandaloneDispatcher     [] - Stopping dispatcher akka.tcp://flink@cdh5node3:34697/user/rpc/dispatcher_1.
> > 2021-11-18 20:48:44,476 INFO  org.apache.flink.runtime.dispatcher.StandaloneDispatcher     [] - Stopping all currently running jobs of dispatcher akka.tcp://flink@cdh5node3:34697/user/rpc/dispatcher_1.
> >
> > ------------------ Original message ------------------
> > From: "user-zh" <[email protected]>;
> > Sent: Thursday, November 18, 2021, 3:34 PM
> > To: "user-zh" <[email protected]>;
> >
> > Subject: Re: Where can I download the downloads/setup-pyflink-virtual-env.sh script
> >
> > Hi! I found it in the older release below; try whether it still works:
> > https://nightlies.apache.org/flink/flink-docs-release-1.12/downloads/setup-pyflink-virtual-env.sh
> >
> > On 2021-11-18 15:05:03, "Asahi Lee" <[email protected]> wrote:
> > > Hi!
> > > I saw in the official Flink documentation that a Python virtual environment can be built with the setup-pyflink-virtual-env.sh script. Where can I download it? The documentation is here:
> > > https://nightlies.apache.org/flink/flink-docs-release-1.14/zh/docs/dev/python/faq/
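If the official setup-pyflink-virtual-env.sh script cannot be found, an environment with an equivalent layout can be built by hand. The sketch below is an assumption-laden alternative, not the official script: it uses the stdlib venv module, whereas the real script builds the environment with miniconda precisely because a plain venv is generally not relocatable across machines (its bin/python may symlink to an absolute path that does not exist on the cluster nodes).

```shell
# Hedged sketch of building a venv.zip for -pyarch by hand.
# Assumes python3, pip network access, and the zip utility are available.
python3 -m venv venv
venv/bin/pip install "apache-flink==1.14.0"   # pin to the cluster's Flink version
zip -r venv.zip venv
# Ship the archive with:  -pyarch /path/to/venv.zip
# and reference the interpreter inside it as:  venv.zip/venv/bin/python
# via -pyexec (cluster side) and -pyclientexec (client side, 1.14.0+).
```

Whether this works in a given deployment depends on the cluster nodes having a compatible base Python installation; when in doubt, the miniconda-based official script is the safer route.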
