I created a jira: FLINK-26858
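
For anyone following the thread, the check that produces this message looks
roughly like the sketch below (simplified and from memory, not the exact Flink
source; the findFactory name and the reworded message that echoes
execution.target are only illustrations of what the jira asks for):

    import java.util.ArrayList;
    import java.util.List;
    import java.util.ServiceLoader;

    import org.apache.flink.client.deployment.ClusterClientFactory;
    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.configuration.DeploymentOptions;

    public class FactoryLookupSketch {

        // Roughly what DefaultClusterClientServiceLoader does: discover the
        // factories on the classpath and keep the ones that declare themselves
        // compatible with the effective configuration built by CliFrontend.
        static ClusterClientFactory<?> findFactory(Configuration configuration) {
            final List<ClusterClientFactory<?>> compatibleFactories = new ArrayList<>();
            for (ClusterClientFactory<?> factory :
                    ServiceLoader.load(ClusterClientFactory.class)) {
                if (factory != null && factory.isCompatibleWith(configuration)) {
                    compatibleFactories.add(factory);
                }
            }

            if (compatibleFactories.isEmpty()) {
                // A mistyped "-t" / execution.target also ends up here, but the
                // current message only talks about HADOOP_CLASSPATH. Echoing the
                // configured target would make the failure self-explanatory
                // (illustration only, not the actual fix):
                final String target = configuration.get(DeploymentOptions.TARGET);
                throw new IllegalStateException(
                        "No ClusterClientFactory found for execution.target '"
                                + target
                                + "'. If you were targeting a Yarn cluster, please make sure to "
                                + "export the HADOOP_CLASSPATH environment variable or have "
                                + "hadoop in your classpath.");
            }
            return compatibleFactories.get(0);
        }
    }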

丛鹏 <congpeng0...@gmail.com> wrote on Fri, Mar 25, 2022, at 10:24:

> Fine, I'll create a jira issue.
>
> 胡伟华 <huweihua....@gmail.com> wrote on Thu, Mar 24, 2022, at 19:59:
>
>> Hi, congpeng
>>
>> Thanks for your report. I agree with you: we should be more explicit
>> about not finding a corresponding Factory for the specified target. Could
>> you create a jira issue for this?
>>
>>
>> > On Thu, Mar 24, 2022, at 7:05 PM, 丛鹏 <congpeng0...@gmail.com> wrote:
>> >
>> > Hi, I'm using Flink 1.12.2 and found a problem when submitting a job in
>> > yarn-application mode.
>> > The example from the Flink official website for submitting a job is
>> > ./bin/flink run-application -t yarn-application ./examples/streaming/TopSpeedWindowing.jar
>> > If the -t target is misspelled (e.g. yarn-application mistyped), it
>> > reports the following error:
>> >
>> > java.lang.IllegalStateException: No ClusterClientFactory found. If you were
>> > targeting a Yarn cluster, please make sure to export the HADOOP_CLASSPATH
>> > environment variable or have hadoop in your classpath. For more information
>> > refer to the "Deployment" section of the official Apache Flink
>> > documentation.
>> >
>> > BUT
>> >
>> > Looking at the code, the effectiveConfiguration built by CliFrontend (around
>> > line 213) is the problem: it makes the check at
>> > DefaultClusterClientServiceLoader.java:83 take the wrong branch.
>> >
>> > Eventually compatibleFactories.isEmpty() is true, and then it throws:
>> >
>> > "No ClusterClientFactory found. If you were targeting a Yarn cluster, "
>> >                            + "please make sure to export the
>> > HADOOP_CLASSPATH environment variable or have hadoop in your "
>> >                            + "classpath. For more information refer to
>> the
>> > \"Deployment\" section of the official "
>> >                            + "Apache Flink documentation."
>> >
>> >
>> > I think the wording of this error message is misleading: users will
>> > mistakenly think the problem is with their own HADOOP_CLASSPATH
>> > environment. I hope you can reply.
>>
>>