OK, I will post this solution to Stack Overflow. The same question is at
the following Stack Overflow link:
https://stackoverflow.com/q/53137481/7034070
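
For anyone who finds this thread via search: the exception means the Kafka
client could not find a 'KafkaClient' section in the JAAS file named by
java.security.auth.login.config. As a sketch only (reusing the keytab path
and principal quoted later in this thread; adjust for your environment), a
minimal JAAS file with such an entry would look roughly like this:

```
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/etc/kafka/kafka.keytab"
  principal="[email protected]";
};
```

The flink-conf.yaml fix quoted below sidesteps maintaining this file by
hand: as far as I understand, when security.kerberos.login.keytab and
security.kerberos.login.contexts are set, Flink generates the listed JAAS
login contexts (here Client and KafkaClient) itself for every process in
the cluster.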

On Wed, Nov 7, 2018, 18:09 Maximilian Michels <[email protected]> wrote:

> Posting solutions to the mailing list is a great way to share knowledge
> because the mailing lists are archived and indexed by your favorite
> search engine.
>
> Replicating this to Stackoverflow could help to further increase
> visibility, as the site is generally more appealing to a wider range of
> people. So why not :)
>
> On 06.11.18 18:56, Austin Bennett wrote:
> > Related to another thread:
> >
> > Is there value in posting issues that get raised here (with follow-up
> > solutions, like this thread, which indeed was excellent to have shared
> > with the list) on Stack Overflow?  Again, for ease of discoverability,
> > for those who face similar issues.  Not sure how we would formalize
> > this, but bringing it up nonetheless.
> >
> >
> >
> > On Tue, Nov 6, 2018 at 1:49 AM Maximilian Michels <[email protected]> wrote:
> >
> >     Hi Fred,
> >
> >     I see! Thanks for posting your solution here.
> >
> >     Best,
> >     Max
> >
> >     On 06.11.18 03:49, K Fred wrote:
> >      > Hi Max,
> >      >
> >      > I have resolved this issue. It was caused by the Flink cluster's
> >      > Kerberos configuration. Setting a few options in flink-conf.yaml
> >      > makes it work fine!
> >      >
> >      > The settings are below:
> >      >
> >      > security.kerberos.login.use-ticket-cache: false
> >      > security.kerberos.login.keytab: /etc/kafka/kafka.keytab
> >      > security.kerberos.login.principal: [email protected]
> >     <mailto:[email protected]>
> >      > <mailto:[email protected] <mailto:[email protected]>>
> >      > security.kerberos.login.contexts: Client,KafkaClient
> >      >
> >      >
> >      > Thanks,
> >      > Fred.
> >      >
> >      > On Tue, Nov 6, 2018 at 2:56 AM Maximilian Michels <[email protected]> wrote:
> >      >
> >      >     Hi Fred,
> >      >
> >      >     Just to double check: Are you running this from a cluster or
> >      >     your local machine? Asking because the stack trace indicates
> >      >     that the exception occurs during job submission through the
> >      >     Flink command-line client. So the machine you're running this
> >      >     on should also have the file located in /etc.
> >      >
> >      >     Thanks,
> >      >     Max
> >      >
> >      >     On 05.11.18 12:26, K Fred wrote:
> >      >      > Hi Max,
> >      >      >
> >      >      > Yes, the config is always located on the remote cluster. The
> >      >      > exception suggests that my application can find the config
> >      >      > file, but cannot find the 'KafkaClient' entry inside it. So I
> >      >      > guess the reason may be related to some Flink cluster
> >      >      > settings!
> >      >      >
> >      >      > /The relevant stack trace is below:/
> >      >      >
> >      >      > -----------------------------------------------------------------------------
> >      >      > The program finished with the following exception:
> >      >      >
> >      >      > org.apache.flink.client.program.ProgramInvocationException: The main method caused an error.
> >      >      > at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:545)
> >      >      > at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:420)
> >      >      > at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:404)
> >      >      > at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:785)
> >      >      > at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:279)
> >      >      > at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:214)
> >      >      > at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1025)
> >      >      > at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1101)
> >      >      > at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
> >      >      > at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1101)
> >      >      > Caused by: java.lang.RuntimeException: Error while translating UnboundedSource: org.apache.beam.sdk.io.kafka.KafkaUnboundedSource@7bc6d27a
> >      >      > at org.apache.beam.runners.flink.FlinkStreamingTransformTranslators$UnboundedReadSourceTranslator.translateNode(FlinkStreamingTransformTranslators.java:225)
> >      >      > at org.apache.beam.runners.flink.FlinkStreamingTransformTranslators$ReadSourceTranslator.translateNode(FlinkStreamingTransformTranslators.java:273)
> >      >      > at org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.applyStreamingTransform(FlinkStreamingPipelineTranslator.java:122)
> >      >      > at org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.visitPrimitiveTransform(FlinkStreamingPipelineTranslator.java:101)
> >      >      > at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
> >      >      > at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:649)
> >      >      > at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:649)
> >      >      > at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:311)
> >      >      > at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:245)
> >      >      > at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:458)
> >      >      > at org.apache.beam.runners.flink.FlinkPipelineTranslator.translate(FlinkPipelineTranslator.java:38)
> >      >      > at org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.translate(FlinkStreamingPipelineTranslator.java:53)
> >      >      > at org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.translate(FlinkPipelineExecutionEnvironment.java:101)
> >      >      > at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:105)
> >      >      > at org.apache.beam.sdk.Pipeline.run(Pipeline.java:313)
> >      >      > at org.apache.beam.sdk.Pipeline.run(Pipeline.java:299)
> >      >      > at ac.cn.iie.process.RelationProcess.process(RelationProcess.java:119)
> >      >      > at ac.cn.iie.Bootstrap.main(Bootstrap.java:16)
> >      >      > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >      >      > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >      >      > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >      >      > at java.lang.reflect.Method.invoke(Method.java:498)
> >      >      > at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
> >      >      > ... 9 more
> >      >      > Caused by: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
> >      >      > at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:781)
> >      >      > at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:604)
> >      >      > at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:587)
> >      >      > at org.apache.beam.sdk.io.kafka.KafkaUnboundedSource.split(KafkaUnboundedSource.java:64)
> >      >      > at org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper.<init>(UnboundedSourceWrapper.java:150)
> >      >      > at org.apache.beam.runners.flink.FlinkStreamingTransformTranslators$UnboundedReadSourceTranslator.translateNode(FlinkStreamingTransformTranslators.java:206)
> >      >      > ... 31 more
> >      >      > Caused by: java.lang.IllegalArgumentException: Could not find a 'KafkaClient' entry in the JAAS configuration. System property 'java.security.auth.login.config' is /etc/kafka/kafka_sink_jaas.conf
> >      >      > at org.apache.kafka.common.security.JaasContext.defaultContext(JaasContext.java:131)
> >      >      > at org.apache.kafka.common.security.JaasContext.load(JaasContext.java:96)
> >      >      > at org.apache.kafka.common.security.JaasContext.load(JaasContext.java:78)
> >      >      > at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:103)
> >      >      > at org.apache.kafka.common.network.ChannelBuilders.clientChannelBuilder(ChannelBuilders.java:61)
> >      >      > at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:86)
> >      >      > at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:702)
> >      >      > ... 36 more
> >      >      >
> >      >      > Thanks!
> >      >      >
> >      >      > On Mon, Nov 5, 2018 at 6:29 PM Maximilian Michels
> >      >      > <[email protected]> wrote:
> >      >      >
> >      >      >     Hi Fred,
> >      >      >
> >      >      >     This is hard to debug without further information. Maybe
> >      >      >     a stack trace would help. Are you sure the config is also
> >      >      >     located on the remote cluster?
> >      >      >
> >      >      >     Thanks,
> >      >      >     Max
> >      >      >
> >      >      >     On 03.11.18 15:45, K Fred wrote:
> >      >      >      > Hi,
> >      >      >      >
> >      >      >      > I am running into a very strange issue:
> >      >      >      > 'Could not find a "KafkaClient" entry in the JAAS
> >      >      >      > configuration. System property
> >      >      >      > "java.security.auth.login.config" is
> >      >      >      > /etc/kafka/kafka_sink_jaas.conf'
> >      >      >      > on a single-node Flink cluster when I consume records
> >      >      >      > from Kafka using Beam KafkaIO.
> >      >      >      >
> >      >      >      > The JAAS file contains 'KafkaClient', but the Flink
> >      >      >      > cluster cannot find the entry. Does anyone know the
> >      >      >      > cause?
> >      >      >      >
> >      >      >      > Thanks!
> >      >      >      > --
> >      >      >      >
> >      >      >      > Fred
> >      >      >      >
> >      >      >
> >      >      > --
> >      >      >
> >      >      > Fred
> >      >      >
> >      >
> >      > --
> >      >
> >      > Fred
> >      >
> >
>
-- 

Fred
