Any inputs?

On Tue, Aug 11, 2020 at 10:34 AM Vijayendra Yadav <[email protected]>
wrote:

> Dawid, I was able to resolve the keytab issue by passing the service name,
> but now I am facing a KRB5 issue.
>
> Caused by: org.apache.kafka.common.errors.SaslAuthenticationException:
> Failed to create SaslClient with mechanism GSSAPI
> Caused by: javax.security.sasl.SaslException: Failure to initialize
> security context [Caused by GSSException: Invalid name provided (Mechanism
> level: KrbException: Cannot locate default realm)]
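That GSSException usually means the JVM running the Kafka client never loaded a krb5.conf containing a default_realm. A tiny diagnostic like the following (a hypothetical helper, not part of the thread; the class name is made up) can be run with the same JVM options to confirm which file, if any, the `java.security.krb5.conf` property points at:

```java
// Hypothetical diagnostic: report which Kerberos config the JVM will use.
// If java.security.krb5.conf is unset, the JVM falls back to OS defaults
// (e.g. /etc/krb5.conf on Linux) before failing with
// "Cannot locate default realm".
public class Krb5Check {
    public static void main(String[] args) {
        String conf = System.getProperty("java.security.krb5.conf", "(not set)");
        System.out.println("java.security.krb5.conf = " + conf);
    }
}
```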
>
> I passed the krb5.conf path from the flink-conf.yaml file like this:
>
> env.java.opts.jobmanager: -Djava.security.krb5.conf=/path/krb5.conf
> env.java.opts.taskmanager: -Djava.security.krb5.conf=/path/krb5.conf
>
> How can I resolve this? Is there another way to pass KRB5?
>
> I also tried passing it via option #1, using the flink run command's -D parameter.
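Besides the per-role keys above, Flink also exposes a single `env.java.opts` key that applies to all Flink JVM processes at once; a flink-conf.yaml sketch (the path is a placeholder) would be:

```yaml
# flink-conf.yaml sketch: pass the krb5.conf location to every Flink JVM
# (JobManager, TaskManagers, and client) in one key.
# /path/krb5.conf is a placeholder; it must point at a file that exists
# on the cluster nodes, not only on the submitting host.
env.java.opts: -Djava.security.krb5.conf=/path/krb5.conf
```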
>
> Regards,
> Vijay
>
>
> On Tue, Aug 11, 2020 at 1:26 AM Dawid Wysakowicz <[email protected]>
> wrote:
>
>> Hi,
>>
>> As far as I know, approach 2) is the supported way of setting up
>> Kerberos authentication in Flink. In the second approach, have you tried
>> setting `sasl.kerberos.service.name` in the configuration of your
>> KafkaConsumer/Producer [1]? I think this might be the issue.
>>
>> Best,
>>
>> Dawid
>>
>> [1]
>> https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kafka.html#enabling-kerberos-authentication
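For reference, wiring that property into the consumer configuration might look like the sketch below (the broker address and group id are placeholders, and the service name "kafka" is an assumption: it must match the primary of the broker's Kerberos principal, e.g. kafka/[email protected] implies "kafka"):

```java
import java.util.Properties;

// Sketch of Kafka consumer properties for a Kerberos-secured, SSL-enabled
// cluster. Broker address and group id are placeholders.
public class KafkaKerberosProps {
    public static Properties consumerProps() {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "broker1.example.com:9093");
        props.setProperty("group.id", "flink-consumer");
        props.setProperty("security.protocol", "SASL_SSL");
        props.setProperty("sasl.mechanism", "GSSAPI");
        // The setting Dawid refers to -- without it Kafka raises
        // "No serviceName defined in either JAAS or Kafka config".
        props.setProperty("sasl.kerberos.service.name", "kafka");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(consumerProps().getProperty("sasl.kerberos.service.name"));
    }
}
```

The same Properties object is then handed to the FlinkKafkaConsumer constructor, as shown in the linked documentation [1].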
>>
>>
>> On 09/08/2020 20:39, Vijayendra Yadav wrote:
>>
>> Hi Team,
>>
>> I am trying to stream data from a Kafka consumer using:
>> https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kafka.html
>>
>> My Kafka cluster is Kerberos-secured and SSL-enabled.
>>
>> I am running my Flink streaming in yarn-cluster on EMR 5.31.
>>
>> I have tried to pass the keytab/principal in the following *two ways*:
>>
>> 1) Passing them as JVM properties on the flink run command.
>>
>> /usr/lib/flink/bin/flink run \
>>   -yt ${app_install_path}/conf/ \
>>   -Dsecurity.kerberos.login.use-ticket-cache=false \
>>   -yDsecurity.kerberos.login.use-ticket-cache=false \
>>   -Dsecurity.kerberos.login.keytab=${app_install_path}/conf/keytab \
>>   -yDsecurity.kerberos.login.keytab=${app_install_path}/conf/.keytab \
>>   -Djava.security.krb5.conf=${app_install_path}/conf/krb5.conf \
>>   -yDjava.security.krb5.conf=${app_install_path}/conf/krb5.conf \
>>   [email protected] \
>>   [email protected] \
>>   -Dsecurity.kerberos.login.contexts=Client,KafkaClient \
>>   -yDsecurity.kerberos.login.contexts=Client,KafkaClient
>>
>>
>> *Here I get the following error; it seems the keytab was not shipped to
>> the run environment and could not be found:*
>>
>>
>> *org.apache.kafka.common.KafkaException: Failed to construct kafka
>> consumer Caused by: java.lang.IllegalArgumentException: Could not find a
>> 'KafkaClient' entry in the JAAS configuration. System property
>> 'java.security.auth.login.config'*
>>
>> 2) Passing from the Flink config: */usr/lib/flink/conf/flink-conf.yaml*
>>
>> security.kerberos.login.use-ticket-cache: false
>> security.kerberos.login.keytab: ${app_install_path}/conf/keytab
>> security.kerberos.login.principal: [email protected]
>> security.kerberos.login.contexts: Client,KafkaClient
>>
>> *Here I get the following error:*
>>
>>
>> *org.apache.kafka.common.KafkaException: Failed to construct kafka
>> consumer Caused by: org.apache.kafka.common.KafkaException:
>> java.lang.IllegalArgumentException: No serviceName defined in either JAAS
>> or Kafka config*
>>
>>
>> Could you please help identify the probable causes and a resolution?
>>
>> Regards,
>> Vijay
>>
>>
