Hi Darshan,

Did you try passing the config directly as an option, like this:

.option("kafka.sasl.jaas.config", saslConfig)


Where saslConfig can look like:

com.sun.security.auth.module.Krb5LoginModule required \
        useKeyTab=true \
        storeKey=true  \
        keyTab="/etc/security/keytabs/kafka_client.keytab" \
        principal="kafka-clien...@example.com";

Reference: http://kafka.apache.org/documentation.html#security_kerberos_sasl_clientconfig
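
Wired into the stream definition you quoted below, it would look roughly like this
(a sketch, not tested here; the two extra kafka.* security options are assumptions
that depend on your broker setup, and the sasl.jaas.config property needs a
kafka-clients version that supports it):

val lines = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", jobParams.boorstrapServer)
  .option("subscribe", jobParams.sourceTopic)
  .option("kafka.security.protocol", "SASL_PLAINTEXT")      // or SASL_SSL
  .option("kafka.sasl.kerberos.service.name", "kafka")
  .option("kafka.sasl.jaas.config", saslConfig)
  .option("startingOffsets", "latest")
  .option("failOnDataLoss", "true")
  .load()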

Thanks,
Prashant.

On Tue, Oct 17, 2017 at 11:21 AM, Darshan Pandya <darshanpan...@gmail.com>
wrote:

> Hi Burak,
>
> Well, it turns out it worked fine when I submit in cluster mode. I also tried
> to convert my app to DStreams. With DStreams too it works well, but only when
> deployed in cluster mode.
>
> Here is how I configured the stream.
>
>
> val lines = spark.readStream
>   .format("kafka")
>   .option("kafka.bootstrap.servers", jobParams.boorstrapServer)
>   .option("subscribe", jobParams.sourceTopic)
>   .option("startingOffsets", "latest")
>   .option("minPartitions", "10")
>   .option("failOnDataLoss", "true")
>   .load()
>
>
>
> Sincerely,
> Darshan
>
> On Mon, Oct 16, 2017 at 12:08 PM, Burak Yavuz <brk...@gmail.com> wrote:
>
>> Hi Darshan,
>>
>> How are you creating your kafka stream? Can you please share the options
>> you provide?
>>
>> spark.readStream.format("kafka")
>>   .option(...) // all these please
>>   .load()
>>
>>
>> On Sat, Oct 14, 2017 at 1:55 AM, Darshan Pandya <darshanpan...@gmail.com>
>> wrote:
>>
>>> Hello,
>>>
>>> I'm using Spark 2.1.0 on CDH 5.8 with kafka 0.10.0.1 + kerberos
>>>
>>> I am unable to connect to the Kafka broker, with the following message:
>>>
>>>
>>> 17/10/14 14:29:10 WARN clients.NetworkClient: Bootstrap broker
>>> 10.197.19.25:9092 disconnected
>>>
>>> and it is unable to consume any messages.
>>>
>>> I am using it as follows:
>>>
>>> jaas.conf
>>>
>>> KafkaClient {
>>>   com.sun.security.auth.module.Krb5LoginModule required
>>>   useKeyTab=true
>>>   keyTab="./gandalf.keytab"
>>>   storeKey=true
>>>   useTicketCache=false
>>>   serviceName="kafka"
>>>   principal="gand...@domain.com";
>>> };
>>>
>>> $SPARK_HOME/bin/spark-submit \
>>> --master yarn \
>>> --files jaas.conf,gandalf.keytab \
>>> --driver-java-options "-Djava.security.auth.login.config=./jaas.conf -Dhdp.version=2.4.2.0-258" \
>>> --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./jaas.conf" \
>>> --class com.example.ClassName uber-jar-with-deps-and-hive-site.jar
>>>
>>> Thanks in advance.
>>>
>>>
>>> --
>>> Sincerely,
>>> Darshan
>>>
>>>
>>
>
>
> --
> Sincerely,
> Darshan
>
>
