Hi James,
I followed the same sequence of steps you described.
When the Kafka processor starts, I see this:
2018-09-05 22:44:02,999 INFO [Timer-Driven Process Thread-1]
o.a.k.clients.producer.ProducerConfig ProducerConfig values:
acks = 0
batch.size = 16384
bootstrap.servers = [xyz.servicebus.windows.net:9093]
buffer.memory = 33554432
client.id =
compression.type = none
connections.max.idle.ms = 540000
enable.idempotence = false
interceptor.classes = null
key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
linger.ms = 0
max.block.ms = 5000
max.in.flight.requests.per.connection = 5
max.request.size = 1048576
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 0
retry.backoff.ms = 100
sasl.jaas.config = [hidden]
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = kafka
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.mechanism = GSSAPI
security.protocol = SASL_SSL
send.buffer.bytes = 131072
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = /home/joaohf/bin/nifi-1.7.1/cacerts
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
The Azure documentation says:
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
But NiFi shows:
security.protocol = SASL_SSL
sasl.mechanism = GSSAPI
I'm using NiFi 1.7.1.
How could I set sasl.mechanism to PLAIN while using SASL_SSL? Maybe
this is a specific combination that the Kafka processor doesn't support yet?
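For reference, this is my understanding of the setup from the second link.
The file path and the java.arg number below are just my own placeholders;
the connection string is the one from the Azure portal. The JAAS file would
look like:

KafkaClient {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="$ConnectionString"
    password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
};

and then a java.arg line in conf/bootstrap.conf would point the JVM at it,
something like:

java.arg.15=-Djava.security.auth.login.config=/path/to/eventhubs-jaas.conf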
Thanks.
On Wed, Sep 5, 2018 at 5:59 PM James Srinivasan <[email protected]>
wrote:
> I've not tried this myself, but once you have a working JAAS config
> (from
> https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-quickstart-kafka-enabled-event-hubs#send-and-receive-messages-with-kafka-in-event-hubs
> ),
> set the corresponding protocol and mechanism properties in the NiFi
> processor, and put the content of sasl.jaas.config in a file and
> reference it from NiFi's bootstrap.conf as indicated by the second
> link you found.
>
> Good luck!
> On Wed, 5 Sep 2018 at 15:54, João Henrique Freitas <[email protected]>
> wrote:
> >
> >
> > Hello!
> >
> > I'm exploring Azure Event Hubs with Kafka support. I know that it's in
> > preview.
> >
> > But I would like to know how to use PublishKafka with this configuration:
> >
> >
> > https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-quickstart-kafka-enabled-event-hubs
> >
> > I don't know how to configure the Kafka processor with authentication
> > parameters like:
> >
> > security.protocol=SASL_SSL
> > sasl.mechanism=PLAIN
> > sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
> >
> >
> > Should I follow this?
> >
> >
> > https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-kafka-1-0-nar/1.7.1/org.apache.nifi.processors.kafka.pubsub.PublishKafka_1_0/additionalDetails.html
> >
> > Or is it maybe necessary to patch the Publish/ConsumeKafka processors?
> >
> > Best regards,
> >
> > --
> > João Henrique Ferreira de Freitas - joaohf_at_gmail.com
> > Campinas-SP-Brasil
>
--
João Henrique Ferreira de Freitas - joaohf_at_gmail.com
Campinas-SP-Brasil