Hi all,

I work on Azure Event Hubs (Microsoft's PaaS offering similar to Apache Kafka) and am trying to get our new Kafka head<https://azure.microsoft.com/en-us/blog/azure-event-hubs-for-kafka-ecosystems-in-public-preview/> to play nicely with Spark's Kafka adapters. The goal is for our Kafka endpoint to be fully compatible with them, but I'm running into some issues that I suspect are version-related. I've been tinkering with the kafka-0-10-sql<https://github.com/apache/spark/tree/master/external/kafka-0-10-sql> and kafka-0-10<https://github.com/apache/spark/tree/master/external/kafka-0-10> adapters on GitHub and was wondering if someone could take a second to point me in the right direction on two questions:


  1.  What is the difference between those two adapters? My hunch is that kafka-0-10-sql supports Structured Streaming while kafka-0-10 still uses Spark Streaming (DStreams), but I haven't found anything to verify that. (I've sketched how I understand the two APIs below this list.)
  2.  Event Hubs' Kafka endpoint only supports Kafka 1.0 and later, and the errors I get when connecting from Spark ("failed to send SSL close message" / broken pipe) are the same ones we usually see when Kafka v0.10 clients hit our endpoint. I built from source after I saw that both libraries were updated for Kafka 2.0 support (late last week), but I'm still running into the same issues. Do Spark's Kafka adapters generally downgrade to the Kafka v0.10 wire protocol? If not, is there any other reason to believe that a Kafka "broker" that supports v1.0+ protocols but not v0.10 would be incompatible with Spark's Kafka adapters? (The exact connection options I'm using are sketched below.)
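
For reference on question 1, here's how I currently understand the two APIs. This is just a minimal sketch with placeholder broker/topic/group names, so please correct me if I have the modules confused:

    // kafka-0-10-sql (my hunch): Structured Streaming / DataFrame API
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder.appName("eh-kafka-test").getOrCreate()
    val df = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092") // placeholder
      .option("subscribe", "mytopic")                   // placeholder
      .load()

    // kafka-0-10 (my hunch): DStream-based Spark Streaming API
    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010._

    val ssc = new StreamingContext(spark.sparkContext, Seconds(5))
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "broker:9092",              // placeholder
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "test-group"                         // placeholder
    )
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("mytopic"), kafkaParams)
    )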
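
And for question 2, this is roughly what I'm running against our endpoint (the namespace, topic, and connection string below are placeholders; per our docs, the Kafka endpoint listens on port 9093 over SASL_SSL with the PLAIN mechanism, with the username set to the literal string "$ConnectionString" and the password set to the namespace connection string):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder.appName("eh-kafka-test").getOrCreate()

    // Placeholder Event Hubs namespace connection string
    val connStr = "Endpoint=sb://mynamespace.servicebus.windows.net/;..."

    val df = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "mynamespace.servicebus.windows.net:9093")
      .option("subscribe", "mytopic")
      .option("kafka.security.protocol", "SASL_SSL")
      .option("kafka.sasl.mechanism", "PLAIN")
      .option("kafka.sasl.jaas.config",
        "org.apache.kafka.common.security.plain.PlainLoginModule required " +
        "username=\"$ConnectionString\" password=\"" + connStr + "\";")
      .load()

This setup is what produces the SSL close / broken pipe errors described above.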

Thanks in advance; please let me know if there's a different place I should be posting this.

Sincerely,
Basil
