Github user markgrover commented on the pull request:

    https://github.com/apache/spark/pull/11143#issuecomment-189383164
  
    Thanks @srowen and @koeninger for your thoughts.
    
    I did take a look at what other projects are doing. In particular, I looked 
at Storm and Flume, since they both depend on Kafka in ways similar to Spark 
Streaming.
    
    Storm has [this PR](https://github.com/apache/storm/pull/986/files) for 
bumping to Kafka 0.9, which does the same thing I am proposing in this PR - 
bumping Kafka from 0.8 to 0.9 without keeping support for 0.8.
    
    Flume is doing the same thing in 
[FLUME-2855](https://issues.apache.org/jira/browse/FLUME-2855) - moving to 
support 0.9 without keeping 0.8 support.
    
    For both Storm and Flume, this change isn't even happening in a major 
release, as it is in our case.
    
    Long story short, I want to do the right thing for Spark, and I am more 
than happy to go implement the two-subproject approach with Maven profiles - one 
subproject supporting Kafka 0.8 and the other supporting Kafka 0.9. I also 
realize that Kafka is not a simple library dependency and that there is a good 
chunk of operational overhead involved in upgrading a Kafka deployment. However, 
the way things are looking right now (and perhaps that's how the Kafka community, 
cc @jkreps @gwenshap, intended it), folks will essentially have to move to Kafka 
0.9 anyway, because one of their other Kafka apps may need it. The reason I 
raised the discussion of getting rid of Kafka 0.8 support is that we would 
otherwise have to carry the burden of supporting the Kafka 0.8.x line for the 
entire Spark 2.x line, and now is our chance to think this through and drop 
support for Kafka 0.8, if we decide to.
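
    To make the two-subproject idea concrete, here is a minimal sketch of how it 
could be wired up in the parent pom.xml. The module paths and profile ids 
(`external/kafka-0-8`, `external/kafka-0-9`, `kafka-0-8`, `kafka-0-9`) are 
hypothetical placeholders for illustration, not an actual layout:
    
    ```xml
    <!-- Hypothetical sketch: two Maven profiles, each pulling in its own Kafka subproject -->
    <profiles>
      <profile>
        <!-- Build the connector against the Kafka 0.8.x client -->
        <id>kafka-0-8</id>
        <modules>
          <module>external/kafka-0-8</module>
        </modules>
      </profile>
      <profile>
        <!-- Build the connector against the Kafka 0.9.x client; on by default -->
        <id>kafka-0-9</id>
        <activation>
          <activeByDefault>true</activeByDefault>
        </activation>
        <modules>
          <module>external/kafka-0-9</module>
        </modules>
      </profile>
    </profiles>
    ```
    
    With something like that in place, users would pick the connector matching 
their brokers, e.g. `mvn -Pkafka-0-8 package` to keep building the 0.8 
subproject.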

