Github user tzulitai commented on the issue:

    https://github.com/apache/flink/pull/2231
  
    Thank you for the description, @radekg.
    
    I think the problems you mentioned should be solvable by making the 0.9 
connector a bit more general, so that users can simply swap in the 0.10 jars 
manually. However, you also have a point about the possible confusion. IMHO, 
having two connector modules with almost the same code is redundant, and it 
also doesn't seem maintainable to keep adding modules for new Kafka versions 
that don't change the API.
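    
    Just to illustrate what I mean by "manually use the 0.10 jars": this is a 
minimal sketch of a job against the existing 0.9 connector. The job code itself 
would stay unchanged; only the `kafka-clients` dependency in the user's pom 
would be overridden to a 0.10 version. The topic name, broker address, and 
group id below are made up for the example.
    
    ```java
    import java.util.Properties;
    
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09;
    import org.apache.flink.streaming.util.serialization.SimpleStringSchema;
    
    public class Kafka010ViaConnector09 {
    
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    
            // Hypothetical broker / consumer group settings for the example
            Properties props = new Properties();
            props.setProperty("bootstrap.servers", "localhost:9092");
            props.setProperty("group.id", "flink-example-group");
    
            // The consumer API stays the same; which Kafka version is actually used
            // is determined by the kafka-clients jar on the classpath.
            DataStream<String> stream = env.addSource(
                new FlinkKafkaConsumer09<>("my-topic", new SimpleStringSchema(), props));
    
            stream.print();
            env.execute("Read from Kafka via the 0.9 connector");
        }
    }
    ```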
    
    I think we'll need to loop in @rmetzger and @aljoscha to decide on how we 
can proceed with this. The solutions I currently see are to address the above 
problems in the 0.9 connector so that it is compatible with the 0.10 API, and 
then either rename the module to `flink-connector-kafka-0.10` (which doesn't 
seem good, because it would break users' poms), or add information to the 
documentation on how to work with Kafka 0.10. Either way, in the long run, 
we'll probably still need to sort out a better way to manage the connector 
code when external systems release new versions like this.

