Re: [DISCUSS] Kafka 2.0.0 in June 2018
Yes, makes sense.

Ismael

On Mon, Nov 20, 2017 at 4:49 PM, Gwen Shapira wrote:
> Agree. I don't know that it actually matters. They can keep using whatever
> they are using now since we don't plan on breaking the protocol.
>
> But since the issue does keep coming up, I figured we'll need a clear
> message around what the removal means and what users need to do.
>
> Gwen
Re: [DISCUSS] Kafka 2.0.0 in June 2018
Agree. I don't know that it actually matters. They can keep using whatever they are using now since we don't plan on breaking the protocol.

But since the issue does keep coming up, I figured we'll need a clear message around what the removal means and what users need to do.

Gwen

On Mon, Nov 20, 2017 at 8:21 AM Ismael Juma wrote:
> It's worth emphasizing that the impact to such users is independent of
> whether we remove the old high-level consumer in 2.0.0 or not. They are
> unable to use the message format introduced in 0.11.0 or security features
> today.
>
> Ismael
Re: [DISCUSS] Kafka 2.0.0 in June 2018
It's worth emphasizing that the impact to such users is independent of whether we remove the old high-level consumer in 2.0.0 or not. They are unable to use the message format introduced in 0.11.0 or security features today.

Ismael

On Mon, Nov 20, 2017 at 4:11 PM, Gwen Shapira wrote:
> > Personally, I suspect that those who absolutely need a rolling migration
> > and cannot handle a short period of downtime while doing a migration
> > probably have in-house experts on Kafka who are familiar with the issues
> > and willing to figure out a solution. The rest of the world can generally
> > handle a short maintenance window.
>
> I really wish that was true :)
> I know at least a few companies who are stuck with a "no downtime" policy
> and not enough expertise to do this kind of migration (which is really
> non-trivial).
>
> We can say "not our problem", but as we know, a lack of a good migration
> path really slows down adoption (Python 3.0, for instance).
>
> I'd love to at least get a feel of how many in the community will be
> impacted.
>
> Gwen
Re: [DISCUSS] Kafka 2.0.0 in June 2018
> Personally, I suspect that those who absolutely need a rolling migration
> and cannot handle a short period of downtime while doing a migration
> probably have in-house experts on Kafka who are familiar with the issues
> and willing to figure out a solution. The rest of the world can generally
> handle a short maintenance window.

I really wish that was true :)
I know at least a few companies who are stuck with a "no downtime" policy and not enough expertise to do this kind of migration (which is really non-trivial).

We can say "not our problem", but as we know, a lack of a good migration path really slows down adoption (Python 3.0, for instance).

I'd love to at least get a feel of how many in the community will be impacted.

Gwen
Re: [DISCUSS] Kafka 2.0.0 in June 2018
Thanks for the update Onur. Are you and the other committers and contributors from LinkedIn planning to push this over the line?

Ismael

On Fri, Nov 10, 2017 at 9:53 PM, Onur Karaman wrote:
> Hey everyone. Regarding the status of KIP-125, just a heads up: I have an
> implementation of KIP-125 (KAFKA-4513) here:
> https://github.com/onurkaraman/kafka/commit/3b5448006ab70ba2b0b5e177853d191d0f777452
>
> The code might need to be rebased. The steps described in the KIP are a bit
> involved. Other than that, the implementation might have a bug with respect
> to converting arbitrary blacklist regexes to whitelist regexes since the
> new consumer only accepts whitelists.
Re: [DISCUSS] Kafka 2.0.0 in June 2018
Hey everyone. Regarding the status of KIP-125, just a heads up: I have an implementation of KIP-125 (KAFKA-4513) here:
https://github.com/onurkaraman/kafka/commit/3b5448006ab70ba2b0b5e177853d191d0f777452

The code might need to be rebased. The steps described in the KIP are a bit involved. Other than that, the implementation might have a bug with respect to converting arbitrary blacklist regexes to whitelist regexes since the new consumer only accepts whitelists.

On Fri, Nov 10, 2017 at 11:36 AM, Jeff Widman wrote:
> Re: migrating offsets for old Scala consumers.
>
> I work in the python world, so haven't directly used the old high-level
> consumer, but from what I understand the underlying problem remains the
> migration of ZooKeeper offsets to the __consumer_offsets topic.
>
> We've used a slightly modified version of Grant Henke's script for
> migrating offsets here: https://github.com/apache/kafka/pull/2615
> It doesn't support rolling upgrades, but other than that it's great... I've
> used it for multiple migrations, and very thankful for the time Grant put
> into it.
>
> I don't know that it's worth pulling this into core; it might be, it might
> not be. But it probably is worth documenting the procedure at least
> somewhere.
>
> Personally, I suspect that those who absolutely need a rolling migration
> and cannot handle a short period of downtime while doing a migration
> probably have in-house experts on Kafka who are familiar with the issues
> and willing to figure out a solution. The rest of the world can generally
> handle a short maintenance window.
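Onur's caveat about converting arbitrary blacklist regexes is easy to see with a small example. One common trick (an illustrative sketch with a hypothetical helper name, not necessarily what the linked commit does) is to wrap the blacklist in a negative lookahead, so the resulting whitelist matches exactly the topics the blacklist would reject:

```python
import re

def blacklist_to_whitelist(blacklist: str) -> str:
    """Turn a topic blacklist regex into a whitelist regex: match any
    topic name that does NOT fully match the blacklist."""
    return r"^(?!(?:%s)$).*$" % blacklist

# Blacklist "test-.*|tmp" becomes a whitelist of everything else.
pattern = re.compile(blacklist_to_whitelist(r"test-.*|tmp"))
print(bool(pattern.match("orders")))    # True  - not blacklisted
print(bool(pattern.match("test-foo")))  # False - blacklisted
print(bool(pattern.match("tmp")))       # False - blacklisted
```

This sketch assumes an unanchored blacklist; if the original pattern contains its own anchors or inline flags, the wrapping breaks, which is exactly the kind of "arbitrary regex" edge case Onur flags as a potential bug.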
Re: [DISCUSS] Kafka 2.0.0 in June 2018
Re: migrating offsets for old Scala consumers.

I work in the Python world, so I haven't directly used the old high-level consumer, but from what I understand the underlying problem remains the migration of ZooKeeper offsets to the __consumer_offsets topic.

We've used a slightly modified version of Grant Henke's script for migrating offsets here: https://github.com/apache/kafka/pull/2615. It doesn't support rolling upgrades, but other than that it's great. I've used it for multiple migrations, and I'm very thankful for the time Grant put into it. I don't know whether it's worth pulling this into core, but it is probably worth documenting the procedure somewhere at least.

Personally, I suspect that those who absolutely need a rolling migration and cannot handle a short period of downtime while doing a migration probably have in-house experts on Kafka who are familiar with the issues and willing to figure out a solution. The rest of the world can generally handle a short maintenance window.
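The migration the script automates is essentially a bookkeeping step: read each group's offsets from the old consumer's ZooKeeper nodes (/consumers/&lt;group&gt;/offsets/&lt;topic&gt;/&lt;partition&gt;) and commit them to __consumer_offsets through the new consumer. A rough sketch of the translation step, with the ZooKeeper read and the Kafka commit stubbed out (the helper name and data shapes are illustrative, not taken from the script):

```python
# Sketch of the offset-translation step of a ZooKeeper -> __consumer_offsets
# migration. Input mirrors the old consumer's ZooKeeper layout:
# /consumers/<group>/offsets/<topic>/<partition> -> offset (as bytes).
# The actual ZooKeeper read and the KafkaConsumer.commit() call are omitted.

def offsets_from_zk_nodes(nodes):
    """Turn ZK offset nodes into a {(topic, partition): offset} mapping
    suitable for committing with the new consumer."""
    offsets = {}
    for path, value in nodes.items():
        parts = path.strip("/").split("/")
        # Only offset nodes match consumers/<group>/offsets/<topic>/<partition>;
        # skip the group's other nodes (ids, owners, ...).
        if len(parts) != 5 or parts[0] != "consumers" or parts[2] != "offsets":
            continue
        topic, partition = parts[3], int(parts[4])
        offsets[(topic, partition)] = int(value.decode("ascii"))
    return offsets
```

The resulting mapping would be committed once per group while the old consumers are stopped, which is why the procedure is a maintenance window rather than a rolling upgrade.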
Re: [DISCUSS] Kafka 2.0.0 in June 2018
Hi Gwen,

A KIP has been proposed, but it is stalled:
https://cwiki.apache.org/confluence/display/KAFKA/KIP-125%3A+ZookeeperConsumerConnector+to+KafkaConsumer+Migration+and+Rollback

Unless the interested parties pick that up, we would drop support without a rolling upgrade path. Users would be able to use the old consumers from 1.1.x for a long time. The old Scala clients don't support the message format introduced in 0.11.0, so the feature set is pretty much frozen and there's little benefit in upgrading. But there is a cost in keeping them in the codebase.

Ismael
Re: [DISCUSS] Kafka 2.0.0 in June 2018
Last time we tried deprecating the Scala consumer, there were concerns about a lack of upgrade path. There is no rolling upgrade, and migrating offsets is not trivial (and not documented).

Did anything change in that regard? Or are we planning on dropping support without an upgrade path?
Re: [DISCUSS] Kafka 2.0.0 in June 2018
Thanks Ismael, the proposal looks good to me.

A side note regarding https://issues.apache.org/jira/browse/KAFKA-5637: could we resolve this ticket sooner rather than later, so that the code deprecation and support duration are clear when moving from 1.0.x to 2.0.x?

Guozhang
Re: [DISCUSS] Kafka 2.0.0 in June 2018
Features for 2.0.0 will be known after 1.1.0 is released in February 2018. We are still doing the usual time-based release process[1].

I am raising this well ahead of time because of the potential impact of removing the old Scala clients (particularly the old high-level consumer) and dropping support for Java 7. Hopefully users can then plan accordingly. We would do these changes in trunk soon after 1.1.0 is released (around February).

I think it makes sense to complete some of the work that was not ready in time for 1.0.0 (controller improvements and JBOD are two that come to mind) in 1.1.0 (January 2018), and combined with the desire to give advance notice, June 2018 was the logical choice.

There is no plan to support a particular release for longer. 1.x versus 2.x is no different from 0.10.x versus 0.11.x from the perspective of supporting older releases.

[1] https://cwiki.apache.org/confluence/display/KAFKA/Time+Based+Release+Plan
Re: [DISCUSS] Kafka 2.0.0 in June 2018
Hi Ismael,

Are there any new features other than the language-specific changes that are being planned for 2.0.0? Also, when 2.x gets released, will the 1.x series see continued bug fixes and releases in the community, or is the plan to have one single main version that gets continuous updates and releases?

By the way, why June 2018? :)

-Jaikiran

On 09/11/17 3:14 PM, Ismael Juma wrote:
> Hi all,
>
> I'm starting this discussion early because of the potential impact.
>
> Kafka 1.0.0 was just released and the focus was on achieving the original project vision in terms of features provided while maintaining compatibility for the most part (i.e. we did not remove deprecated components like the Scala clients).
>
> This was the right decision, in my opinion, but it's time to start thinking about 2.0.0, which is an opportunity for us to remove major deprecated components and to benefit from Java 8 language enhancements (so that we can move faster). So, I propose the following for Kafka 2.0.0:
>
> 1. It should be released in June 2018
> 2. The Scala clients (Consumer, SimpleConsumer, Producer, SyncProducer) will be removed
> 3. Java 8 or higher will be required, i.e. support for Java 7 will be dropped.
>
> Thoughts?
>
> Ismael
Re: [DISCUSS] Kafka 2.0.0 in June 2018
Hi Apurva,

I agree about KIP-185 (assuming the vote passes). To clarify, my list was not meant to be exhaustive, just the items with the highest compatibility impact justifying the major bump. I expect we will have many other great KIPs. :)

Ismael
Re: [DISCUSS] Kafka 2.0.0 in June 2018
I think this is a good idea and your proposed changes look good.

I also think that this might be a good time to adopt KIP-185 (https://cwiki.apache.org/confluence/display/KAFKA/KIP-185%3A+Make+exactly+once+in+order+delivery+per+partition+the+default+producer+setting) and make the idempotent producer the default mode. We missed the window on 1.0.0 because we hadn't completed the necessary work, but we should have it done by June 2018.

Thanks,
Apurva
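For context on what KIP-185 flips: idempotence is opt-in today via producer configuration. A sketch of the settings involved, using the Java client's property names (the accompanying values are my reading of what the idempotent mode requires, not a quote from the KIP):

```python
# Producer properties involved in KIP-185 (Java client property names).
# enable.idempotence requires acks=all and a non-zero retries value, which
# is why changing the default is a compatibility-sensitive decision.
idempotent_producer_config = {
    "enable.idempotence": "true",  # de-duplicate retried sends per partition
    "acks": "all",                 # required when idempotence is enabled
    "retries": "2147483647",       # retry transient errors instead of failing
}

# For comparison, the pre-KIP defaults were idempotence off, acks=1, retries=0.
```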
Re: [DISCUSS] Kafka 2.0.0 in June 2018
That's correct, Tom. We can only remove deprecated APIs in major releases since it's a breaking change.

Ismael
Re: [DISCUSS] Kafka 2.0.0 in June 2018
Hi Stephane,

I think the version number rules are based on semantic versioning, so Kafka can't remove even deprecated APIs in a minor release (it is a breaking change, after all). Therefore, until Kafka 2.0 we will have to carry the weight of the deprecated APIs, and of Java 7.

Cheers,

Tom
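Tom's semantic-versioning point can be stated mechanically: an API removal is a breaking change, so it is only permitted once the major version has been bumped past the release that deprecated the API. A toy check (the function is illustrative, not part of Kafka's tooling):

```python
# Toy illustration of the semver rule Tom describes: deprecated APIs may
# only be removed across a major version bump, never in a minor release.

def removal_allowed(deprecated_in, removed_in):
    """Both arguments are 'major.minor.patch' version strings."""
    dep_major = int(deprecated_in.split(".")[0])
    rem_major = int(removed_in.split(".")[0])
    return rem_major > dep_major

# The Scala clients were deprecated in the 0.x line; 1.0.0 could have
# dropped them but chose not to, and 2.0.0 is the next opportunity.
```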
Re: [DISCUSS] Kafka 2.0.0 in June 2018
I'm very happy with the milestones, but worried about the version number. It seems 2.0.0 will mostly remove things that are already deprecated rather than actually bring in breaking features. A 2.0 to me should bring something major to the table, possibly breaking, which would justify a big number hop. I'm still new to software development in the OSS world, but that's my two cents.