Hi Geoffrey,

Thanks for the writeup. Couple of questions:
- Is it possible to configure suites using ducktape? For example, assume all 
the tests in system_tests have been migrated to ducktape. Can I run a subset of 
all tests grouped by functional area, e.g. replication, broker failure, etc.?

- Ducktape allows us to run tests on a Vagrant cluster or on a static cluster 
configured via JSON. Once ported to ducktape, can we easily run the existing 
system tests in both flavors?
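
On the static-cluster flavor: my understanding is that ducktape's JSON-configured 
cluster is driven by a file describing the available nodes. A rough sketch (the 
exact schema is ducktape's to define; the hostnames, user, and field names here 
are purely illustrative):

```json
{
  "nodes": [
    {"hostname": "worker1", "user": "vagrant", "ssh_hostname": "192.168.50.151"},
    {"hostname": "worker2", "user": "vagrant", "ssh_hostname": "192.168.50.152"},
    {"hostname": "worker3", "user": "vagrant", "ssh_hostname": "192.168.50.153"}
  ]
}
```

If that holds, switching flavors should just be a matter of pointing the test 
runner at either the Vagrant cluster or a file like this, without touching the 
tests themselves.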

Thanks,
Aditya

________________________________________
From: Geoffrey Anderson [ge...@confluent.io]
Sent: Monday, June 08, 2015 10:56 PM
To: dev@kafka.apache.org
Subject: Re: [DISCUSS] KIP-25 System test improvements

Hi KIP-25 thread,

I consolidated some of the questions from this thread and elsewhere.

Q: Can we see a map of what system-test currently tests, which ones we want
to replace and JIRAs for replacing?
A: Initial draft here:
https://cwiki.apache.org/confluence/display/KAFKA/Roadmap+-+port+existing+system+tests

Q: Will ducktape be maintained separately as a github repo?
A: Yes https://github.com/confluentinc/ducktape

Q: How easy is it to view the test results and logs, and how will test output
be structured?
A: Hierarchical structure as outlined here:
https://github.com/confluentinc/ducktape/wiki/Design-overview#output
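For reference, the hierarchy described there boils down to something roughly
like this (directory and file names are illustrative, not exact):

```
results/<session>/
    report.txt          # top-level PASS/FAIL summary for the run
    test_log.info       # top-level info log
    test_log.debug      # top-level debug log
    <test_name>/
        test_log.info   # per-test info log
        test_log.debug  # per-test debug log
        <service_name>/
            <node>/     # logs gathered from each machine the service used
```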

Q: Does it support code coverage? If not, how easy or difficult would it be
to support?
A: It does not, and we have no immediate plans to support this. Difficulty
unclear.

Q: It would be nice if each Kafka version that we release will also
have a separate "tests" artifact that users can download, untar and easily
run against a Kafka cluster of the same version.
A: This seems reasonable and not too much extra work. Definitely open to
discussion on this.

Q: Why not share running services across multiple tests?
A: Prefer to optimize for simplicity and correctness over what might be a
questionable improvement in run-time.

Q: Are regression tests in the road map?
A: Yes.

Q: Are Jepsen-style tests involving network failures in the road map?
A: Yes.

Thanks much,
Geoff



On Mon, Jun 8, 2015 at 4:55 PM, Geoffrey Anderson <ge...@confluent.io>
wrote:

> Hi Gwen,
>
> I don't see any problem with this as long as we're convinced there's a
> good use case, which seems to be true.
>
> Cheers,
> Geoff
>
> On Thu, Jun 4, 2015 at 5:20 PM, Gwen Shapira <gshap...@cloudera.com>
> wrote:
>
>> Not completely random places :)
>> People may use Cloudera / HWX distributions which include Kafka, but want
>> to verify that these bits match a specific upstream release.
>>
>> I think having the tests separately will be useful for this. In this case,
>> finding the tests is not a big issue - we'll add a download link :)
>>
>> On Thu, Jun 4, 2015 at 5:00 PM, Jiangjie Qin <j...@linkedin.com.invalid>
>> wrote:
>>
>> > Hey Gwen,
>> >
>> > Currently the test and code are downloaded at the same time. Supposedly
>> > the tests in the same repository should match the code.
>> > Are you saying people downloaded a release from some random place and
>> > want to verify it? If that is the case, does that mean people still need
>> > to find the correct place to download the right test artifact?
>> >
>> > Thanks,
>> >
>> > Jiangjie (Becket) Qin
>> >
>> >
>> >
>> > On 6/4/15, 4:29 PM, "Gwen Shapira" <gshap...@cloudera.com> wrote:
>> >
>> > >Hi,
>> > >
>> > >Reviving the discussion a bit :)
>> > >
>> > >I think it will be nice if each Kafka version that we release will also
>> > >have a separate "tests" artifact that users can download, untar and
>> > >easily run against a Kafka cluster of the same version.
>> > >
>> > >The idea is that if someone downloads packages that claim to contain
>> > >something of a specific Kafka version (i.e. Kafka 0.8.2.0 + patches),
>> > >users can easily download the tests and verify that it indeed passes
>> > >the tests for this version and therefore behaves the way this version
>> > >is expected to behave.
>> > >
>> > >Does it make sense?
>> > >
>> > >Gwen
>> > >
>> > >On Thu, May 21, 2015 at 3:26 PM, Geoffrey Anderson <ge...@confluent.io>
>> > >wrote:
>> > >
>> > >> Hi Ashish,
>> > >>
>> > >> Looks like Ewen already hit the main points, but a few additions:
>> > >>
>> > >> 1. ducktape repo is here: https://github.com/confluentinc/ducktape
>> > >> ducktape itself will be pip installable in the near future, and Kafka
>> > >> system tests will be able to depend on a particular version of
>> > >> ducktape.
>> > >>
>> > >> 2. The reporting is nothing fancy. We're definitely open to feedback,
>> > >> but it consists of:
>> > >> - top level summary of the test run (simple PASS/FAIL for each test)
>> > >> - top level info and debug logs
>> > >> - per-test info and debug logs
>> > >> - per-test "service" logs gathered from each service used in the
>> > >> test. For example, if your test pulls up a Kafka cluster with 5
>> > >> brokers, the end result will have the Kafka logs from each of those
>> > >> 5 machines.
>> > >>
>> > >> Cheers,
>> > >> Geoff
>> > >>
>> > >> On Thu, May 21, 2015 at 3:15 PM, Ewen Cheslack-Postava
>> > >> <e...@confluent.io> wrote:
>> > >>
>> > >> > Ashish,
>> > >> >
>> > >> > 1. That was the plan. We put some effort into cleanly separating
>> > >> > the framework so it would be reusable across many projects.
>> > >> > 2. I think you're seeing a test in progress where the final report
>> > >> > hasn't been created yet. If you visit one of the older ones you'll
>> > >> > see it has a landing page with links:
>> > >> > http://testing.confluent.io/confluent_platform/2015-05-20--001/
>> > >> > Apparently we need to adjust when we update the 'latest' symlink.
>> > >> > The logs that are collected for tests are configurable, and service
>> > >> > implementations include sane defaults (so, e.g., you will always get
>> > >> > the normal log file for Kafka, but only get the data files if the
>> > >> > test asks for them).
>> > >> > 3. No code coverage support. Haven't looked into it, so I couldn't
>> > >> > comment on how hard it would be to add.
>> > >> >
>> > >> > -Ewen
>> > >> >
>> > >> > On Thu, May 21, 2015 at 2:38 PM, Ashish Singh <asi...@cloudera.com>
>> > >> > wrote:
>> > >> >
>> > >> > > Geoffrey,
>> > >> > >
>> > >> > > This looks great!
>> > >> > >
>> > >> > > A few questions.
>> > >> > > 1. Will ducktape be maintained separately as a github repo?
>> > >> > > 2. How easy is viewing the test results and logs? The link in the
>> > >> > > KIP, http://testing.confluent.io/confluent_platform/latest/, lists
>> > >> > > a bunch of files and dirs. Could you add to the KIP how the results
>> > >> > > and logs for the tests will be organized?
>> > >> > > 3. Does it support code coverage? If not, how easy or difficult
>> > >> > > would it be?
>> > >> > >
>> > >> > > On Thu, May 21, 2015 at 2:03 PM, Geoffrey Anderson
>> > >> > > <ge...@confluent.io> wrote:
>> > >> > >
>> > >> > > > Great, I'll work on putting together a more detailed map of
>> > >> > > > this replacement process.
>> > >> > > >
>> > >> > > > On Thu, May 21, 2015 at 11:13 AM, Gwen Shapira
>> > >> > > > <gshap...@cloudera.com> wrote:
>> > >> > > >
>> > >> > > > > Love this idea :)
>> > >> > > > >
>> > >> > > > > I took a look at the Ducktape API and it looks like a good fit
>> > >> > > > > - clean API, extensible, easy to use and powerful enough for
>> > >> > > > > our use-case.
>> > >> > > > >
>> > >> > > > > Something I'd like to see as part of the KIP is a map of what
>> > >> > > > > system-test currently tests, which ones we want to replace and
>> > >> > > > > a JIRA for replacing (possibly one for each group of tests).
>> > >> > > > > Basically, I know we all want to use the new system for new
>> > >> > > > > test cases (upgrades, etc.), but I really want to make sure we
>> > >> > > > > don't get stuck with both systems forever.
>> > >> > > > >
>> > >> > > > > Gwen
>> > >> > > > >
>> > >> > > > > On Thu, May 21, 2015 at 9:01 PM, Geoffrey Anderson
>> > >> > > > > <ge...@confluent.io> wrote:
>> > >> > > > >
>> > >> > > > > > Hi,
>> > >> > > > > >
>> > >> > > > > > Just kicking off the discussion thread on KIP-25
>> > >> > > > > >
>> > >> > > > > > https://cwiki.apache.org/confluence/display/KAFKA/KIP+-+25+System+test+improvements
>> > >> > > > > >
>> > >> > > > > > Thanks,
>> > >> > > > > > Geoff
>> > >> > > > > >
>> > >> > > > >
>> > >> > > >
>> > >> > >
>> > >> > >
>> > >> > >
>> > >> > > --
>> > >> > >
>> > >> > > Regards,
>> > >> > > Ashish
>> > >> > >
>> > >> >
>> > >> >
>> > >> >
>> > >> > --
>> > >> > Thanks,
>> > >> > Ewen
>> > >> >
>> > >>
>> >
>> >
>>
>
>
