I have absolutely no problem with the time taken for the tests. My only issue is that they intermittently fail. Because of that, I am now suspicious of any results I get.
Is it really an error, or is it a timing issue or a race condition? Suspicious tests are next to useless. (A rough sketch of the kind of fix I have in mind follows the quoted output below.)

On 19 Mar 2010, at 11:46, Robert Dionne wrote:

> I see similar issues, though never with 100-ref-counter. It looks like a
> race condition, but it should be checked, because the place where it's used,
> couch_db:is_idle, depends on that value being right.
>
> make check is much faster than make cover.
>
> I think it's OK for tests to take a long time to run, and I suspect most
> users are used to it. It's a measure of how solid the code is. Perhaps there
> could be two levels of testing: one that is quick and superficial and
> sufficient to verify the build, so you can run it repeatedly in reasonable
> time, and the other for users at install time, including long-running
> performance tests, tests that run a server, and so forth. At build time
> you'd only need to run this once, at the end.
>
> On Mar 19, 2010, at 7:13 AM, Noah Slater wrote:
>
>> Some of the test suites rely on timing delays, and these are unpredictable,
>> resulting in non-deterministic test failures. The full tag/build/test cycle
>> is long enough as it is - but having to start again from scratch when the
>> second, and final, run of the test suite fails adds a significant amount of
>> friction for me. I would like to ask that this issue is addressed as soon
>> as possible. It is entirely my fault that this release has been delayed as
>> much as it has, but my job would be made significantly easier if the test
>> suite behaved consistently.
>>
>> I got the error included below this morning, and when I ran it again, there
>> was no error. I am going to ignore this for now and just call a vote on the
>> release. But doing so is risky. I have no idea why this failed once, and as
>> release manager, it is my duty to understand the bugs we're shipping with.
>> I don't like being in a position where I am ignoring them for convenience.
>> They exist as warning beacons, primarily for me, and when I start having to
>> ignore them, they have utterly failed to do their job.
>>
>> Apologies if this email sounds frustrated. I am frustrated.
>>
>> I'm not finger-pointing, just trying to illustrate why I believe this
>> problem should be addressed as soon as possible.
>>
>> ./test/etap/run
>> /tmp/couchdb/0.11.0/test/etap/001-load........................ok
>> /tmp/couchdb/0.11.0/test/etap/002-icu-driver..................ok
>> /tmp/couchdb/0.11.0/test/etap/010-file-basics.................ok
>> /tmp/couchdb/0.11.0/test/etap/011-file-headers................ok
>> /tmp/couchdb/0.11.0/test/etap/020-btree-basics................ok
>> /tmp/couchdb/0.11.0/test/etap/021-btree-reductions............ok
>> /tmp/couchdb/0.11.0/test/etap/030-doc-from-json...............ok
>> /tmp/couchdb/0.11.0/test/etap/031-doc-to-json.................ok
>> /tmp/couchdb/0.11.0/test/etap/040-util........................ok
>> /tmp/couchdb/0.11.0/test/etap/041-uuid-gen....................ok
>> /tmp/couchdb/0.11.0/test/etap/050-stream......................ok
>> /tmp/couchdb/0.11.0/test/etap/060-kt-merging..................ok
>> /tmp/couchdb/0.11.0/test/etap/061-kt-missing-leaves...........ok
>> /tmp/couchdb/0.11.0/test/etap/062-kt-remove-leaves............ok
>> /tmp/couchdb/0.11.0/test/etap/063-kt-get-leaves...............ok
>> /tmp/couchdb/0.11.0/test/etap/064-kt-counting.................ok
>> /tmp/couchdb/0.11.0/test/etap/065-kt-stemming.................ok
>> /tmp/couchdb/0.11.0/test/etap/070-couch-db....................ok
>> /tmp/couchdb/0.11.0/test/etap/080-config-get-set..............ok
>> /tmp/couchdb/0.11.0/test/etap/081-config-override.............ok
>> /tmp/couchdb/0.11.0/test/etap/082-config-register.............ok
>> /tmp/couchdb/0.11.0/test/etap/083-config-no-files.............ok
>> /tmp/couchdb/0.11.0/test/etap/090-task-status.................ok
>> /tmp/couchdb/0.11.0/test/etap/100-ref-counter.................FAILED test 8
>> Failed 1/8 tests, 87.50% okay
>> /tmp/couchdb/0.11.0/test/etap/110-replication-httpc...........ok
>> /tmp/couchdb/0.11.0/test/etap/111-replication-changes-feed....ok
>> /tmp/couchdb/0.11.0/test/etap/112-replication-missing-revs....ok
>> /tmp/couchdb/0.11.0/test/etap/120-stats-collect...............ok
>> /tmp/couchdb/0.11.0/test/etap/121-stats-aggregates............ok
>> /tmp/couchdb/0.11.0/test/etap/130-attachments-md5.............ok
>> /tmp/couchdb/0.11.0/test/etap/140-attachment-comp.............ok
>> /tmp/couchdb/0.11.0/test/etap/150-invalid-view-seq............ok
>> /tmp/couchdb/0.11.0/test/etap/160-vhosts......................ok
>> Failed Test                               Stat Wstat Total Fail  List of Failed
>> -------------------------------------------------------------------------------
>> /tmp/couchdb/0.11.0/test/etap/100-ref-cou                    8    1  8
>> Failed 1/33 test scripts. 1/456 subtests failed.
>> Files=33, Tests=456, 74 wallclock secs (44.01 cusr + 4.38 csys = 48.39 CPU)
>> Failed 1/33 test programs. 1/456 subtests failed.
>> make: *** [check] Error 255
>>
>
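For illustration only, here is a minimal sketch of the kind of change I mean: poll for the expected state with a bounded timeout instead of relying on a fixed delay. The test_util_wait module and wait_until helper are hypothetical, and the couch_db:is_idle/1 arity and boolean return are assumptions; this is not the actual 100-ref-counter test code.

    %% Hypothetical helper: repeatedly evaluate a zero-arity predicate until it
    %% returns true or the timeout (in milliseconds) is exhausted, instead of
    %% sleeping once for a fixed interval and hoping the state has settled.
    -module(test_util_wait).
    -export([wait_until/2]).

    wait_until(_Fun, Timeout) when Timeout =< 0 ->
        timeout;
    wait_until(Fun, Timeout) ->
        case Fun() of
            true  -> ok;
            false ->
                timer:sleep(50),
                wait_until(Fun, Timeout - 50)
        end.

    %% Example use in an etap test, assuming Db is a database handle and that
    %% couch_db:is_idle(Db) returns a boolean:
    %%
    %%   ok = test_util_wait:wait_until(fun() -> couch_db:is_idle(Db) end, 5000),
    %%   etap:is(couch_db:is_idle(Db), true, "db is idle once refs are released").

The point is simply that the test then fails only when the state never settles within the timeout, rather than whenever the scheduler happens to be slow.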
