BadApple report

2020-12-21 Thread Erick Erickson
Still noisy, waiting for the reference impl to untangle. Short form: Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0 had 136 failures Week: 1 had 185 failures Week: 2 had 210 failures Week: 3 had 112 failures Failures in Hoss' reports in
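The "corresponds to bits" phrasing refers to the report's 4-week history: each of the four most recent weekly rollup CSVs gets a history bit (0 = newest), and a test's report line shows which of those weeks it failed in. The actual reporting script isn't shown in these threads; below is a minimal Java sketch of the idea. The HOSS-YYYY-MM-DD.csv names follow the pattern quoted later in the thread, but the assumed "test,failCount" column layout is a guess, not Hoss' real rollup format.

```java
// Minimal sketch (not the real reporting script) of the 4-week aggregation:
// each weekly rollup CSV gets a history bit (0 = newest), per-week raw fail
// counts are summed, and each test gets a "0123"-style string showing which
// weeks it failed in.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Map;
import java.util.TreeMap;

public class WeeklyHistory {
  public static void main(String[] args) throws IOException {
    String[] rollups = {            // history bit 0 = most recent rollup
        "HOSS-2020-12-21.csv", "HOSS-2020-12-14.csv",
        "HOSS-2020-12-07.csv", "HOSS-2020-11-30.csv"};
    Map<String, boolean[]> history = new TreeMap<>();
    for (int bit = 0; bit < rollups.length; bit++) {
      int weekTotal = 0;            // raw fail count for this week's rollup
      for (String line : Files.readAllLines(Paths.get(rollups[bit]))) {
        String[] cols = line.split(",");
        if (cols.length < 2 || !cols[1].trim().matches("\\d+")) continue; // skip header/garbage
        int fails = Integer.parseInt(cols[1].trim());
        if (fails > 0) {
          weekTotal += fails;
          history.computeIfAbsent(cols[0].trim(), t -> new boolean[4])[bit] = true;
        }
      }
      System.out.printf("Week: %d had %d failures%n", bit, weekTotal);
    }
    // "0123"-style history: a digit is printed for each week the test failed in.
    for (Map.Entry<String, boolean[]> e : history.entrySet()) {
      StringBuilder bits = new StringBuilder();
      for (int b = 0; b < 4; b++) {
        bits.append(e.getValue()[b] ? Character.forDigit(b, 10) : ' ');
      }
      System.out.println(bits + "  " + e.getKey());
    }
  }
}
```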

BadApple report

2020-11-23 Thread Erick Erickson
Unfortunately, the reference impl is creating quite a bit of noise in Hoss’ rollups. That said, I have a mail filter for test failures that puts the reference impl tests in a different mail folder and my sense is that the regular branch is getting an increasing number of failures. If I have

BadApple report

2020-11-09 Thread Erick Erickson
Still seeing quite a bit of noise due to the reference impl. That said, we do have a reproducible error for TestRandomDVFaceting both 8x and master, see SOLR-14990. Meanwhile, here’s the report for this week. Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0

BadApple report

2020-11-02 Thread Erick Erickson
Not much change this week, still getting considerable noise from the reference impl. Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0 had 110 failures Week: 1 had 150 failures Week: 2 had 174 failures Week: 3 had 142 failures Failures in

BadApple report

2020-10-26 Thread Erick Erickson
Still working through the failures on the reference impl, so AFAIK, the tests failing large percentages of the time are on that branch. Processing file (History bit 3): HOSS-2020-10-26.csv Processing file (History bit 2): HOSS-2020-10-19.csv Processing file (History bit 1): HOSS-2020-10-12.csv

BadApple report

2020-10-19 Thread Erick Erickson
The BadApple report remains skewed as the results include the reference impl so this is mostly in case people are curious…. I expect next week to see an uptick in the number of tests that have failed each of the last 4 weeks, that’ll be when the reference-impl parts of the report kick

BadApple report

2020-10-12 Thread Erick Erickson
Mostly for historical context for a while. It includes the reference impl so the stats will be skewed from now until we integrate it all. Short form: Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0 had 142 failures Week: 1 had 153 failures Week: 2 had

RE: BadApple report

2020-08-25 Thread Uwe Schindler
be the same. Uwe - Uwe Schindler Achterdiek 19, D-28357 Bremen https://www.thetaphi.de eMail: u...@thetaphi.de > -Original Message- > From: Erick Erickson > Sent: Monday, August 24, 2020 3:59 PM > To: dev@lucene.apache.org > Subject: BadApple report > > We have some pre

BadApple report

2020-08-24 Thread Erick Erickson
We have some pretty frequent failures, see: http://fucit.org/solr-jenkins-reports/failure-report.html I’m pretty sure LBSolrClientTest has been addressed. I’m looking at what commit caused TestConfigOverlay to start failing… This can be a little hard to interpret since it includes tests that

BadApple report

2020-08-17 Thread Erick Erickson
Failures in Hoss' reports for the last 4 rollups. There were 242 unannotated tests that failed in Hoss' rollups. Ordered by the date I downloaded the rollup file, newest->oldest. See above for the dates the files were collected These tests were NOT BadApple'd or AwaitsFix'd Failures
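"Unannotated" here means tests that failed in the rollups but carry neither @BadApple nor @AwaitsFix in the test sources. A hedged sketch of that cross-check is below; the failing-tests.txt input and the source root are placeholders, and the real report is generated differently.

```java
// Hedged sketch: given a list of failing test class names (however they were
// pulled out of the rollups), print the ones whose source file carries neither
// @BadApple nor @AwaitsFix. Matching on the raw annotation text is deliberately
// coarse (it doesn't distinguish class-level from method-level annotations).
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Set;
import java.util.TreeSet;
import java.util.stream.Stream;

public class UnannotatedFailures {
  public static void main(String[] args) throws IOException {
    Set<String> failing = new TreeSet<>(
        Files.readAllLines(Paths.get("failing-tests.txt"))); // one class name per line
    Set<String> annotated = new TreeSet<>();
    try (Stream<Path> paths = Files.walk(Paths.get("solr/core/src/test"))) {
      paths.filter(p -> p.toString().endsWith(".java")).forEach(p -> {
        try {
          String src = Files.readString(p);
          if (src.contains("@BadApple") || src.contains("@AwaitsFix")) {
            annotated.add(p.getFileName().toString().replace(".java", ""));
          }
        } catch (IOException e) {
          throw new UncheckedIOException(e);
        }
      });
    }
    failing.removeAll(annotated);
    System.out.println(failing.size() + " unannotated tests failed in the rollups:");
    failing.forEach(System.out::println);
  }
}
```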

Re: BadApple report, but please read the first bit

2020-08-13 Thread David Smiley
change to HDFS stuff. Starting June/July failing regularly. > > Kevin Risden > > > > On Wed, Aug 12, 2020 at 9:03 AM Erick Erickson > wrote: > >> I have the weekly rollups (with a few gaps) going back to about April >> 2018, but nothing’s been done to try to mak

Re: BadApple report, but please read the first bit

2020-08-12 Thread Erick Erickson
Didn’t think at first (only one cup of coffee). Here are the emails that test appears in; the formatting is poor… After that is the raw data from Hoss’ rollups that might be easier to ingest. I have 1.3G of this kind of historical data, I’ve had vague thoughts about putting it someplace

Re: BadApple report, but please read the first bit

2020-08-12 Thread Kevin Risden
Risden On Wed, Aug 12, 2020 at 9:03 AM Erick Erickson wrote: > I have the weekly rollups (with a few gaps) going back to about April > 2018, but nothing’s been done to try to make them generally available. Each > BadApple report has rates for the last 4 weeks in the attached file, jus

Re: BadApple report, but please read the first bit

2020-08-12 Thread Erick Erickson
I have the weekly rollups (with a few gaps) going back to about April 2018, but nothing’s been done to try to make them generally available. Each BadApple report has rates for the last 4 weeks in the attached file, just below "Failures over the last 4 weeks, but not every week. Ordered

Re: BadApple report, but please read the first bit

2020-08-12 Thread David Smiley
Do we have any long term (aka "longitudinal") pass/fail rates for tests? SharedFSAutoReplicaFailoverTest in particular is kinda-sorta tied to HDFS, and that's going away to a plug-in for 9.0. The shared file system notion isn't well supported in SolrCloud, I think. ~ David Smiley Apache

Re: Badapple report

2020-08-11 Thread Atri Sharma
Merged (thanks Mike D!). Atri On Tue, Aug 11, 2020 at 5:32 PM Erick Erickson wrote: > > Great, thanks! Let me know when you push it, I can beast the test again. > > > On Aug 11, 2020, at 3:48 AM, Atri Sharma wrote: > > > > I investigated testRequestRateLimiters and hardened the tests up: > > >

Re: Badapple report

2020-08-11 Thread Erick Erickson
Great, thanks! Let me know when you push it, I can beast the test again. > On Aug 11, 2020, at 3:48 AM, Atri Sharma wrote: > > I investigated testRequestRateLimiters and hardened the tests up: > > https://github.com/apache/lucene-solr/pull/1736 > > This will stop testConcurrentRequests from

Re: Badapple report

2020-08-11 Thread Atri Sharma
I investigated testRequestRateLimiters and hardened the tests up: https://github.com/apache/lucene-solr/pull/1736 This will stop testConcurrentRequests from failing and should hopefully stop testSlotBorrowing as well. If testSlotBorrowing continues to fail, I will have to rethink the test. On

Re: Badapple report

2020-08-10 Thread Erick Erickson
OK, thanks. I’m not really annotating things at this point, although occasionally removing some that haven’t failed in a long time. > On Aug 10, 2020, at 1:44 PM, Tomás Fernández Löbbe > wrote: > > Hi Erick, > I've introduced and later fixed a bug in TestConfig. It hasn't failed since, > so

Re: Badapple report

2020-08-10 Thread Tomás Fernández Löbbe
Hi Erick, I've introduced and later fixed a bug in TestConfig. It hasn't failed since, so please don't annotate it. On Mon, Aug 10, 2020 at 7:47 AM Erick Erickson wrote: > We’re backsliding some. I encourage people to look at: > http://fucit.org/solr-jenkins-reports/failure-report.html, we have

Badapple report

2020-08-10 Thread Erick Erickson
We’re backsliding some. I encourage people to look at: http://fucit.org/solr-jenkins-reports/failure-report.html, we have a number of ill-behaved tests, particularly TestRequestRateLimiter, TestBulkSchemaConcurrent, TestConfig, SchemaApiFailureTest and TestIndexingSequenceNumbers… Raw fail

BadApple report, but please read the first bit

2020-08-03 Thread Erick Erickson
There are several tests that are causing a lot of noise: SharedFSAutoReplicaFailoverTest is failing 90%+ of the time. TestBulkSchemaConcurrent 31% StressHdfsTest 16% SchemaApiFailureTest 13.88% I encourage people to look at: http://fucit.org/solr-jenkins-reports/failure-report.html and see if

BadApple report

2020-07-27 Thread Erick Erickson
Short form: Processing file (History bit 3): HOSS-2020-07-27.csv Processing file (History bit 2): HOSS-2020-07-20.csv Processing file (History bit 1): HOSS-2020-07-13.csv Processing file (History bit 0): HOSS-2020-07-06.csv Number of AwaitsFix: 33 Number of BadApples: 4 **Annotated tests that

BadApple report

2020-07-20 Thread Erick Erickson
Well, that’s one way to reduce the number of SuppressWarnings… cut out massive amounts of code ;)…. SuppressWarnings count: last week: 5,353, this week: 4,835, delta -518 We had quite a spike in the raw number of tests that have failed at least once in the last week: Raw fail count by week
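The SuppressWarnings delta quoted above is simple bookkeeping: count @SuppressWarnings occurrences per file, compare against last week's snapshot, and flag files whose count grew. A rough sketch follows; the snapshot file format ("count<TAB>path" per line) is an assumption, not the real tooling's format.

```java
// Rough sketch of the SuppressWarnings bookkeeping: count @SuppressWarnings
// occurrences per Java file, compare with last week's saved counts, and report
// the overall delta plus any files that grew.
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Stream;

public class SuppressWarningsDelta {
  private static final Pattern SUPPRESS = Pattern.compile("@SuppressWarnings");

  public static void main(String[] args) throws IOException {
    Map<String, Integer> thisWeek = new HashMap<>();
    try (Stream<Path> paths = Files.walk(Paths.get("."))) {
      paths.filter(p -> p.toString().endsWith(".java")).forEach(p -> {
        try {
          Matcher m = SUPPRESS.matcher(Files.readString(p));
          int n = 0;
          while (m.find()) n++;
          if (n > 0) thisWeek.put(p.toString(), n);
        } catch (IOException e) {
          throw new UncheckedIOException(e);
        }
      });
    }
    // Last week's snapshot: "count<TAB>path" lines (assumed format).
    Map<String, Integer> lastWeek = new HashMap<>();
    for (String line : Files.readAllLines(Paths.get("suppress-last-week.txt"))) {
      String[] cols = line.split("\t");
      lastWeek.put(cols[1], Integer.parseInt(cols[0]));
    }
    int totalNow = thisWeek.values().stream().mapToInt(Integer::intValue).sum();
    int totalThen = lastWeek.values().stream().mapToInt(Integer::intValue).sum();
    System.out.printf("SuppressWarnings count: last week: %,d, this week: %,d, delta %+d%n",
        totalThen, totalNow, totalNow - totalThen);
    thisWeek.forEach((file, n) -> {
      int before = lastWeek.getOrDefault(file, 0);
      if (n > before) {
        System.out.println("  increased: " + file + " (" + before + " -> " + n + ")");
      }
    });
  }
}
```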

BadApple report

2020-07-13 Thread Erick Erickson
Actually, pretty good. The attached file has a lot of noise in it that’s a listing of the files that have more or less SuppressWarnings annotations than last week, the delta is -19. It’s a crude measure, I can replace N SuppressWarnings in a class with one for the entire class, but it’s also

Re: BadApple report

2020-07-06 Thread Erick Erickson
Megan: There are a number of tests that have been flagged by some devs that, no matter what, should _not_ be annotated with BadApple or AwaitsFix and that’s just a list to remind me what they are. It’s not much of a deal, though, because I’m not doing much annotating lately. The original

Re: BadApple report

2020-07-06 Thread Megan Carey
Hi Erick, I'm wondering what is meant by "DO NOT ANNOTATE LIST" at the start of the report? Better yet, can you please link to the scraping tool used to generate the report? Thank you! Megan On Mon, Jul 6, 2020 at 8:07 AM Erick Erickson wrote: > Holding fairly steady, but IDK whether Hoss’

BadApple report

2020-07-06 Thread Erick Erickson
Holding fairly steady, but IDK whether Hoss’ scraping is getting data from Uwe’s machines, thought I saw an e-mail go by about that. This is the first report where the SuppressWarnings stats mean anything. Full report attached: DO NOT ENABLE LIST: MoveReplicaHDFSTest.testFailedMove

BadApple report

2020-06-29 Thread Erick Erickson
Holding fairly steady. Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0 had 26 failures Week: 1 had 26 failures Week: 2 had 34 failures Week: 3 had 128 failures This week’s report includes the SuppressWarnings summary. This is really the baseline, I

BadApple report

2020-06-22 Thread Erick Erickson
Not a bad week all told, but something seems a little odd, I remember a lot more e-mails going by, but perhaps it’s just these 26 tests failing repeatedly. Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0 had 26 failures Week: 1 had 34 failures Week: 2

BadApple report

2020-06-15 Thread Erick Erickson
The number of chronically failing tests dropped considerably this past week, whether that’s an anomaly or not is a good question. I’ve finished the SuppressWarnings annotations, so next week I _should_ be able to include how many new SuppressWarnings have been added to the code and have it

Re: BadApple report

2020-06-08 Thread Erick Erickson
Thanks for letting me know Tomás As useful as Hoss’ rollups are, there’s always a lag to deal with, sounds like this is one. > On Jun 8, 2020, at 2:26 PM, Tomás Fernández Löbbe > wrote: > > Thanks for keeping an eye Erick. I took a quick look at the > "TestIndexSearcher" failures and I

Re: BadApple report

2020-06-08 Thread Tomás Fernández Löbbe
Thanks for keeping an eye Erick. I took a quick look at the "TestIndexSearcher" failures and I think they're related to SOLR-14525. Should be fixed after this[1] commit by Noble. [1] https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=5827ddf On Mon, Jun 8, 2020 at 7:52 AM Erick Erickson

BadApple report

2020-06-08 Thread Erick Erickson
If people don’t know about: http://fucit.org/solr-jenkins-reports/suspicious-failure-report.html, I strongly recommend you periodically check it. It reports tests that have changed their failure rates lately. There are three currently:
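The suspicious-failure report's methodology isn't spelled out in these mails; conceptually it flags tests whose recent failure rate has drifted away from their longer-term rate. A toy sketch of that comparison, with invented numbers and an arbitrary threshold:

```java
// Toy sketch of the "changed failure rate" idea behind the suspicious-failure
// report: compare each test's recent failure rate against its longer-term
// baseline and flag large moves. The TestStats values and the 5-point threshold
// are made up; the real report may compute this differently.
import java.util.List;

public class SuspiciousFailures {
  // (fails, runs) over a recent window and over a longer baseline window.
  record TestStats(String name, int recentFails, int recentRuns,
                   int baselineFails, int baselineRuns) {}

  public static void main(String[] args) {
    List<TestStats> stats = List.of(
        new TestStats("SomeSteadyTest.testFoo", 1, 200, 4, 900),
        new TestStats("SomeSuspiciousTest.testBar", 20, 200, 5, 900));
    double threshold = 0.05; // flag moves of more than 5 percentage points
    for (TestStats t : stats) {
      double recent = t.recentFails() / (double) t.recentRuns();
      double baseline = t.baselineFails() / (double) t.baselineRuns();
      if (Math.abs(recent - baseline) > threshold) {
        System.out.printf("%s: baseline %.1f%% -> recent %.1f%%%n",
            t.name(), 100 * baseline, 100 * recent);
      }
    }
  }
}
```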

Re: BadApple report. It's worth reviewing the SuppressWarnings section even if you ignore the rest.

2020-06-02 Thread Erick Erickson
ignore it. Eventually, when all the warnings are fixed or >> suppressed, I will be advocating for _not_ introducing new warnings at least >> on Master. To encourage this, I want un-suppressed warnings to become >> compile-time errors. >> >> That’ll tempt people

Re: BadApple report. It's worth reviewing the SuppressWarnings section even if you ignore the rest.

2020-06-02 Thread Noble Paul
nt un-suppressed warnings to become > compile-time errors. > > That’ll tempt people to just add @SuppressWarnings, and I don’t think that’s > a proper fix, so the BadApple report will flag files that have more > @SuppressWarnings than they did last week and I’ll complain ;) There’

BadApple report. It's worth reviewing the SuppressWarnings section even if you ignore the rest.

2020-06-01 Thread Erick Erickson
ter. To encourage this, I want un-suppressed warnings to become compile-time errors. That’ll tempt people to just add @SuppressWarnings, and I don’t think that’s a proper fix, so the BadApple report will flag files that have more @SuppressWarnings than they did last week and I’ll complain ;) There

Re: BadApple report

2020-05-27 Thread Jason Gerlowski
> Hoss’s rollups are here: > http://fucit.org/solr-jenkins-reports/failure-report.html which show the > rates, but not where they came from. If I click on a particular test entry on "failure-report.html", I'm presented with a dialog with links for each failure. Clicking that link takes me to a

Re: BadApple report

2020-05-25 Thread Ilan Ginzburg
Thanks that helps. I'll try to have a look at some of the failures related to areas I know. Ilan On Mon, May 25, 2020 at 7:07 PM Erick Erickson wrote: > Ilan: > > That’s, unfortunately, not an easy question. Hoss’s rollups are here: > http://fucit.org/solr-jenkins-reports/failure-report.html

Re: BadApple report

2020-05-25 Thread Erick Erickson
Ilan: That’s, unfortunately, not an easy question. Hoss’s rollups are here: http://fucit.org/solr-jenkins-reports/failure-report.html which show the rates, but not where they came from. Here’s an example of a failure from Jenkins, if you follow the link you can see the full output, (click

Re: BadApple report

2020-05-25 Thread Ilan Ginzburg
Where are the test failure details? On Mon, May 25, 2020 at 4:47 PM Erick Erickson wrote: > Here’s the summary: > > Raw fail count by week totals, most recent week first (corresponds to > bits): > Week: 0 had 113 failures > Week: 1 had 103 failures > Week: 2 had 102 failures > Week: 3

BadApple report

2020-05-25 Thread Erick Erickson
Here’s the summary: Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0 had 113 failures Week: 1 had 103 failures Week: 2 had 102 failures Week: 3 had 343 failures Failures in Hoss' reports for the last 4 rollups. There were 511 unannotated

BadApple report

2020-05-18 Thread Erick Erickson
Short form: Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0 had 103 failures Week: 1 had 102 failures Week: 2 had 343 failures Week: 3 had 86 failures Failures in Hoss' reports for the last 4 rollups. There were 493 unannotated tests that

BadApple report

2020-05-11 Thread Erick Erickson
Largely ignore the fact that weeks 0 and 1 had so many failures, that was due to Jenkins running out of space, which bled over into the week0 report. This is the first one that reports the number of SuppressWarnings annotations that we can use as a baseline. If I start adding SuppressWarnings

Re: PLEASE READ! BadApple report. Last week was horrible!

2020-05-06 Thread Michael McCandless
Phew! Thanks for digging Erick, and for producing these BadApple reports. Mike McCandless http://blog.mikemccandless.com On Wed, May 6, 2020 at 7:59 AM Erick Erickson wrote: > OK, this morning things are back to normal. I think the disk space issue > was to blame because checking after

Re: PLEASE READ! BadApple report. Last week was horrible!

2020-05-06 Thread Erick Erickson
OK, this morning things are back to normal. I think the disk space issue was to blame because checking after Mike’s fix didn’t look like it cured the problem. Thanks all! > On May 5, 2020, at 1:41 PM, Chris Hostetter wrote: > > > : And FWIW, I beasted one of the failing suites last night

Re: PLEASE READ! BadApple report. Last week was horrible!

2020-05-05 Thread Erick Erickson
OK, thanks Chris. The 24 hour rollup still shows many failures in the several classes, I’ll check tomorrow to see if that’s a consequence of the disk full problem. > On May 5, 2020, at 1:41 PM, Chris Hostetter wrote: > > > : And FWIW, I beasted one of the failing suites last night _without_

Re: PLEASE READ! BadApple report. Last week was horrible!

2020-05-05 Thread Chris Hostetter
: And FWIW, I beasted one of the failing suites last night _without_ : Mike’s changes and didn’t get any failures so I can’t say anything about : whether Mike’s changes helped or not. IIUC McCandless's failure only affects you if you use the "jenkins" test data file (the really big wikipedia

Re: PLEASE READ! BadApple report. Last week was horrible!

2020-05-05 Thread Erick Erickson
; Achterdiek 19, D-28357 Bremen > https://www.thetaphi.de > eMail: u...@thetaphi.de > >> -Original Message- >> From: Erick Erickson >> Sent: Monday, May 4, 2020 1:54 PM >> To: dev@lucene.apache.org >> Subject: PLEASE READ! BadApple report. Last week

RE: PLEASE READ! BadApple report. Last week was horrible!

2020-05-05 Thread Uwe Schindler
://www.thetaphi.de eMail: u...@thetaphi.de > -Original Message- > From: Erick Erickson > Sent: Monday, May 4, 2020 1:54 PM > To: dev@lucene.apache.org > Subject: PLEASE READ! BadApple report. Last week was horrible! > > I don’t know whether we had some temporary g

Re: PLEASE READ! BadApple report. Last week was horrible!

2020-05-04 Thread Erick Erickson
Mike: I saw the push. Hoss’ rollups go for “the last 24 hours”, so it’ll be Tuesday evening before things have had a chance to work their way through, I’ll look tomorrow. Meanwhile I’m beasting one of the failing test suites (without the change) and 280 iterations so far and no failures. That

Re: PLEASE READ! BadApple report. Last week was horrible!

2020-05-04 Thread Michael McCandless
Hi Erick, OK I pushed a fix! See if it decreases the failure rate for those newly bad apples? Sorry and thanks :) Mike McCandless http://blog.mikemccandless.com On Mon, May 4, 2020 at 1:06 PM Erick Erickson wrote: > Mike: > > I have no idea. Hoss’ rollups don’t link back to builds, they >

Re: PLEASE READ! BadApple report. Last week was horrible!

2020-05-04 Thread Erick Erickson
Mike: I have no idea. Hoss’ rollups don’t link back to builds, they just aggregate the results. Not a huge deal if it’s something like this of course. Let’s just say I’ve had my share of “moments” ;). And unfortunately, the test failures are pretty rare on a percentage basis, so it’s hard to

Re: PLEASE READ! BadApple report. Last week was horrible!

2020-05-04 Thread Michael McCandless
Hi Erick, It's possible this was the root cause of many of the failures: https://issues.apache.org/jira/browse/LUCENE-9191 Do these transient failures look something like this? [junit4]> Throwable #1: java.nio.charset.MalformedInputException: Input length = 1 [junit4]>at

PLEASE READ! BadApple report. Last week was horrible!

2020-05-04 Thread Erick Erickson
I don’t know whether we had some temporary glitch that broke lots of tests and they’ve been fixed or we had a major regression, but this needs to be addressed ASAP if they’re still failing. See everything below the line "ALL OF THE TESTS BELOW HERE HAVE ONLY FAILED IN THE LAST WEEK!” in this

BadApple report

2020-04-27 Thread Erick Erickson
Kevin: The good news is that there were no SyncSliceTest failures in the last week, cool! Number of AwaitsFix: 42 Number of BadApples: 4 Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0 had 86 failures Week: 1 had 78 failures Week: 2 had 117 failures Week: 3 had

BadApple report

2020-04-20 Thread Erick Erickson
Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0 had 78 failures Week: 1 had 117 failures Week: 2 had 99 failures Week: 3 had 69 failures Failures in Hoss' reports for the last 4 rollups. There were 243 unannotated tests that failed in

Re: BadApple report

2020-04-18 Thread Kevin Risden
> > 0123 59.4 195 92 HdfsSyncSliceTest.test I'm looking into this HdfsSyncSliceTest failure. Jira https://issues.apache.org/jira/browse/SOLR-13886 Kevin Risden Kevin Risden On Mon, Apr 13, 2020 at 8:35 AM Erick Erickson wrote: > We’re backsliding a bit. Note that over the

BadApple report

2020-04-13 Thread Erick Erickson
We’re backsliding a bit. Note that over the last two weeks we’ve had successively more failures, HdfsSyncSliceTest is failing over half the time! Can we just nuke it? Here’s the short form: Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0 had 117 failures

BadApple report

2020-04-06 Thread Erick Erickson
Short form: We had a slight uptick in failures last week, root cause unknown. Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0 had 99 failures Week: 1 had 69 failures Week: 2 had 65 failures Week: 3 had 129 failures Failures in Hoss' reports

BadApple report

2020-03-30 Thread Erick Erickson
There are a couple of tests that can have BadApple removed, MultiThreadedOCPTest.test SolrZkClientTest.testSimpleUpdateACLs I’ll take care of those today or tomorrow. Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0 had 69 failures Week: 1 had 65

BadApple report

2020-03-24 Thread Erick Erickson
Short form: There were 287 unannotated tests that failed in Hoss' rollups. Ordered by the date I downloaded the rollup file, newest->oldest. See above for the dates the files were collected These tests were NOT BadApple'd or AwaitsFix'd Failures in the last 4 reports.. Report Pct

BadApple report

2020-03-16 Thread Erick Erickson
I was on vacation the last couple of weeks so missed the BadApple reports. Full results attached Failures in Hoss' reports for the last 4 rollups. There were 373 unannotated tests that failed in Hoss' rollups. Ordered by the date I downloaded the rollup file, newest->oldest. See above

Badapple report

2020-02-24 Thread Erick Erickson
Attached. Short form: **Haven't failed in the last 4 rollups. **Methods: 2 MultiThreadedOCPTest.test SolrZkClientTest.testSimpleUpdateACLs Failures in Hoss' reports for the last 4 rollups. There were 292 unannotated tests that failed in Hoss' rollups. Ordered by the date I

BadApple report

2020-02-10 Thread Erick Erickson
Holding reasonably steady in terms of failures every week for the last 4: Failures in the last 4 reports.. Report Pct runs fails test 0123 2.4 1694 49 BasicDistributedZkTest.test 0123 0.2 1645 5 ExecutePlanActionTest.testTaskTimeout

BadApple report

2020-02-03 Thread Erick Erickson
Won’t add annotations. Here’s the failures in the last 4 runs: Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0 had 114 failures Week: 1 had 125 failures Week: 2 had 191 failures Week: 3 had 118 failures Failures in the last 4 reports.. Report Pct

BadApple report

2020-01-20 Thread Erick Erickson
Failures in each of the last 4 reports.. Report Pct runs fails test 0123 0.3 1384 11 AutoScalingHandlerTest.testReadApi 0123 0.3 1402 8 HttpPartitionTest.test 0123 0.3 1393 11 HttpPartitionWithTlogReplicasTest.test

BadApple report

2020-01-13 Thread Erick Erickson
I’m not actively annotating anything at this point, the number of failed tests over each of the last 4 weeks is short enough that I’ll just echo those in these e-mails, the full report is attached for anyone who wants to track history. I’ll revise the wording to not make it look like I’ll

Re: BadApple report

2020-01-06 Thread Erick Erickson
Will do. Actually, won’t do (disable that is)…. One of the things that’s kind of a pain is that the report doesn’t distinguish between different JVMs so there’s no really convenient way to ignore this kind of thing. Anyway, I’ve put both of them in my list, and I have to say I’m not actively

Re: BadApple report

2020-01-06 Thread Robert Muir
Same goes for TestPackedInts. Currently test runs containing ZGC or Shenandoah garbage collectors don't reflect the test itself. Please don't disable them. On Mon, Jan 6, 2020 at 12:38 PM Robert Muir wrote: > We shouldn't disable Test2BPostings since there is nothing wrong with the > test: this

Re: BadApple report

2020-01-06 Thread Robert Muir
We shouldn't disable Test2BPostings since there is nothing wrong with the test: this is one impacted by bugs in the Shenandoah and ZGC garbage collectors. See the other threads on the dev-list about them. On Mon, Jan 6, 2020 at 10:47 AM Erick Erickson wrote: > Short form: > > There were 1480

BadApple report

2020-01-06 Thread Erick Erickson
Short form: There were 1480 unannotated tests that failed in Hoss' rollups. Ordered by the date I downloaded the rollup file, newest->oldest. See above for the dates the files were collected These tests were NOT BadApple'd or AwaitsFix'd All tests that failed 4 weeks running will be BadApple'd
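For reference, "BadApple'd" and "AwaitsFix'd" refer to the annotations in the Lucene test framework (org.apache.lucene.util.LuceneTestCase); both take a required bugUrl, and whether annotated tests run is controlled by the tests.badapples and tests.awaitsfix build properties. The class, methods, and issue numbers below are placeholders:

```java
// Placeholder test class showing the two annotations; both come from
// org.apache.lucene.util.LuceneTestCase and require a bugUrl. The class name,
// method names, and JIRA ids are invented for illustration.
import org.apache.lucene.util.LuceneTestCase;
import org.apache.lucene.util.LuceneTestCase.AwaitsFix;
import org.apache.lucene.util.LuceneTestCase.BadApple;

public class ExampleFlakyTest extends LuceneTestCase {

  // Chronically flaky: stays in the suite, with runs gated by the
  // tests.badapples build property.
  @BadApple(bugUrl = "https://issues.apache.org/jira/browse/SOLR-00000")
  public void testSometimesFlaky() throws Exception {
    // ... test body ...
  }

  // Known broken: skipped until the linked issue is fixed (can be forced to
  // run with the tests.awaitsfix property).
  @AwaitsFix(bugUrl = "https://issues.apache.org/jira/browse/SOLR-00001")
  public void testKnownBroken() throws Exception {
    // ... test body ...
  }
}
```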

BadApple report

2019-12-23 Thread Erick Erickson
As all the security stuff settles down, I’m still taking these snapshots but mostly to keep a complete record. The longer records, i.e. for the last 7 days, contain a lot of noise comparatively. That said, it’s worth looking at Hoss’ last 7 day rollup, we do have a number of tests failing

Badapple report

2019-12-02 Thread Erick Erickson
Short form: Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0 had 83 failures Week: 1 had 253 failures Week: 2 had 56 failures Week: 3 had 66 failures Failures in the last 4 reports.. Report Pct runs fails test 0123 16.7

BadApple report, not a good week.

2019-11-25 Thread Erick Erickson
This is not a good week at all: Raw fail count by week totals, most recent week first (corresponds to bits): Week: 0 had 253 failures Most recent 7 days Week: 1 had 56 failures 7 days before that Week: 2 had 66 failures Week: 3 had 83 failures Going from 56 failures to 253 is A Very

Badapple report. Please read the first 5 lines at least.

2019-11-11 Thread Erick Erickson
MoveReplicaHDFSTest.test LegacyCloudClusterPropTest.testCreateCollectionSwitchLegacyCloud TestModelManagerPersistence all fail more than 10%, MoveReplicaHDFSTest 50%. BasicAuthIntegrationTest.testBasicAuth comes in at just under 10%. Short form: There were 147 unannotated tests that failed in

BadApple report

2019-10-28 Thread Erick Erickson
It’s been a while. I think this is mostly informational. I was all excited when the reports were getting so much better, but that was an artifact of some test environments not being up and running. When Mark’s test work hits, we’ll probably have to start over. That said, people SHOULD LOOK

BadApple report

2019-09-16 Thread Erick Erickson
I’m going to suspend these until we build up a better backlog of tests since a number of machines weren’t being collected by Hoss’ rollups. I’ll continue to gather the rollups every week, but for a while I don’t think it’s worth cluttering your inbox.

No BadApple report this week

2019-09-02 Thread Erick Erickson
I’ll probably just continue to gather Hoss’ rollups each week, but until we get the jenkins stuff back running it’s probably not worth the effort.

Badapple report

2019-08-19 Thread Erick Erickson
No annotation changes will happen this week. Summary: Processing file (History bit 3): HOSS-2019-088-05.csv Processing file (History bit 2): HOSS-2019-08-19.csv Processing file (History bit 1): HOSS-2019-08-12.csv Processing file (History bit 0): HOSS-2019-07-29.csv Number of AwaitsFix: 38

Badapple report

2019-08-12 Thread Erick Erickson
Continued improvement I think. Or at least the improvements 3 weeks ago are working their way through the system. Note that the number of tests that _only_ failed three weeks ago is almost half the total. So I have some optimism that next week we’ll see a further large drop. Here’s the

BadApple report

2019-08-05 Thread Erick Erickson
Interestingly, the number of failed tests has gone down pretty radically over the last while. I skipped about 4 weeks of collecting the reports while moving, but if I compare the tests that failed during the last two weeks in the rollup from July 1 with the last two weeks collected today,

BadApple report

2019-07-29 Thread Erick Erickson
Here it is after a hiatus. I have moved from California to South Orange, NJ… it’s a long story why. But I’ll be glad to tell y’all about driving a Chevy Bolt EV across country and how Wyoming has very few commercial charging options… But I did get to see Old Faithful erupt… Anyway, I won’t make

Re: BadApple report

2019-07-01 Thread Kevin Risden
HdfsAutoAddReplicasIntegrationTest.testSimple I am going to awaitsfix this test - https://issues.apache.org/jira/browse/SOLR-13338. I haven't had time to look into recent failures. I thought the Jetty upgrade would have helped. It had very similar timeout waiting exception. Kevin Risden On

BadApple report

2019-07-01 Thread Erick Erickson
Pretty steady, I won’t be doing anything with annotations this week: **Annotations will be removed from the following tests because they haven't failed in the last 4 rollups. **Methods: 3 FullSolrCloudDistribCmdsTest.test MultiThreadedOCPTest.test

BadApple report

2019-06-24 Thread Erick Erickson
I won’t change annotations again this week. Here’s the short form: **Annotations will be removed from the following tests because they haven't failed in the last 4 rollups. **Methods: 2 FullSolrCloudDistribCmdsTest.test SolrZkClientTest.testSimpleUpdateACLs Failures in Hoss'

BadApple report

2019-06-10 Thread Erick Erickson
Holding pretty steady, won’t remove annotations just yet. Full report attached. I _strongly_ urge people to take a quick glance at: http://fucit.org/solr-jenkins-reports/failure-report.html regularly. There are 5 tests that are failing 25% of the time or more currently. ——Report

BadApple report

2019-06-03 Thread Erick Erickson
I probably won’t remove the annotations indicated this week, kinda busy. Overall looks like we’re getting gradually better. Full report attached: **Annotations will be removed from the following tests because they haven't failed in the last 4 rollups. **Methods: 3

BadApple report, things are changing

2019-02-18 Thread Erick Erickson
things are settled down quite a bit. So ongoing I’ll publish this each week, but will only periodically change the annotations. If/when we stop running 7x Jenkins jobs, I may start annotating with BadApple again, we’ll see. Meanwhile I’ll post the list of new test failures over the last 4

BadApple report

2019-01-14 Thread Erick Erickson
Well, I didn't add stuff last week, slipped through the cracks. Anyway, here's the current list. NOTE: lots more tests are being un-annotated than annotated, which is good. Also, this last report has 421 total tests that failed sometime in the last 4 weeks. The report before had 655. Still quite

BadApple report for Monday

2018-10-08 Thread Erick Erickson
Well, I missed two weeks in a row. So sue me ;). This week fer sure Here's the condensed report. Let me know if there are any issues. Full report attached. DO NOT ENABLE LIST: 'TestControlledRealTimeReopenThread.testCRTReopen' 'TestICUNormalizer2CharFilter.testRandomStrings'

BadApple report, 60+ tests to be annotated

2018-09-18 Thread Erick Erickson
This is a pretty bad week. 60+ tests to be annotated and only 4 to be un-annotated. Here's the culled list, full report attached. **Annotations will be removed from the following tests because they haven't failed in the last 4 rollups. **Methods: 4 MoveReplicaHDFSTest.testNormalFailedMove

Re: BadApple report, PLEASE CHECK THE FIRST PART.

2018-09-10 Thread Adrien Grand
Hi Erick, On Mon, Sep 10, 2018 at 8:06 PM, Erick Erickson wrote: > First, I have these two lists, are they still current? > > DO NOT ENABLE LIST: > 'TestControlledRealTimeReopenThread.testCRTReopen' > 'TestICUNormalizer2CharFilter.testRandomStrings' > 'TestICUTokenizerCJK' > +1 to

BadApple report, PLEASE CHECK THE FIRST PART.

2018-09-10 Thread Erick Erickson
First, I have these two lists, are they still current? DO NOT ENABLE LIST: 'TestControlledRealTimeReopenThread.testCRTReopen' 'TestICUNormalizer2CharFilter.testRandomStrings' 'TestICUTokenizerCJK' 'TestImpersonationWithHadoopAuth.testForwarding'

Re: BadApple report TestPolicy, TestCollectionStateWatchers TestWithCollection

2018-08-27 Thread Erick Erickson
Sure, won't BadApple TestWithCollection. On Mon, Aug 27, 2018 at 10:01 PM Shalin Shekhar Mangar wrote: > > Thanks Erick. I'm working on fixing TestWithCollection so please do not > BadApple it this week. > > On Tue, Aug 28, 2018 at 1:04 AM Erick Erickson > wrote: >> >> On the plus side, the

Re: BadApple report TestPolicy, TestCollectionStateWatchers TestWithCollection

2018-08-27 Thread Shalin Shekhar Mangar
Thanks Erick. I'm working on fixing TestWithCollection so please do not BadApple it this week. On Tue, Aug 28, 2018 at 1:04 AM Erick Erickson wrote: > On the plus side, the CDCR tests (except BiDir) seem to be fixed. > > Also on the plus side, there are quite a number of tests that have > _not_

BadApple report TestPolicy, TestCollectionStateWatchers TestWithCollection

2018-08-27 Thread Erick Erickson
On the plus side, the CDCR tests (except BiDir) seem to be fixed. Also on the plus side, there are quite a number of tests that have _not_ failed in the last 4 weeks and I'll un-annotate. On the minus side, TestPolicy has 39 tests that have failed at least once in the last 4 weeks. I'll beast

Weekly BadApple report

2018-08-06 Thread Erick Erickson
**Annotated tests/suites that didn't fail in the last 4 weeks. **Annotations will be removed from the following tests because they haven't failed in the last 4 rollups. **Methods: 8 BasicAuthIntegrationTest.testBasicAuth CollectionsAPIAsyncDistributedZkTest.testAsyncRequests

Re: BadApple report. Seems like I'm wasting my time.

2018-08-01 Thread Mark Miller
I still think it’s a mistake to try and use all the Jenkins results to drive ignoring tests. It needs to be an objective measure in a good env. We also should not be ignoring tests en masse without individual consideration. Critical test coverage should be treated differently than any random

Re: BadApple report. Seems like I'm wasting my time.

2018-08-01 Thread Erick Erickson
Alexandre: Feel free! What I'm struggling with is not that someone checked in some code that all of a sudden started breaking things. Rather, that a test that's been working perfectly will fail once then won't reproducibly fail again and does _not_ appear to be related to recent code changes. In
