[Bug 72209] testExturlusage takes forever on test.wikipedia

2014-10-21 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=72209
John Mark Vandenberg jay...@gmail.com changed:

    What   | Removed         | Added
    Status | PATCH_TO_REVIEW | RESOLVED

[Bug 72209] testExturlusage takes forever on test.wikipedia

2014-10-20 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=72209
--- Comment #9 from Gerrit Notification Bot gerritad...@wikimedia.org ---
Change 167438 merged by jenkins-bot:
Increase limits in QueryGenerator when data are sparse
https://gerrit.wikimedia.org/r/167438
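
Escalating the request size is what collapses thousands of near-empty
requests into a handful. A toy calculation of the effect, assuming the
limit grows tenfold per sparse batch up to an API maximum of 500 (the
growth factor and cap are illustrative assumptions, not the exact patch):

    # Toy model: the cursor must advance past ~12000 rows that the
    # namespace filter discards. With a fixed geulimit=1 that takes
    # ~12000 requests; with an escalating limit, only a few dozen.
    limit, scanned, requests_made = 1, 0, 0
    while scanned < 12000:
        scanned += limit                # cursor advances by batch size
        requests_made += 1
        limit = min(limit * 10, 500)    # assumed growth rule, capped
    print(requests_made, "requests instead of ~12000")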

[Bug 72209] testExturlusage takes forever on test.wikipedia

2014-10-19 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=72209
--- Comment #4 from Mpaa mpaa.w...@gmail.com ---
(In reply to John Mark Vandenberg from comment #3)
> (In reply to Mpaa from comment #2)
> > It tries to fetch only the number of elements left to reach 5. When 1
> > is reached, it stays there for 12000 queries ...

[Bug 72209] testExturlusage takes forever on test.wikipedia

2014-10-19 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=72209
--- Comment #5 from John Mark Vandenberg jay...@gmail.com ---
geulimit=1 says the client wants only 1 record. The MW API isn't
returning one record. It is moving the cursor forward by one and
returning zero records. It feels like MW is ...
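
The exchange described here can be reproduced with a raw query loop. A
hypothetical trace against the action API (the parameter names geulimit
and geunamespace are real; the endpoint, loop cap, and printing are
illustrative, not pywikibot code):

    import requests

    # Sketch: with geulimit=1 plus a namespace filter under miser mode,
    # a response may advance the continuation cursor past one row while
    # returning zero matching pages.
    params = {"action": "query", "format": "json", "continue": "",
              "generator": "exturlusage", "geulimit": 1, "geunamespace": 0}
    session = requests.Session()
    for _ in range(20):  # cap the demo; the real run looped ~12000 times
        data = session.get("https://test.wikipedia.org/w/api.php",
                           params=params).json()
        pages = data.get("query", {}).get("pages", {})
        print(len(pages), "pages in this batch")
        if "continue" not in data:
            break
        params.update(data["continue"])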

[Bug 72209] testExturlusage takes forever on test.wikipedia

2014-10-19 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=72209
--- Comment #6 from John Mark Vandenberg jay...@gmail.com ---
The API documentation explains it:

    eunamespace - The page namespace(s) to enumerate.
                  NOTE: Due to $wgMiserMode, using this may result in
                  fewer than "eulimit" results returned before continuing;
                  in extreme cases, zero results may be returned.
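
Given that caveat, a client has to treat the limit as an upper bound
rather than a promise. A minimal defensive-consumption sketch, assuming
a hypothetical api_get callable that performs the HTTP request and
returns the decoded JSON:

    # Sketch: follow "continue" until enough results arrive or the
    # continuation ends; never assume a batch holds "eulimit" rows.
    def gather(api_get, want, params):
        got = []
        while len(got) < want:
            data = api_get(params)
            got.extend(data.get("query", {}).get("exturlusage", []))
            cont = data.get("continue")
            if cont is None:
                break
            params.update(cont)
        return got[:want]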

[Bug 72209] testExturlusage takes forever on test.wikipedia

2014-10-19 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=72209
--- Comment #7 from Gerrit Notification Bot gerritad...@wikimedia.org ---
Change 167438 had a related patch set uploaded by Mpaa:
api.py: increase api limits when data are sparse
https://gerrit.wikimedia.org/r/167438

[Bug 72209] testExturlusage takes forever on test.wikipedia

2014-10-19 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=72209
Gerrit Notification Bot gerritad...@wikimedia.org changed:

    What   | Removed | Added
    Status | NEW     | PATCH_TO_REVIEW

[Bug 72209] testExturlusage takes forever on test.wikipedia

2014-10-19 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=72209
--- Comment #8 from Mpaa mpaa.w...@gmail.com ---
Yes, that is what I meant.

[Bug 72209] testExturlusage takes forever on test.wikipedia

2014-10-18 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=72209
Mpaa mpaa.w...@gmail.com changed:

    What | Removed | Added
    CC   |         | mpaa.w...@gmail.com

[Bug 72209] testExturlusage takes forever on test.wikipedia

2014-10-18 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=72209
--- Comment #2 from Mpaa mpaa.w...@gmail.com ---
A possible strategy could be to increase the new_limit if the code is in
this condition in api.py, line 1090:

    else:
        # if query-continue is present, self.resultkey might not have been
        # ...
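
A sketch of that strategy as a standalone helper (the function name,
growth factor, and cap are assumptions; the triggering condition, a
continuation present but nothing under the result key, follows the
comment rather than the merged patch verbatim):

    # Sketch: when a continuation arrives with no payload under the
    # result key, the whole batch was filtered away, so widen the next
    # request instead of keeping the remaining-count limit.
    def next_limit(response, resultkey, remaining, current, api_max=500):
        batch = response.get("query", {}).get(resultkey, [])
        if not batch and "continue" in response:
            return min(max(current, 1) * 10, api_max)  # sparse: grow
        return min(remaining, api_max)  # normal: ask for what is left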

[Bug 72209] testExturlusage takes forever on test.wikipedia

2014-10-18 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=72209
--- Comment #3 from John Mark Vandenberg jay...@gmail.com ---
(In reply to Mpaa from comment #2)
> It tries to fetch only the number of elements left to reach 5. When 1
> is reached, it stays there for 12000 queries ..
But MW doesn't return one ...
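
The behaviour quoted above is easy to simulate. A toy reproduction,
where sparse_fetch is a made-up stand-in for the heavily filtered API
(not pywikibot code):

    import random

    def sparse_fetch(limit):
        # Stand-in for the API: almost every batch is filtered empty,
        # and a non-empty batch yields a single record at most.
        return ["hit"] if random.random() < 0.001 else []

    def naive_gather(total=5):
        results, queries = [], 0
        while len(results) < total:
            remaining = total - len(results)  # falls to 1 and stays there
            results.extend(sparse_fetch(remaining))
            queries += 1
        return queries

    print(naive_gather(), "queries to collect 5 items")  # typically thousands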