Analyzing the page history instead of parsing timestamps from the wikitext
won't work, because it treats any edit to a section as fresh activity:
things like adding a {{resolved}} tag would then delay archiving when they
shouldn't. It would also defeat the common hack of adding a timestamp far
in the future to a section to keep it from being archived prematurely.
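
For illustration, a minimal sketch of the signature-based approach: take the
newest timestamp found in a section's wikitext. The regex, the en.wikipedia
date format and the helper name are my own assumptions for this sketch, not
the archive bot's actual code:

import re
from datetime import datetime

# Signature timestamps as produced by ~~~~ on en.wikipedia,
# e.g. "10:26, 2 August 2014 (UTC)".  The format is an assumption
# of this sketch, not taken from the archive bot itself.
SIGNATURE_TS = re.compile(
    r'(\d{2}:\d{2}, \d{1,2} [A-Z][a-z]+ \d{4}) \(UTC\)')


def latest_signature(section_text):
    """Return the newest signature timestamp in a section, or None."""
    stamps = []
    for match in SIGNATURE_TS.finditer(section_text):
        try:
            stamps.append(
                datetime.strptime(match.group(1), '%H:%M, %d %B %Y'))
        except ValueError:
            continue  # skip text that only looks like a timestamp
    return max(stamps) if stamps else None

Because the newest timestamp wins, a deliberately far-future signature keeps
a section on the page, and an edit that adds no new signature (e.g. slapping
{{resolved}} on a thread) doesn't reset the clock.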
On Aug 2, 2014 10:26 AM, "Amir Ladsgroup" <[email protected]> wrote:
> Mpaa: I meant getting the section date by fetching the history and
> analyzing revisions instead of checking the signatures in them.
>
>
> Best
>
> On 8/2/14, Mpaa <[email protected]> wrote:
> > @Amir
> >
> > If a section is unsigned and there is no timestamp, how can you find
> > sections via their revision timestamp? And expect that the timestamp is
> > a datetime object?
> >
> > Mpaa
> >
> >
> > On Sat, Aug 2, 2014 at 8:57 AM, John Mark Vandenberg <[email protected]>
> > wrote:
> >
> >> Recently we had a few cases of code that didn't compile getting merged
> >> (I have +2'd some of them.. :/), so I have fast-tracked the addition of
> >> a set of tests which run every script with -help, and with -simulate.
> >> These new tests add about 3 minutes to the test suite execution, and
> >> add basic validation that the scripts compile and at least main() can
> >> be executed.
> >>
> >> There are a few scripts which do not emit help on -help, but not too
> >> many.
> >>
> >> The -simulate argument prevents the scripts from writing to any wiki.
> >> Without any other argument, the script should do argument parsing and
> >> usually quit as if called with -help, or provide some informative
> >> error message.
> >>
> >> Many scripts do not do proper argument parsing and environment sanity
> >> checking, resulting in exceptions.
> >>
> >> The new tests, annotated with bug numbers, are here:
> >>
> >> http://git.wikimedia.org/blob/pywikibot%2Fcore.git/master/tests%2Fscript_tests.py
> >>
> >> The "auto_run_script_list" is the list of scripts which start work
> >> without any additional arguments. For those scripts, the tester may
> >> wait up to 5 seconds before it kills the process - we may be able to
> >> reduce that delay per script by fixing some of the bugs.
> >>
> >> Due to some fancy legwork by Legoktm, we now have six Travis builds
> >> occurring after each checkin, including running these tests against
> >> 1. English Wikipedia,
> >> 2. Arabic Wikipedia, and
> >> 3. Wikidata.
> >>
> >> https://travis-ci.org/wikimedia/pywikibot-core
> >>
> >> This means that a month-old critical bug is now visible in the two
> >> Arabic Wikipedia builds, which are failing. There is a patch to be
> >> reviewed:
> >> https://gerrit.wikimedia.org/r/#/c/149898/
> >>
> >> The 'basic' script failing on the py2.6 Wikidata build seems to be
> >> because py2.6 unit tests are executed in alpha order of the test
> >> script name, and the Wikidata login doesn't happen early enough in
> >> that build, but does occur earlier for test scripts against English
> >> and Arabic Wikipedia. I am currently working on a fix for this build
> >> problem.
> >>
> >> --
> >> John Vandenberg
> >
> --
> Amir
>
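
For anyone curious, the idea behind the tests John describes above boils
down to a few lines: start each script in a subprocess with -help and with
-simulate and make sure it terminates. This is only a rough sketch; the
script names, the scripts/ path, the exit-code check and the 5-second
subprocess timeout below are my own assumptions, and the real implementation
is the script_tests.py linked above.

import subprocess
import sys
import unittest

SCRIPTS = ['basic', 'touch', 'category']  # hypothetical subset for the sketch


class TestScriptInvocation(unittest.TestCase):

    def run_script(self, name, *args):
        cmd = [sys.executable, 'scripts/%s.py' % name] + list(args)
        try:
            # Kill scripts that start working instead of exiting.
            return subprocess.call(cmd, timeout=5)
        except subprocess.TimeoutExpired:
            self.fail('%s did not finish within 5 seconds' % name)

    def test_help(self):
        # Assume a script that handles -help exits with status 0.
        for name in SCRIPTS:
            self.assertEqual(self.run_script(name, '-help'), 0,
                             '%s -help should exit cleanly' % name)

    def test_simulate(self):
        # -simulate prevents writes to the wiki; with no other arguments
        # the script should just parse its arguments and quit.
        for name in SCRIPTS:
            self.run_script(name, '-simulate')


if __name__ == '__main__':
    unittest.main()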
_______________________________________________
Pywikipedia-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/pywikipedia-l
