Re: [Wikitech-l] Use Supercounter as an API

2015-05-30 Thread Ricordisamoa

I saw there's an API at https://tools.wmflabs.org/supercount/api.php
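
For what it's worth, a minimal sketch of querying that endpoint from
Python. The parameter names ("user" and "project") are assumptions based
on the tool's web form, not documented API parameters, so check the
tool's source before relying on them:

    # Hypothetical request against the supercount endpoint; adjust the
    # parameter names if the actual API differs.
    import requests

    resp = requests.get(
        "https://tools.wmflabs.org/supercount/api.php",
        params={"user": "Example", "project": "en.wikipedia.org"},
        headers={"User-Agent": "stats-tool/0.1 (me@example.org)"},
    )
    resp.raise_for_status()
    print(resp.json())  # assuming the endpoint returns JSON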

On 30/05/2015 19:48, Nasir Khan wrote:

Hi,
I want to build a tool that will generate statistics for my native
wiki. Before starting to build a tool, I searched for an existing one.
The Super Counter[1] has some of the features I want to implement in my
tool.

So is there any way to use the Super Counter as an API? If so, can anyone
please point me to the documentation?


[1] - https://tools.wmflabs.org/supercount/index.php

--
*Nasir Khan Saikat*
www.nasirkhn.com

Re: [Wikitech-l] Simplifying the WMF deployment cadence

2015-05-30 Thread Brad Jorsch (Anomie)
On May 30, 2015 4:07 AM, John Mark Vandenberg <jay...@gmail.com> wrote:

> So if a shorter deploy process is implemented, we need to find ways to
> get bug reports to you sooner,

I think work/life balance is going to continue to prevent me personally
from getting to things any sooner ;)

> and ensure you are not the only one who
> can notice and fix bugs related to API breakages, etc.

Anyone *can*, just no one really *does* at the moment. The Platform reorg
made a team that would have, but then the Engineering reorg changed that
plan.

We decided to have the new Reading Infrastructure team take on API
maintenance, but at the moment the team is short on developers.

> IIRC your API warnings system can send multiple distinct warnings as a
> single string, with each warning separated by only a new line, which
> is especially nasty for user agents to 'understand'. (but this may
> only be in older versions of the API - I'm not sure)

Fixing that is on my todo list.

> that there would be a warning related to the module used, e.g.
> result['warnings']['allpages'], but that didn't exist because this
> warning was in result['warnings']['query'].

The query module is used there too ;)

Re: [Wikitech-l] Simplifying the WMF deployment cadence

2015-05-30 Thread John Mark Vandenberg
On Fri, May 29, 2015 at 9:36 PM, Brad Jorsch (Anomie)
<bjor...@wikimedia.org> wrote:
> On Thu, May 28, 2015 at 2:39 PM, John Mark Vandenberg <jay...@gmail.com>
> wrote:

>> T96942 <https://phabricator.wikimedia.org/T96942> was reported by
>> pywikibot devs almost as soon as we detected that
>> the test wikis were failing in our travis-ci tests.
>
> At 20:22 in the timezone of the main API developer (me).
>
>> It was 12 hours before a MediaWiki API fix was submitted to Gerrit,
>
> 09:31, basically first thing in the morning for the main API developer.
> There's really not much to complain about there.

In the proposed deploy sequence, 12 hours is a serious amount of time.

So if a shorter deploy process is implemented, we need to find ways to
get bug reports to you sooner, and ensure you are not the only one who
can notice and fix bugs related to API breakages, etc.

 and it took four additional *days* to get merged.


 That part does suck.

On a positive note, the API breakage this week was fixed much more
quickly, and pywikibot test builds are green again.

https://phabricator.wikimedia.org/T100775
https://travis-ci.org/wikimedia/pywikibot-core/builds/64631025

>> This also doesn't give clients sufficient time to work around
>> MediaWiki's wonderful intentional API breakages, e.g. raw continue,
>> which completely broke pywikibot and needed a large chunk of code
>> rewritten urgently, both for pywikibot core and the much older and
>> harder-to-fix pywikibot compat, which is still used as part of
>> processes that wiki communities rely on.
>
> The continuation change hasn't actually broken anything yet.

Hmm. You still don't appreciate that you actually really truly
fair-dinkum broke pywikibot? Warnings are part of the API, and
adding/changing them can break clients; warnings are one of the more
brittle parts of the API.

The impact on pywikibot core users wasn't so apparent, as the pywikibot
core devs fixed the problem when it hit the test servers, and the fix was
merged before it hit the production servers. Not all users had pulled the
latest pywikibot core code, and they informed us their bots were broken,
but as far as I am aware no pywikibot core bug reports were submitted,
because (often after mentioning problems on IRC) their problems
disappeared after they ran 'git pull'.

However, pywikipedia / compat isn't actively maintained, and it broke
badly in production, with some scripts being broken for over a month:

https://phabricator.wikimedia.org/T74667
https://phabricator.wikimedia.org/T74749

pywikibot core is gradually improving its understanding of the API
warning system, but it isn't well supported yet. As a result,
pywikibot generally reports warnings to the user.
IIRC your API warnings system can send multiple distinct warnings as a
single string, with each warning separated by only a newline, which
is especially nasty for user agents to 'understand'. (But this may
only be in older versions of the API - I'm not sure.)

So adding a new warning to the API can result in the same warning
appearing many times over in the user's console / logs, and thousands of
warnings on the screen send users into panic mode.
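
To illustrate the parsing this forces on clients, a minimal sketch,
assuming the pre-formatversion=2 JSON layout in which each module's
warnings sit under a '*' key as one newline-separated string (api_result
stands for the already-decoded JSON response):

    # Split newline-separated MediaWiki API warnings back into
    # individual messages, then deduplicate repeats.
    def iter_warnings(api_result):
        for module, data in api_result.get('warnings', {}).items():
            for line in data.get('*', '').split('\n'):
                if line.strip():
                    yield module, line.strip()

    seen = set()
    for module, warning in iter_warnings(api_result):
        if (module, warning) not in seen:
            seen.add((module, warning))
            print('API warning [%s]: %s' % (module, warning))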

I strongly recommend fixing the warning system before using it again as
aggressively as was done for rawcontinue. E.g. it would be nice if
the API emitted codes for each warning scenario (don't reuse codes for
similar scenarios), so we don't need to do string matching to detect and
discard expected warnings, and you can i18n those messages without
breaking clients. (I think there is already a phab task for this.)
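
Something along these lines would do; the shape and the code value here
are invented for illustration, not an existing API format:

    {
        "warnings": [
            {
                "code": "deprecated-continuation",
                "module": "query",
                "text": "Formatting of continuation data will change soon."
            }
        ]
    }

Clients could then match and discard expected warnings on the stable
"code" value, while the human-readable "text" stays free to change or
be localised.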

I also strongly recommend that Wikimedia get heavily involved in
decommissioning pywikibot compat bots on Wikimedia servers, and any
other very old unmaintained clients, so that the API can be
aggressively updated without breaking the many bots still using
compat. pywikibot devs did some initial work with WMF staff at Lyon
on this front, and we need to keep that moving ahead.

> Unless pywikibot was treating warnings as errors and that's what broke it?

Yes, some of the code was raising an exception when it detected an API warning.

However, another part of the breakage was that the JSON structure of
the new rawcontinue warnings was not what was expected. Some
pywikipedia / compat code assumed that the presence of warnings
implied that there would be a warning related to the module used, e.g.
result['warnings']['allpages'], but that didn't exist because this
warning was in result['warnings']['query'].

https://gerrit.wikimedia.org/r/#/c/176910/4/wikipedia.py,cm
https://gerrit.wikimedia.org/r/#/c/170075/2/wikipedia.py,cm
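
The mismatch, roughly, for an action=query&list=allpages request; the
warning text is paraphrased rather than the exact production message:

    # What the compat code assumed:
    #   result['warnings']['allpages']['*']
    # What the API actually returned, keyed by the top-level module:
    result = {
        "query": {"allpages": []},
        "warnings": {
            "query": {"*": "Formatting of continuation data will change soon."}
        },
    }
    # A lookup that tolerates either location:
    w = result.get("warnings", {})
    text = (w.get("allpages") or w.get("query") or {}).get("*", "")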

> It's coming
> soon though. Nor should a large chunk of code *need* rewriting, just add
> one parameter to your action=query requests.

I presume you mean we could have just added rawcontinue=''. Our very
limited testing at the time (we didn't have much time to fix the bugs
before it would hit production), and subsequent testing, indicates
that the rawcontinue parameter can be used even for 
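
For reference (the message is cut off above), the one-parameter change
being discussed, as a sketch; the wiki URL and the particular query are
a generic example:

    import requests

    # Send rawcontinue to keep the old-style continuation format; an
    # empty value is enough, since the API only checks that the
    # parameter is present.
    resp = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={
            "action": "query",
            "list": "allpages",
            "aplimit": "50",
            "format": "json",
            "rawcontinue": "",
        },
        headers={"User-Agent": "example-bot/0.1"},
    )
    data = resp.json()
    # Old-style continuation then appears under 'query-continue'
    # rather than 'continue'.
    cont = data.get("query-continue", {})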

Re: [Wikitech-l] [reading-wmf] Fwd: Mobile Friendly

2015-05-30 Thread Bartosz Dziewoński
On Friday, 29 May 2015, Pine W <wiki.p...@gmail.com> wrote:

> Hmm. Why would we block robots there?

We don't. We disallowed /w/, because (quoting from the doc comments)
"Friendly, low-speed bots are welcome viewing article pages, but not
dynamically-generated pages please.", and that accidentally caught
/w/load.php, which until recently was loaded from a different domain.
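
Sketched as robots.txt rules; the Allow line is one possible fix, not
necessarily what was actually deployed:

    # Keep crawlers away from dynamically-generated pages (index.php,
    # api.php, ...) while still letting them fetch the CSS/JS needed to
    # render articles. "Allow" is a de-facto extension honoured by the
    # major crawlers.
    User-agent: *
    Allow: /w/load.php
    Disallow: /w/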


-- 
-- Matma Rex