[Wikitech-l] Testing pywikibot on beta wikis (Was: Simplifying the WMF deployment cadence)

2015-05-29 Thread John Mark Vandenberg
On Fri, May 29, 2015 at 2:09 AM, Brian Wolff bawo...@gmail.com wrote:
 Shouldn't such [pywikibot] tests be run against beta wiki, not testwiki?

Primarily this hasn't happened because pywikibot doesn't have a family
file for the beta wiki, but that is our issue, so I've done a simple
test run on beta wiki.
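
For reference, a rough sketch of what a minimal family file for beta
enwiki might look like (module and family names here are made up, and
the exact hooks should be checked against the *_family.py files shipped
with pywikibot-core):

# betawikipedia_family.py -- hypothetical module name
from pywikibot import family

class Family(family.Family):
    name = 'betawikipedia'  # hypothetical family name

    def __init__(self):
        super(Family, self).__init__()
        # Map language codes to beta cluster hostnames.
        self.langs = {'en': 'en.wikipedia.beta.wmflabs.org'}

    def protocol(self, code):
        return 'http'  # beta has no HTTPS (see the problem list below)

    def scriptpath(self, code):
        return '/w'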

Four initial beta wiki problems:

- no HTTPS

(Not nice: test accounts must be created and accessed using passwords
that are sent essentially in cleartext, so sharing passwords with the
same account name on the real wikis is a security risk.)

- invalid siteinfo interwiki map; it includes entries for wikis that do
not exist.  That means pywikibot can't parse wikitext links, as it can't
distinguish between iwprefix: and namespace: (the pywikibot team might be
able to reduce the impact of this wiki config problem)

- paraminfo failure that breaks pywikibot's dynamic API detection;
pywikibot batch-loads the paraminfo for all modules, for example
issuing the following API request as part of initialisation, which
returns a 503 Service Unavailable:
http://en.wikipedia.beta.wmflabs.org/w/api.php?action=paraminfo&modules=abusefiltercheckmatch|abusefilterchecksyntax|abusefilterevalexpression|abusefilterunblockautopromote|antispoof|block|bouncehandler|centralauthtoken|centralnoticebannerchoicedata|centralnoticequerycampaign|checktoken|cirrus-config-dump|cirrus-mapping-dump|cirrus-settings-dump|clearhasmsg|compare|createaccount|cxconfiguration|cxdelete|cxpublish|delete|deleteglobalaccount|echomarkread|echomarkseen|edit|editlist|editmassmessagelist|emailuser|expandtemplates|fancycaptchareload|featuredfeed|feedcontributions|feedrecentchanges|feedwatchlist|filerevert|flowthank|globalblock|globaluserrights|help|imagerotate|import|jsonconfig|languagesearch|login|logout|managetags|massmessage|mobileview|move|opensearch&maxlag=5&format=json

A little digging shows that the problematic module is fancycaptchareload

http://en.wikipedia.beta.wmflabs.org/w/api.php?action=paraminfo&modules=fancycaptchareload&maxlag=5&format=json

Our servers are currently experiencing a technical problem. This is
probably temporary and should be fixed soon. Please try again in a few
minutes.

If you report this error to the Wikimedia System Administrators,
please include the details below.
Request: GET 
http://en.wikipedia.beta.wmflabs.org/w/api.php?action=paraminfo&modules=fancycaptchareload&maxlag=5&format=json,
from 127.0.0.1 via deployment-cache-text02 deployment-cache-text02
([127.0.0.1]:3128), Varnish XID 855090368
Forwarded for: [your IP], 127.0.0.1
Error: 503, Service Unavailable at Fri, 29 May 2015 07:57:09 GMT 

Could someone investigate this / fix that module?

Pywikibot devs could also help reduce the impact of this type of
problem in the future, by falling back from a batch fetch to loading
individual modules so that buggy ones can be skipped (see the sketch
after this list of problems).

- no SUL with the real wikis

(probably the best choice given there is no HTTPS on the beta cluster,
but it complicates adding the beta wiki to our existing Travis-CI test
matrix, which includes real wikis)
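
Roughly what that paraminfo fallback could look like, using plain
requests rather than pywikibot internals (this is a sketch, not the
actual pywikibot code; the endpoint is the one from the failing request
above):

import requests

API = 'http://en.wikipedia.beta.wmflabs.org/w/api.php'

def fetch_paraminfo(modules):
    """Load paraminfo for a batch of modules; if the batch request fails
    (like the 503 above), fall back to one request per module and skip
    the ones that still fail (here: fancycaptchareload)."""
    params = {'action': 'paraminfo', 'modules': '|'.join(modules),
              'maxlag': 5, 'format': 'json'}
    r = requests.get(API, params=params)
    if r.status_code == 200:
        return r.json()
    merged = {'paraminfo': {'modules': []}}
    for mod in modules:
        r = requests.get(API, params=dict(params, modules=mod))
        if r.status_code == 200:
            merged['paraminfo']['modules'].extend(
                r.json().get('paraminfo', {}).get('modules', []))
        # else: skip the buggy module instead of aborting initialisation
    return merged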

actual results at
https://travis-ci.org/jayvdb/pywikibot-core/builds/64532033

--
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] XML dumps

2015-05-29 Thread Gerard Meijssen
hear hear
Gerard

On 29 May 2015 at 01:52, Lars Aronsson l...@aronsson.se wrote:

 The XML database dumps are missing all through May, apparently
 because of a memory leak that is being worked on, as described
 here,
 https://phabricator.wikimedia.org/T98585

 However, that information doesn't reach the person who wants to
 download a fresh dump and looks here,
 http://dumps.wikimedia.org/backup-index.html

 I think it should be possible to make a regular schedule for
 when these dumps should be produced, e.g. once each month or
 once every second month, and treat any delay as a bug. The
 process to produce them has been halted by errors many times
 in the past, and even when it runs as intended the interval
 is unpredictable. Now when there is a bug, all dumps are
 halted, i.e. much delayed. For a user of the dumps, this is
 extremely frustrating. With proper release management, it
 should be possible to run the old version of the process
 until the new version has been tested, first on some smaller
 wikis, and gradually on the larger ones.


 --
   Lars Aronsson (l...@aronsson.se)
   Aronsson Datateknik - http://aronsson.se



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Testing pywikibot on beta wikis (Was: Simplifying the WMF deployment cadence)

2015-05-29 Thread John Mark Vandenberg
On Fri, May 29, 2015 at 3:14 PM, John Mark Vandenberg jay...@gmail.com wrote:
 ..
 - paraminfo failure that breaks pywikibot's dynamic API detection;
 pywikibot batch-loads the paraminfo for all modules, for example
 issuing the following API request as part of initialisation, which
 returns a 503 Service Unavailable:
 http://en.wikipedia.beta.wmflabs.org/w/api.php?action=paraminfo&modules=abusefiltercheckmatch|abusefilterchecksyntax|abusefilterevalexpression|abusefilterunblockautopromote|antispoof|block|bouncehandler|centralauthtoken|centralnoticebannerchoicedata|centralnoticequerycampaign|checktoken|cirrus-config-dump|cirrus-mapping-dump|cirrus-settings-dump|clearhasmsg|compare|createaccount|cxconfiguration|cxdelete|cxpublish|delete|deleteglobalaccount|echomarkread|echomarkseen|edit|editlist|editmassmessagelist|emailuser|expandtemplates|fancycaptchareload|featuredfeed|feedcontributions|feedrecentchanges|feedwatchlist|filerevert|flowthank|globalblock|globaluserrights|help|imagerotate|import|jsonconfig|languagesearch|login|logout|managetags|massmessage|mobileview|move|opensearch&maxlag=5&format=json

 A little digging shows that the problematic module is fancycaptchareload

 http://en.wikipedia.beta.wmflabs.org/w/api.php?action=paraminfo&modules=fancycaptchareload&maxlag=5&format=json

 Our servers are currently experiencing a technical problem. This is
 probably temporary and should be fixed soon. Please try again in a few
 minutes.

 If you report this error to the Wikimedia System Administrators,
 please include the details below.
 Request: GET 
 http://en.wikipedia.beta.wmflabs.org/w/api.php?action=paraminfo&modules=fancycaptchareload&maxlag=5&format=json,
 from 127.0.0.1 via deployment-cache-text02 deployment-cache-text02
 ([127.0.0.1]:3128), Varnish XID 855090368
 Forwarded for: [your IP], 127.0.0.1
 Error: 503, Service Unavailable at Fri, 29 May 2015 07:57:09 GMT 

 Could someone investigate this / fix that module?

After pywikibot devs cleared some other test breakages (our problems)
we see this problem is also affecting the test wikis, which I assume
means this bug will be affecting real wikis soon, so I've created an
Unbreak Now! ticket for it and cc'd @greg.

https://phabricator.wikimedia.org/T100775

-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Testing pywikibot on beta wikis (Was: Simplifying the WMF deployment cadence)

2015-05-29 Thread Jeremy Baron
On May 29, 2015 09:10, Alex Monk kren...@gmail.com wrote:
 On 29 May 2015 at 09:14, John Mark Vandenberg jay...@gmail.com wrote:
 It's risky anyway; do you know who has access to the beta cluster? It's not
 considered secure and you do not need an NDA or anything to get access -
 if you are using a real password on beta, change it. It's in labs.

+1. We don't really need HTTPS on beta to secure beta; rogue sysops and
other vandals aren't the end of the world. Rather, we need HTTPS so that
we can do tests with HTTPS.

Don't share passwords between beta and any other site unless you also don't
care about the other site (e.g. you could share if both passwords are as
weak as abcde12345).

-Jeremy
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Article coverage on Beta Cluster (was Re: Simplifying the WMF deployment cadence)

2015-05-29 Thread Greg Grossmeier
quote name=C. Scott Ananian date=2015-05-29 time=13:38:49 -0400
 I'll echo those nervous about the faster pace for deploys -- although
 not so nervous as to dig my feet in and yell stop.  Mostly my concerns
 boil down to the fact that the beta environment isn't really a good
 test for anything other than absolute crashers.  Hardly anyone uses
 the Collection extension on beta, for example.

Sounds like you need more manual testing then? Who is tasked with doing
that for your team?

 I'd *really* like to see some effort put into improving beta.  In
 particular, running beta with an up-to-date (but sanitized, perhaps)
 mirror of the main WP databases would ensure that we have a decent
 amount of test cases on beta.  We recently spent a couple of hours
 trying to test Parsoid on beta, only to find out that the behavior we
 were chasing down was caused by the fact that beta's copy of the IPA
 formatting templates on enwiki (!) were out-of-date and incomplete.
 We really need to do better than [[English language]] as far as
 articles to test on enbeta.

Yeah, that'd be nice.

See related tasks:
https://phabricator.wikimedia.org/T51779
https://phabricator.wikimedia.org/T54382

Greg

-- 
| Greg Grossmeier     GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg         A18D 1138 8E47 FAC8 1C7D |

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Simplifying the WMF deployment cadence

2015-05-29 Thread C. Scott Ananian
I'll echo those nervous about the faster pace for deploys -- although
not so nervous as to dig my feet in and yell stop.  Mostly my concerns
boil down to the fact that the beta environment isn't really a good
test for anything other than absolute crashers.  Hardly anyone uses
the Collection extension on beta, for example.

I'd *really* like to see some effort put into improving beta.  In
particular, running beta with an up-to-date (but sanitized, perhaps)
mirror of the main WP databases would ensure that we have a decent
amount of test cases on beta.  We recently spent a couple of hours
trying to test Parsoid on beta, only to find out that the behavior we
were chasing down was caused by the fact that beta's copy of the IPA
formatting templates on enwiki (!) were out-of-date and incomplete.
We really need to do better than [[English language]] as far as
articles to test on enbeta.
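
In the meantime, one cheap way to spot stale copies is to compare the
last-edit timestamps of a page on enwiki and on the beta copy; a rough
sketch (the template title is only an illustration):

import requests

def last_edit(api, title):
    # Latest revision timestamp of a page, or None if the page is missing.
    params = {'action': 'query', 'prop': 'revisions', 'titles': title,
              'rvprop': 'timestamp', 'format': 'json'}
    pages = requests.get(api, params=params).json()['query']['pages']
    page = next(iter(pages.values()))
    revs = page.get('revisions')
    return revs[0]['timestamp'] if revs else None

title = 'Template:IPA'  # illustrative example
print('enwiki:', last_edit('https://en.wikipedia.org/w/api.php', title))
print('beta:  ', last_edit('http://en.wikipedia.beta.wmflabs.org/w/api.php', title))
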
  --scott

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] API returns empty strings for boolean fields?

2015-05-29 Thread S Page
On Thu, May 28, 2015 at 2:58 PM, Brian Gerstle bgers...@wikimedia.org
wrote:


 mainpage field is omitted when querying Barack Obama
 
 https://en.wikipedia.org/wiki/Special:ApiSandbox#action=mobileview&format=json&page=Barack_Obama&prop=pageprops&pageprops=mainpage
 
 :

 {
   "mobileview": {
     "pageprops": [],
     "sections": []
   }
 }


API results often omit return parameters that have their default value in
order to save space.
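
So with the old format the check is for the key's presence, not its
value; roughly:

# The response quoted above, as a Python dict: 'mainpage' is simply
# absent because its value is the default (false).
result = {"mobileview": {"pageprops": [], "sections": []}}

# Old (pre-formatversion=2) convention: a boolean is true if the key is
# present (with an empty-string value), false if it is absent.
is_main_page = 'mainpage' in result['mobileview']
print(is_main_page)  # False for Barack Obama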

Legoktm wrote:

If you pass formatversion=2 in the request, you'll get proper booleans (as
 well as other improvements).


Well, only after API modules are changed to return booleans (which are
turned into empty strings for backwards compatibility if you don't
specify formatversion=2). Legoktm fixed ApiMobileView.php with
https://gerrit.wikimedia.org/r/#/c/214636/ last night:

$resultObj->addValue( null, $moduleName,
    // was: array( 'mainpage' => '' )
    array( 'mainpage' => true )
);

so specifying formatversion=2 now gets you the boolean behavior on beta
labs, and soon on enwiki.
http://en.wikipedia.beta.wmflabs.org/w/api.php?action=mobileview&format=jsonfm&page=Main_Page&pageprops=mainpage&formatversion=2
returns:

{
  "mobileview": {
    "normalizedtitle": "Main Page",
    "mainpage": true,
    "sections": [...

If you're writing an extension for the WMF cluster or 1.26 only, you should
always use formatversion=2 (with the proviso that some API modules may
change). It gives you cleaner return values and defaults to UTF-8 instead
of ASCII with Unicode escape sequences.  More at
https://www.mediawiki.org/wiki/API:JSON_version_2#Using_the_new_JSON_results_format
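
A small client-side sketch of the same call with formatversion=2
(parameters are taken from the example URLs above; the beta endpoint
and page title are just the example from this thread):

import requests

params = {
    'action': 'mobileview',
    'page': 'Main Page',
    'prop': 'pageprops',
    'pageprops': 'mainpage',
    'format': 'json',
    'formatversion': 2,   # real booleans, UTF-8 output
}
r = requests.get('http://en.wikipedia.beta.wmflabs.org/w/api.php', params=params)
data = r.json()
# With formatversion=2 this is a real boolean; absent still means false.
print(data['mobileview'].get('mainpage', False))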

Cheers,
-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [reading-wmf] Fwd: Mobile Firendly

2015-05-29 Thread Jon Katz
+external mobile and wikitech

Shoot.  I meant to send this to the external list.  For those of you just
joining us, we recently got an email from Google letting us know that some
of our pages are now failing the mobile-friendly test, which has an adverse
impact on our search results.  It appears that most of our pages are also
blocking style info.

Google doesn't offer much insight into penalties, but if they're sending us
the email then it is or will have some impact.  I'd like to see if we can
better understand the other side of the equation: what is the cost of
fixing it?  I think Jon's questions below are the ones to start with.

-J


On Fri, May 29, 2015 at 1:35 PM, Jon Robson jrob...@wikimedia.org wrote:

 It's all in the report. We block w/ in robots.txt always have.

 There have been a bunch of changes to improve performance of the site
 overall which might have led to that. Mailing this to the public mailing
 list wikitech would give you a better idea.

 Our site /is/ mobile friendly it's just we tell google not to load styles
 from w/load.php so no need to panic.

 The question is what is the penalty of us failing this tool? Does it
 impact our google search rankings?

 Fix is trivial.. Update robots.txt but first the question is why are we
 blocking scripts and styles on that url?
 On 29 May 2015 1:17 pm, Jon Katz jk...@wikimedia.org wrote:

 Hi Readership team and broader community,
 Any changes we might have recently made to cause this warning to appear
 about googlebot not being able to access our site?
 I consider this to be a very serious issue.
 The examples below are not mobile, but same issue applies when you try an
 en.m. version of the pages.

 Best,
 Jon


 -- Forwarded message --
 From: Wes Moran wmo...@wikimedia.org
 Date: Fri, May 29, 2015 at 12:47 PM
 Subject: Mobile Firendly
 To: Jon Katz jk...@wikimedia.org
 Cc: Adam Baso ab...@wikimedia.org


 Jon,

 Google notified us of the followin...

 We recently noticed that there was a change with how you embed CSS & JS
 which results in us not being able to use the CSS & JS to recognize what
 the page looks like. That's making some of your pages fail the
 mobile-friendly-test, for example. You used to load CSS & JS from
 bits.wikimedia.org, but now they're loaded through /w/load.php?...
 directly from the Wikipedia host, where that path is blocked by robots.txt.

 You can see how we render the pages with Fetch as Google in Webmaster
 Tools / Search Console, you can also see most of that with the test-page at:

 https://www.google.com/webmasters/tools/mobile-friendly/?url=http%3A%2F%2Fen.wikipedia.org%2Fwiki%2FBusot

 Some of the pages still pass the test there (example
 https://www.google.com/webmasters/tools/mobile-friendly/?url=http%3A%2F%2Fen.wikipedia.org%2Fwiki%2FZ%25C3%25BCrich),
 but the CSS is broken there too since it's blocked. 

 Any ideas what can be causing this?

 Regards,
 Wes


 ___
 reading-wmf mailing list
 reading-...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/reading-wmf


 ___
 reading-wmf mailing list
 reading-...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/reading-wmf


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [reading-wmf] Fwd: Mobile Firendly

2015-05-29 Thread Jon Katz
Since Brandon's email a minute ago crossed mine in the ether and did not
make it to external lists, I am pasting it here:

On Fri, May 29, 2015 at 1:53 PM, Brandon Black bbl...@wikimedia.org wrote:
We've merged up https://gerrit.wikimedia.org/r/#/c/214705/1 to address
this, and I've purged the caches to ensure the update is fully live
already.  It should address the issue, assuming that the block on RL's
/w/load.php entry point was the only part of the problem.
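
One quick way to double-check from the outside is the Python stdlib
robots.txt parser; a sketch (the load.php query string below is just an
illustrative ResourceLoader-style URL, and Google's crawler may read the
rules slightly differently than this parser does):

from urllib import robotparser  # Python 3 stdlib

rp = robotparser.RobotFileParser()
rp.set_url('https://en.wikipedia.org/robots.txt')
rp.read()

# A ResourceLoader-style stylesheet URL of the kind Google was fetching.
css_url = 'https://en.wikipedia.org/w/load.php?modules=site&only=styles'
print(rp.can_fetch('Googlebot', css_url))  # should be True once the fix is live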

On Fri, May 29, 2015 at 1:55 PM, Jon Katz jk...@wikimedia.org wrote:

 +external mobile and wikitech

 Shoot.  I meant to send this to the external list.  For those of you just
 joining us, we recently got an email from Google letting us know that some
 of our pages are now failing the mobile-friendly test, which has an adverse
 impact on our search results.  It appears that most of our pages are also
 blocking style info.

 Google doesn't offer much insight into penalties, but if they're sending
 us the email then it is or will have some impact.  I'd like to see if we
 can better understand the other side of the equation: what is the cost of
 fixing it?  I think Jon's questions below are the ones to start with.

 -J


 On Fri, May 29, 2015 at 1:35 PM, Jon Robson jrob...@wikimedia.org wrote:

 It's all in the report. We block w/ in robots.txt always have.

 There have been a bunch of changes to improve performance of the site
 overall which might have led to that. Mailing this to the public mailing
 list wikitech would give you a better idea.

 Our site /is/ mobile friendly it's just we tell google not to load styles
 from w/load.php so no need to panic.

 The question is what is the penalty of us failing this tool? Does it
 impact our google search rankings?

 Fix is trivial.. Update robots.txt but first the question is why are we
 blocking scripts and styles on that url?
 On 29 May 2015 1:17 pm, Jon Katz jk...@wikimedia.org wrote:

 Hi Readership team and broader community,
 Any changes we might have recently made to cause this warning to appear
 about googlebot not being able to access our site?
 I consider this to be a very serious issue.
 The examples below are not mobile, but same issue applies when you try
 an en.m. version of the pages.

 Best,
 Jon


 -- Forwarded message --
 From: Wes Moran wmo...@wikimedia.org
 Date: Fri, May 29, 2015 at 12:47 PM
 Subject: Mobile Firendly
 To: Jon Katz jk...@wikimedia.org
 Cc: Adam Baso ab...@wikimedia.org


 Jon,

 Google notified us of the followin...

 We recently noticed that there was a change with how you embed CSS & JS
 which results in us not being able to use the CSS & JS to recognize what
 the page looks like. That's making some of your pages fail the
 mobile-friendly-test, for example. You used to load CSS & JS from
 bits.wikimedia.org, but now they're loaded through /w/load.php?...
 directly from the Wikipedia host, where that path is blocked by robots.txt.

 You can see how we render the pages with Fetch as Google in Webmaster
 Tools / Search Console, you can also see most of that with the test-page at:

 https://www.google.com/webmasters/tools/mobile-friendly/?url=http%3A%2F%2Fen.wikipedia.org%2Fwiki%2FBusot

 Some of the pages still pass the test there (example
 https://www.google.com/webmasters/tools/mobile-friendly/?url=http%3A%2F%2Fen.wikipedia.org%2Fwiki%2FZ%25C3%25BCrich),
 but the CSS is broken there too since it's blocked. 

 Any ideas what can be causing this?

 Regards,
 Wes


 ___
 reading-wmf mailing list
 reading-...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/reading-wmf


 ___
 reading-wmf mailing list
 reading-...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/reading-wmf



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Testing pywikibot on beta wikis (Was: Simplifying the WMF deployment cadence)

2015-05-29 Thread John Mark Vandenberg
On Fri, May 29, 2015 at 3:14 PM, John Mark Vandenberg jay...@gmail.com wrote:
 On Fri, May 29, 2015 at 2:09 AM, Brian Wolff bawo...@gmail.com wrote:
 Shouldn't such [pywikibot] tests be run against beta wiki, not testwiki?

 Primarily this hasn't happened because pywikibot doesn't have a family
 file for the beta wiki, but that is our issue, so I've done a simple
 test run on beta wiki.

I've created a task for this little project:

https://phabricator.wikimedia.org/T100796

-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Testing pywikibot on beta wikis (Was: Simplifying the WMF deployment cadence)

2015-05-29 Thread Brad Jorsch (Anomie)
On Fri, May 29, 2015 at 7:01 AM, John Mark Vandenberg jay...@gmail.com
wrote:

 On Fri, May 29, 2015 at 3:14 PM, John Mark Vandenberg jay...@gmail.com
 wrote:
  ..
  - paraminfo failure that breaks pywikibot's dynamic API detection;
  pywikibot batch-loads the paraminfo for all modules, for example
  issuing the following API request as part of initialisation, which
  returns a 503 Service Unavailable:
 
 http://en.wikipedia.beta.wmflabs.org/w/api.php?action=paraminfo&modules=abusefiltercheckmatch|abusefilterchecksyntax|abusefilterevalexpression|abusefilterunblockautopromote|antispoof|block|bouncehandler|centralauthtoken|centralnoticebannerchoicedata|centralnoticequerycampaign|checktoken|cirrus-config-dump|cirrus-mapping-dump|cirrus-settings-dump|clearhasmsg|compare|createaccount|cxconfiguration|cxdelete|cxpublish|delete|deleteglobalaccount|echomarkread|echomarkseen|edit|editlist|editmassmessagelist|emailuser|expandtemplates|fancycaptchareload|featuredfeed|feedcontributions|feedrecentchanges|feedwatchlist|filerevert|flowthank|globalblock|globaluserrights|help|imagerotate|import|jsonconfig|languagesearch|login|logout|managetags|massmessage|mobileview|move|opensearch&maxlag=5&format=json
 
  A little digging shows that the problematic module is fancycaptchareload
 
 
 http://en.wikipedia.beta.wmflabs.org/w/api.php?action=paraminfo&modules=fancycaptchareload&maxlag=5&format=json
 
  Our servers are currently experiencing a technical problem. This is
  probably temporary and should be fixed soon. Please try again in a few
  minutes.
 
  If you report this error to the Wikimedia System Administrators,
  please include the details below.
  Request: GET
 http://en.wikipedia.beta.wmflabs.org/w/api.php?action=paraminfo&modules=fancycaptchareload&maxlag=5&format=json
 ,
  from 127.0.0.1 via deployment-cache-text02 deployment-cache-text02
  ([127.0.0.1]:3128), Varnish XID 855090368
  Forwarded for: [your IP], 127.0.0.1
  Error: 503, Service Unavailable at Fri, 29 May 2015 07:57:09 GMT 
 
  Could someone investigate this / fix that module?

 After pywikibot devs cleared some other test breakages (our problems)
 we see this problem is also affecting the test wikis, which I assume
 means this bug will be affecting real wikis soon, so I've created an
 Unbreak Now! ticket for it and cc'd @greg.

 https://phabricator.wikimedia.org/T100775


https://gerrit.wikimedia.org/r/#/c/214610/

I'll very likely go ahead and backport it to 1.26wmf8 as soon as it's
merged and I confirm Beta is fixed.



-- 
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Simplifying the WMF deployment cadence

2015-05-29 Thread Brad Jorsch (Anomie)
On Thu, May 28, 2015 at 2:39 PM, John Mark Vandenberg jay...@gmail.com
wrote:

 [T96942 https://phabricator.wikimedia.org/T96942] was reported by
 pywikibot devs almost as soon as we detected that
 the test wikis were failing in our travis-ci tests.


At 20:22 in the timezone of the main API developer (me).


 It was 12 hours before a MediaWiki API fix was submitted to Gerrit,


09:31, basically first thing in the morning for the main API developer.
There's really not much to complain about there.


 and it took four additional *days* to get merged.


That part does suck.


 This also doesn't give clients sufficient time to work around
 MediaWiki's wonderful intentional API breakages, e.g. raw continue,
 which completely broke pywikibot and needed a large chunk of code
 rewritten urgently, both for pywikibot core and the much older and
 harder to fix pywikibot compat, which is still used as part of
 processes that wiki communities rely on.


The continuation change hasn't actually broken anything yet. It's coming
soon though. Nor should a large chunk of code *need* rewriting; just add
one parameter to your action=query requests.
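
(If I read the change right, that single parameter is rawcontinue, to
keep the old behaviour, or an explicit empty continue to opt in to the
new one.) For clients that do switch, the new-style loop is small; a
sketch with an illustrative list=allpages query:

import requests

API = 'https://en.wikipedia.org/w/api.php'
params = {'action': 'query', 'list': 'allpages', 'aplimit': 50,
          'format': 'json', 'continue': ''}   # opt in to the new continuation

while True:
    data = requests.get(API, params=params).json()
    for page in data['query']['allpages']:
        print(page['title'])
    if 'continue' not in data:
        break
    # Feed whatever the server returned straight back into the next request.
    params.update(data['continue'])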

Unless pywikibot was treating warnings as errors and that's what broke it?
Or you're referring to unit tests rather than actual breakage? But a notice
about the warnings was sent to mediawiki-api-announce in September 2014,[1]
a bit over a month before the warnings started.[2]

 [1]:
https://lists.wikimedia.org/pipermail/mediawiki-api-announce/2014-September/69.html
 [2]: https://gerrit.wikimedia.org/r/#/c/160222/


 Another example is the action=help rewrite not being backwards
 compatible. pywikibot wasn't broken, as it only uses the help module
 for older MW releases; but it wouldn't surprise me if there are clients
 that were parsing the help text and they would have been broken.


Comments on that and other proposed changes were requested on
mediawiki-api-announce in July 2014,[3] three months before the change[4]
was merged. No concerns were raised at the requested location[5] or on the
mediawiki-api mailing list.

 [3]:
https://lists.wikimedia.org/pipermail/mediawiki-api-announce/2014-July/62.html
 [4]: https://gerrit.wikimedia.org/r/#/c/160798/
 [5]:
https://www.mediawiki.org/wiki/API/Architecture_work/Planning#HTMLizing_action.3Dhelp

-- 
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [WikimediaMobile] [reading-wmf] Fwd: Mobile Firendly

2015-05-29 Thread Max Semenik
Already fixed by Brandon and Ori: https://gerrit.wikimedia.org/r/#/c/214705/

On Fri, May 29, 2015 at 1:55 PM, Jon Katz jk...@wikimedia.org wrote:

 +external mobile and wikitech

 Shoot.  I meant to send this to the external list.  For those of you just
 joining us, we recently got an email from Google letting us know that some
 of our pages are now failing the mobile-friendly test, which has an adverse
 impact on our search results.  It appears that most of our pages are also
 blocking style info.

 Google doesn't offer much insight into penalties, but if they're sending
 us the email then it is or will have some impact.  I'd like to see if we
 can better understand the other side of the equation: what is the cost of
 fixing it?  I think Jon's questions below are the ones to start with.

 -J


 On Fri, May 29, 2015 at 1:35 PM, Jon Robson jrob...@wikimedia.org wrote:

 It's all in the report. We block w/ in robots.txt always have.

 There have been a bunch of changes to improve performance of the site
 overall which might have led to that. Mailing this to the public mailing
 list wikitech would give you a better idea.

 Our site /is/ mobile friendly it's just we tell google not to load styles
 from w/load.php so no need to panic.

 The question is what is the penalty of us failing this tool? Does it
 impact our google search rankings?

 Fix is trivial.. Update robots.txt but first the question is why are we
 blocking scripts and styles on that url?
 On 29 May 2015 1:17 pm, Jon Katz jk...@wikimedia.org wrote:

 Hi Readership team and broader community,
 Any changes we might have recently made to cause this warning to appear
 about googlebot not being able to access our site?
 I consider this to be a very serious issue.
 The examples below are not mobile, but same issue applies when you try
 an en.m. version of the pages.

 Best,
 Jon


 -- Forwarded message --
 From: Wes Moran wmo...@wikimedia.org
 Date: Fri, May 29, 2015 at 12:47 PM
 Subject: Mobile Firendly
 To: Jon Katz jk...@wikimedia.org
 Cc: Adam Baso ab...@wikimedia.org


 Jon,

 Google notified us of the followin...

 We recently noticed that there was a change with how you embed CSS & JS
 which results in us not being able to use the CSS & JS to recognize what
 the page looks like. That's making some of your pages fail the
 mobile-friendly-test, for example. You used to load CSS & JS from
 bits.wikimedia.org, but now they're loaded through /w/load.php?...
 directly from the Wikipedia host, where that path is blocked by robots.txt.

 You can see how we render the pages with Fetch as Google in Webmaster
 Tools / Search Console, you can also see most of that with the test-page at:

 https://www.google.com/webmasters/tools/mobile-friendly/?url=http%3A%2F%2Fen.wikipedia.org%2Fwiki%2FBusot

 Some of the pages still pass the test there (example
 https://www.google.com/webmasters/tools/mobile-friendly/?url=http%3A%2F%2Fen.wikipedia.org%2Fwiki%2FZ%25C3%25BCrich),
 but the CSS is broken there too since it's blocked. 

 Any ideas what can be causing this?

 Regards,
 Wes


 ___
 reading-wmf mailing list
 reading-...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/reading-wmf


 ___
 reading-wmf mailing list
 reading-...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/reading-wmf



 ___
 Mobile-l mailing list
 mobil...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/mobile-l




-- 
Best regards,
Max Semenik ([[User:MaxSem]])
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [WikimediaMobile] [reading-wmf] Fwd: Mobile Firendly

2015-05-29 Thread Pine W
Hmm. Why would we block robots there?

Pine
On May 29, 2015 1:55 PM, Jon Katz jk...@wikimedia.org wrote:

 +external mobile and wikitech

 Shoot.  I meant to send this to the external list.  For those of you just
 joining us, we recently got an email from Google letting us know that some
 of our pages are now failing the mobile-friendly test, which has an adverse
 impact on our search results.  It appears that most of our pages are also
 blocking style info.

 Google doesn't offer much insight into penalties, but if they're sending
 us the email then it is or will have some impact.  I'd like to see if we
 can better understand the other side of the equation: what is the cost of
 fixing it?  I think Jon's questions below are the ones to start with.

 -J


 On Fri, May 29, 2015 at 1:35 PM, Jon Robson jrob...@wikimedia.org wrote:

 It's all in the report. We block w/ in robots.txt always have.

 There have been a bunch of changes to improve performance of the site
 overall which might have led to that. Mailing this to the public mailing
 list wikitech would give you a better idea.

 Our site /is/ mobile friendly it's just we tell google not to load styles
 from w/load.php so no need to panic.

 The question is what is the penalty of us failing this tool? Does it
 impact our google search rankings?

 Fix is trivial.. Update robots.txt but first the question is why are we
 blocking scripts and styles on that url?
 On 29 May 2015 1:17 pm, Jon Katz jk...@wikimedia.org wrote:

 Hi Readership team and broader community,
 Any changes we might have recently made to cause this warning to appear
 about googlebot not being able to access our site?
 I consider this to be a very serious issue.
 The examples below are not mobile, but same issue applies when you try
 an en.m. version of the pages.

 Best,
 Jon


 -- Forwarded message --
 From: Wes Moran wmo...@wikimedia.org
 Date: Fri, May 29, 2015 at 12:47 PM
 Subject: Mobile Firendly
 To: Jon Katz jk...@wikimedia.org
 Cc: Adam Baso ab...@wikimedia.org


 Jon,

 Google notified us of the followin...

 We recently noticed that there was a change with how you embed CSS & JS
 which results in us not being able to use the CSS & JS to recognize what
 the page looks like. That's making some of your pages fail the
 mobile-friendly-test, for example. You used to load CSS & JS from
 bits.wikimedia.org, but now they're loaded through /w/load.php?...
 directly from the Wikipedia host, where that path is blocked by robots.txt.

 You can see how we render the pages with Fetch as Google in Webmaster
 Tools / Search Console, you can also see most of that with the test-page at:

 https://www.google.com/webmasters/tools/mobile-friendly/?url=http%3A%2F%2Fen.wikipedia.org%2Fwiki%2FBusot

 Some of the pages still pass the test there (example
 https://www.google.com/webmasters/tools/mobile-friendly/?url=http%3A%2F%2Fen.wikipedia.org%2Fwiki%2FZ%25C3%25BCrich),
 but the CSS is broken there too since it's blocked. 

 Any ideas what can be causing this?

 Regards,
 Wes


 ___
 reading-wmf mailing list
 reading-...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/reading-wmf


 ___
 reading-wmf mailing list
 reading-...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/reading-wmf



 ___
 Mobile-l mailing list
 mobil...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/mobile-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l