Re: [Wikitech-l] Urgent help needed from WMF admins

2016-09-22 Thread Gergo Tisza
On Thu, Sep 22, 2016 at 7:53 PM, James Alexander 
wrote:

> Thank you very much Max (and Gergő and anyone else who helped here).
>

No, I was only involved in breaking it :) Max had just finished fixing it by
the time I got there (thanks indeed!)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Urgent help needed from WMF admins

2016-09-22 Thread James Alexander
Thank you very much Max (and Gergő and anyone else who helped here).

James

*James Alexander*
Manager, Trust & Safety

On Thu, Sep 22, 2016 at 10:15 PM Max Semenik  wrote:

> Fixed.
>
> On Thu, Sep 22, 2016 at 6:43 PM, Huji Lee  wrote:
>
> > Please see https://phabricator.wikimedia.org/T146440 and assist.
>
>
>
>
> --
> Best regards,
> Max Semenik ([[User:MaxSem]])

Re: [Wikitech-l] Urgent help needed from WMF admins

2016-09-22 Thread Max Semenik
Fixed.

On Thu, Sep 22, 2016 at 6:43 PM, Huji Lee  wrote:

> Please see https://phabricator.wikimedia.org/T146440 and assist.




-- 
Best regards,
Max Semenik ([[User:MaxSem]])

[Wikitech-l] Urgent help needed from WMF admins

2016-09-22 Thread Huji Lee
Please see https://phabricator.wikimedia.org/T146440 and assist.

Re: [Wikitech-l] Debian and Ubuntu packages for MediaWiki now available

2016-09-22 Thread David Gerard
Don't forget to update the wiki:

https://www.mediawiki.org/wiki/Ubuntu
https://www.mediawiki.org/wiki/Debian


- d.


On 22 September 2016 at 07:44, Legoktm  wrote:
> Hi,
>
> I'm very excited to announce that Debian and Ubuntu packages for
> MediaWiki are now available. These packages will follow the MediaWiki
> LTS schedule and currently contain 1.27.1. If you've always wanted to
> "sudo apt install mediawiki", then this is for you :)
>
> For Debian users, you can get the mediawiki package for Jessie from the
> official Debian repositories using jessie-backports, and it will be
> included in the upcoming Stretch release.
>
> Ubuntu users will need to enable a PPA for Xenial or Trusty to set up
> the package.
>
> Instructions, links, and help can all be found at
> . Please let me
> know if you have any questions or feedback.
>
> Finally, thanks to Luke Faraone, Faidon Liambotis, Moritz Mühlenhoff,
> Max Semenik, Chad Horohoe, Antoine Musso, and all of the beta testers of
> the package for making this a reality.
>
> -- Legoktm
>

[Wikitech-l] Recent changes, notifications & pageprops

2016-09-22 Thread Stas Malyshev
Hi!

I'd like to raise a topic of handling change notifications and
up-to-date-ness of Wiki pages data with relation to page props.

First, a little background on how I arrived at the issue. I maintain the
Wikidata Query Service, which updates from Wikidata using the recent
changes API and the RDF export format for Wikidata pages. Recently, we
started using certain page properties, such as link and statement counts.
This is when I discovered the issue: the page properties are not updated
when the page (Wikidata item) is edited, but later, as I understand it, by
a deferred job.

This leads to a situation where, when I see a recent changes entry and
look at the RDF export page - which now contains data derived from page
props - I cannot tell whether the page props data is up to date.
Moreover, if the job updates the page props some unknown and undefined
time later, I get no notification, since the modification is not
reflected in recent changes. This makes it very hard to use information
derived from page props: you never know whether the data is stale or
whether the data in page props matches the data in the page.
The problem is described in more detail in
https://phabricator.wikimedia.org/T145712
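
For a consumer like the query service, a conservative workaround is to treat page props as possibly stale whenever no props refresh has been observed since the latest revision. A minimal sketch of that check, assuming the caller tracks the refresh time itself (MediaWiki does not expose a "props last refreshed" timestamp; both parameter names are hypothetical):

```python
from datetime import datetime, timezone
from typing import Optional

WIKI_TS = "%Y-%m-%dT%H:%M:%SZ"  # MediaWiki API timestamp format

def props_maybe_stale(rev_timestamp: str, props_refreshed: Optional[str]) -> bool:
    """Return True when page_props may lag behind the latest edit.

    rev_timestamp comes from the recent changes entry; props_refreshed
    is the (caller-tracked) time the props were last seen updated, or
    None when no refresh has been observed at all.
    """
    if props_refreshed is None:
        return True  # never saw a props update: assume stale
    rev = datetime.strptime(rev_timestamp, WIKI_TS).replace(tzinfo=timezone.utc)
    props = datetime.strptime(props_refreshed, WIKI_TS).replace(tzinfo=timezone.utc)
    # If the deferred job has not run since the edit, the props
    # snapshot predates the revision and must be treated as stale.
    return props < rev
```

This only bounds the problem; as noted above, without a notification for the props update there is no way to know when the deferred job actually ran.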

I'd like to find a solution, but I'm not sure how to proceed.
The data specific to this case could easily be generated from the data
already present in memory during the page update, but I assume there
were reasons why it was deferred.
We could emit some kind of notification when updating page props, though
that would probably significantly increase the number of notifications and
thus slow down updates. Also, in some cases a second notification may be
unnecessary, because the page props were updated before I processed the
first one, but I currently have no way of knowing that.

Any advice on how to solve this issue?
-- 
Stas Malyshev
smalys...@wikimedia.org


Re: [Wikitech-l] Memento MediaWiki Extension at the W3C

2016-09-22 Thread Shawn Jones
Dear Rob,

Thanks for your response.

I would like to start by pointing out that the 2013 Phabricator ticket
pertained to a significantly different version of the Memento MediaWiki
extension. It is now 2016, and the current version of the extension was
developed from scratch with a grant from the Andrew W. Mellon Foundation,
and with input from MediaWiki developers, both on and off the wikitech-l
list. As a matter of fact, there are currently two Memento extensions: one
that only adds the HTTP headers used by the Memento protocol and relies on
an external TimeGate, and another that implements all aspects of the
Memento protocol. In our work with the W3C, the latter was used, bringing
all aspects of Memento to the W3C wiki.

Regarding the issues you raised:

(1) Memento supports caching. The protocol uses two registered HTTP
response headers: Link and Memento-Datetime. Just like all HTTP headers,
they are cacheable, both for topic pages (which use Link headers) and oldid
pages (which use both Link and Memento-Datetime headers). TimeGate
responses use the 302 HTTP status code; they have no body and are not
cacheable by default [1]. The Memento protocol works with caches, not
against them. As a matter of fact, prior to being published as RFC7089, the
specification was thoroughly assessed on behalf of the IETF by Mark
Nottingham, an expert on web caching.
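
As a concrete illustration of point (1), here is a sketch of building the two response headers for an oldid page. The URL is a placeholder, and a real implementation would add further Link relations (timegate, timemap, first/last memento) as described in RFC 7089:

```python
from datetime import datetime, timezone
from email.utils import format_datetime

def memento_headers(original_url: str, when: datetime) -> dict:
    """Minimal RFC 7089 response headers for a version (oldid) page."""
    return {
        # Memento-Datetime must be an RFC 1123 date in GMT
        "Memento-Datetime": format_datetime(when, usegmt=True),
        # Link header pointing back at the original resource
        "Link": '<%s>; rel="original"' % original_url,
    }

h = memento_headers(
    "https://example.org/wiki/Page",
    datetime(2016, 9, 22, 18, 43, tzinfo=timezone.utc),
)
```

Because both values are ordinary response headers, intermediate caches can store them alongside the page body, which is the basis of the caching argument above.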

(2) Supporting the Memento protocol is scalable. This is exemplified by
its adoption by 16 web archives around the world, including the massive
Internet Archive, which has exposed Memento TimeGate and TimeMap
end-points for many years. We have never heard any concerns regarding
scalability from any of these web archives. Quite to the contrary, many web
archives use OpenWayback software to replay pages and an effort is ongoing
to build the APIs for the new OpenWayback version around the Memento
protocol [2]. In addition, we have specifically assessed the performance
impact of the MediaWiki extensions for Memento and found it to be
negligible [3].

(3) Memento supports various obvious use cases related to “web time travel”
that involve Wikipedia resources. It is hard to assess how big a user group
would be interested in these, because we face a chicken-and-egg situation.
Clearly, the Wikipedia editors who were involved in the Wikipedia RFC about
Memento thought that supporting the protocol would be valuable. Below I list
some use cases, and I want to emphasize that they can be supported both for
machine clients and for browsers. Browsers currently do not natively support
Memento, but extensions are available, and it is also possible to build
Memento functionality into web pages using JavaScript.

3.1 Memento TimeMaps are an RFC-specified (read “standard”) approach to
expose a version history for Wikipedia topic pages. The recent W3C Data on
the Web Best Practices [4] recommends the use of Memento TimeMaps for
exposing resource version history. The same document also recommends using
Memento TimeGates for access to temporal resource versions.

3.2 Memento can be used for intra-site time travel, as desired for
Wikipedia by Vernor Vinge [5]. This is important for a variety of reasons.
Historical researchers can easily determine what the state of
(inter-linked) Wikipedia topic pages was at a specific date. Memento can
also be used to avoid spoilers for current TV shows and sports, something
that has been desired by users and discussed at Wikipedia [6, 7].

3.3 Memento can be used for inter-site time travel that involves Wikipedia
pages. A user can set a sticky date in the past and visit pages around that
date across web archives and version control systems that support Memento.
For example, we recently conducted a study of almost 400,000 academic
papers containing URI references and found that more than 3,300 research
papers published between 2003 and 2012 reference Wikipedia articles, a number
that grows each year. The content of these Wikipedia articles has most
likely changed since the time the research paper was published. Using
Memento, a user can revisit the state of the Wikipedia page as it was at
the time the referencing paper was published by using the paper’s
publication date as the sticky date. As per the above, the user can then
also keep navigating subject to that date, visiting both version pages in
Wikipedia (for internal links) and archived pages in web archives (for
external links). This way time travel is seamless, allowing a user to stay
fixed to a datetime regardless of which web site they visit. Note that
Wikipedia itself uses Memento in this manner to link to archived content in
web archives in case of broken external links, using a Memento library that
we developed for them [8].
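
The sticky-date navigation described in 3.2 and 3.3 reduces to a TimeGate-style selection: given a resource's version history, pick the memento whose datetime is closest to the chosen date. A sketch (RFC 7089 leaves the exact selection policy to the server; "closest" is one common choice, and the history data here is invented):

```python
from datetime import datetime
from typing import List, Tuple

def closest_memento(mementos: List[Tuple[datetime, str]],
                    target: datetime) -> Tuple[datetime, str]:
    """Pick the (datetime, url) pair nearest to the sticky date."""
    return min(mementos, key=lambda m: abs(m[0] - target))

history = [
    (datetime(2016, 1, 1), "https://example.org/?oldid=100"),
    (datetime(2016, 6, 1), "https://example.org/?oldid=200"),
    (datetime(2016, 12, 1), "https://example.org/?oldid=300"),
]
best = closest_memento(history, datetime(2016, 5, 20))
```

A client that keeps `target` fixed while following links gets exactly the "stay fixed to a datetime regardless of which web site they visit" behavior described above.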

3.4 There is a growing interest in the use of historic web content, with
efforts such as Archives Unleashed [9] promoting studies in a variety of
fields, including sociology and history. With Memento, a researcher can
gather content from a variety 

Re: [Wikitech-l] Maintenance of WMF project footers

2016-09-22 Thread Deborah Tankersley
The Discovery team has also investigated updating the footer, and there are
two tickets for possibly taking on this work: one for the work to do and
one for community involvement.

To date, we haven't done much work or investigation on either ticket.

Cheers,

Deb




--
Deb Tankersley
Product Manager, Discovery
IRC: debt
Wikimedia Foundation

On Thu, Sep 22, 2016 at 12:48 PM, Ryan Kaldari 
wrote:

> The footers are controlled at 3 different levels:
> * The interface messages built into MediaWiki
> * Hook handlers that override those interface messages with other interface
> messages specifically for Wikimedia projects in the WikimediaMessages
> extension
> * Local message overrides controlled by local wiki admins
> There are also two different sets of footers (and corresponding interface
> messages), one set for desktop and another set for mobile.
>
> So the answer to your question is "a lot of different people". On the WMF
> side, I would say that probably no one "owns" the footers, but I would
> nominate the Reading team to address any issues (since they are most
> familiar with the mobile components). On the community side, you can check
> the histories of the following interface messages on a particular wiki:
> * wikimedia-copyright
> * privacy
> * aboutsite
> * disclaimers
> * contact
> * wikimedia-developers
> * wikimedia-cookiestatement
> * mobile-frontend-view
>
>
> On Thu, Sep 22, 2016 at 11:36 AM, Zhou Zhou  wrote:
>
> > Hi,
> >
> > Would anyone know who maintains the footers on our projects below linking
> > to our policies and terms? The second footer in particular appears to be
> > set automatically somewhere.
> >
> >
> >- Text is available under the Creative Commons Attribution-ShareAlike
> >License; additional terms may apply. By using this site, you agree to
> >the Terms of Use and Privacy Policy. Wikipedia® is a registered
> >trademark of the Wikimedia Foundation, Inc., a non-profit organization.
> >
> >- Privacy policy
> >- About Wikipedia
> >- Disclaimers
> >- Contact Wikipedia
> >- Developers
> >- Cookie statement
> >- Mobile view
> >
> >
> > --
> > Thanks!
> >
> > Zhou
> >
> > Zhou Zhou
> > Legal Counsel
> > Wikimedia Foundation
> > 149 New Montgomery Street, 6th Floor
> > San Francisco, CA 94105
> > zz...@wikimedia.org
> >
> > NOTICE: This message might have confidential or legally privileged
> > information in it. If you have received this message by accident, please
> > delete it and let us know about the mistake. As an attorney for the
> > Wikimedia Foundation, for legal/ethical reasons I cannot give legal
> advice
> > to, or serve as a lawyer for, community members, volunteers, or staff
> > members in their personal capacity. For more on what this means, please
> see
> > our legal disclaimer
> > .

Re: [Wikitech-l] Maintainers needed for Grrrit-wm

2016-09-22 Thread Daniel Zahn
You might want to remove yourself from

https://www.mediawiki.org/wiki/Grrrit-wm

and the translated versions of that, like
https://www.mediawiki.org/wiki/Grrrit-wm/es
https://www.mediawiki.org/wiki/Grrrit-wm/ja

I just ran into it via Google, e.g. "grrrit-wm es un bot de IRC manejado
por Yuvipanda." ("grrrit-wm is an IRC bot managed by Yuvipanda."), and
Paladox pointed me to this mail.




On Mon, Sep 12, 2016 at 9:34 AM, Yuvi Panda  wrote:
> Hello!
>
> Grrrit-wm[1] is an IRC bot that reports Gerrit activity to IRC. I've
> been maintaining it for a few years now, but I no longer have the
> bandwidth to do this. I'm going to remove myself as maintainer
> shortly. It already has a bunch of people as maintainers on tool labs,
> and anyone with +2 on mediawiki core can merge changes in it.
> Hopefully that's all good enough :)
>
> Thanks for all the fish!
>
> [1] https://wikitech.wikimedia.org/wiki/Grrrit-wm
>
> --
> Yuvi Panda T
> http://yuvi.in/blog
>



-- 
Daniel Zahn 
Operations Engineer


[Wikitech-l] WikiDev'17 main topics -- more feedback needed

2016-09-22 Thread Quim Gil
Hi, we need more voices and perspectives in the definition of the main
topics of the Wikimedia developer Summit 2017. Your participation is very
important!

RobLa went through the suggestions made and has proposed an organization
around main topics, and the discussion has started. Please check
https://www.mediawiki.org/wiki/WikiDev17/Topic_ideas and the discussion
page.

Rachel and I have decided to make a soft-launch of the Summit by opening
the registration and travel sponsorship requests only, because we cannot
keep postponing this step. You should hear news from us tomorrow.

We will launch properly, opening the call for participation and promoting
the Summit widely in Wikimedia channels and beyond, hopefully next week,
when we have a "good enough" list of main topics.

-- 
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil

Re: [Wikitech-l] Maintenance of WMF project footers

2016-09-22 Thread Ryan Kaldari
The footers are controlled at 3 different levels:
* The interface messages built into MediaWiki
* Hook handlers that override those interface messages with other interface
messages specifically for Wikimedia projects in the WikimediaMessages
extension
* Local message overrides controlled by local wiki admins
There are also two different sets of footers (and corresponding interface
messages), one set for desktop and another set for mobile.

So the answer to your question is "a lot of different people". On the WMF
side, I would say that probably no one "owns" the footers, but I would
nominate the Reading team to address any issues (since they are most
familiar with the mobile components). On the community side, you can check
the histories of the following interface messages on a particular wiki:
* wikimedia-copyright
* privacy
* aboutsite
* disclaimers
* contact
* wikimedia-developers
* wikimedia-cookiestatement
* mobile-frontend-view
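
Those histories can be checked through the standard action API (prop=revisions on the page in the MediaWiki: namespace). A sketch of building such a query URL; the host and message name are examples:

```python
from urllib.parse import urlencode

def message_history_url(wiki_host: str, message_name: str, limit: int = 10) -> str:
    """URL listing recent revisions of an interface message page."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": "MediaWiki:" + message_name,  # interface messages live here
        "rvprop": "timestamp|user|comment",
        "rvlimit": limit,
        "format": "json",
    }
    return "https://%s/w/api.php?%s" % (wiki_host, urlencode(params))

url = message_history_url("en.wikipedia.org", "Wikimedia-copyright", limit=5)
```

Fetching that URL returns the editors and timestamps of the local overrides, which answers the "who maintains this" question for a given wiki.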


On Thu, Sep 22, 2016 at 11:36 AM, Zhou Zhou  wrote:

> Hi,
>
> Would anyone know who maintains the footers on our projects below linking
> to our policies and terms? The second footer in particular appears to be
> set automatically somewhere.
>
>
>- Text is available under the Creative Commons Attribution-ShareAlike
>License; additional terms may apply. By using this site, you agree to
>the Terms of Use and Privacy Policy. Wikipedia® is a registered
>trademark of the Wikimedia Foundation, Inc., a non-profit organization.
>
>- Privacy policy
>- About Wikipedia
>- Disclaimers
>- Contact Wikipedia
>- Developers
>- Cookie statement
>- Mobile view
>
>
> --
> Thanks!
>
> Zhou
>
> Zhou Zhou
> Legal Counsel
> Wikimedia Foundation
> 149 New Montgomery Street, 6th Floor
> San Francisco, CA 94105
> zz...@wikimedia.org
>
> NOTICE: This message might have confidential or legally privileged
> information in it. If you have received this message by accident, please
> delete it and let us know about the mistake. As an attorney for the
> Wikimedia Foundation, for legal/ethical reasons I cannot give legal advice
> to, or serve as a lawyer for, community members, volunteers, or staff
> members in their personal capacity. For more on what this means, please see
> our legal disclaimer
> .

[Wikitech-l] Maintenance of WMF project footers

2016-09-22 Thread Zhou Zhou
Hi,

Would anyone know who maintains the footers on our projects below linking
to our policies and terms? The second footer in particular appears to be
set automatically somewhere.


   - Text is available under the Creative Commons Attribution-ShareAlike
   License; additional terms may apply. By using this site, you agree to
   the Terms of Use and Privacy Policy. Wikipedia® is a registered
   trademark of the Wikimedia Foundation, Inc., a non-profit organization.

   - Privacy policy
   - About Wikipedia
   - Disclaimers
   - Contact Wikipedia
   - Developers
   - Cookie statement
   - Mobile view



-- 
Thanks!

Zhou

Zhou Zhou
Legal Counsel
Wikimedia Foundation
149 New Montgomery Street, 6th Floor
San Francisco, CA 94105
zz...@wikimedia.org

NOTICE: This message might have confidential or legally privileged
information in it. If you have received this message by accident, please
delete it and let us know about the mistake. As an attorney for the
Wikimedia Foundation, for legal/ethical reasons I cannot give legal advice
to, or serve as a lawyer for, community members, volunteers, or staff
members in their personal capacity. For more on what this means, please see
our legal disclaimer
.

Re: [Wikitech-l] Debian and Ubuntu packages for MediaWiki now available

2016-09-22 Thread Rob Lanphier
On Wed, Sep 21, 2016 at 11:44 PM, Legoktm 
wrote:

> Instructions, links, and help can all be found at
> . Please let me
> know if you have any questions or feedback.
>
> Finally, thanks to Luke Faraone, Faidon Liambotis, Moritz Mühlenhoff,
> Max Semenik, Chad Horohoe, Antoine Musso, and all of the beta testers of
> the package for making this a reality.
>

Congratulations, y'all!  This is a huge step toward making safe
installation of MediaWiki simple.  Thanks Legoktm for leading the charge!

Rob

Re: [Wikitech-l] Recently proposed patchsets by new contributors awaiting code review

2016-09-22 Thread Jon Robson
Andre,
Thanks for these e-mails. It really helps to have an overview of where we
can help; I appreciate them.

Would you also be open to flagging some of our oldest patches as part of
this mail? (I think you are right to keep the number of patches low; a
long list can be overwhelming.)

I just ran a Gerrit query and found these old patches that had no merge
conflicts. I'd love to get us to a point where at least core's patchsets
are weeks old rather than months. It seems these e-mails could be a good
mechanism for reaching the right people.

Some old patches:
Added custom label for links in category pages
https://gerrit.wikimedia.org/r/#/c/104905/

Add category name in ID property for extension row in Special:Version page
https://gerrit.wikimedia.org/r/#/c/275836/2



On Thu, 22 Sep 2016 at 06:49 Andre Klapper  wrote:

> Your help is welcome to provide feedback and guidance:
>
> == in "mediawiki/tools/mwdumper": ==
>
> since 2016-09-08:
> Major refactoring
> https://gerrit.wikimedia.org/r/#/c/309314/
>
> Thanks in advance for your reviews.
>
>
> Of last weeks' 5 listed patches, 4 got merged & 1 got reviewed. Thanks
> to Amire80, Daniel, FlorianSW, Jdlrobson, MatmaRex, Smalyshev, Tjones!
>
> andre
> --
> Andre Klapper | Wikimedia Bugwrangler
> http://blogs.gnome.org/aklapper/
>

[Wikitech-l] The Revision Scoring weekly update

2016-09-22 Thread Aaron Halfaker
Hey,

This is the 22nd weekly update from the revision scoring team that we have
sent to this mailing list.

UI work:

   - We configured the default threshold for the ORES review tool on
   Wikidata to be more strict (higher recall, lower precision)[1]


   - We fixed a display issue on Special:Contributions where the filters
   would not wrap[2]


Increasing model fitness:

   - We finished demonstrating model fitness gains using hash-vector
   features[3].  Next, we'll be working to get the hash-vector features
   implemented in revscoring/ORES[4].


   - We implemented a new strategy for training and testing on all data
   using cross-validation[5].  This will both increase the fitness of the
   models and make the statistics reported more robust.
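
As an aside, the hash-vector ("hashing trick") idea mentioned above can be sketched generically: an unbounded token space is folded into a fixed-length count vector by hashing each token into a bucket. This is only an illustration of the concept, not the revscoring/ORES implementation:

```python
import hashlib

def hash_vector(tokens, n_buckets=16):
    """Fold arbitrary tokens into a fixed-length count vector."""
    vec = [0] * n_buckets
    for tok in tokens:
        # A stable hash keeps the token-to-bucket mapping reproducible
        # across runs (unlike Python's salted built-in hash()).
        h = int(hashlib.md5(tok.encode("utf-8")).hexdigest(), 16)
        vec[h % n_buckets] += 1
    return vec

v = hash_vector(["the", "quick", "brown", "the"])
```

The trade-off is collisions: distinct tokens can share a bucket, which is usually acceptable in exchange for a bounded feature space and model size.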


Maintenance and robustness

   - We fixed an indexing issue in ores_model that prevented the
   deployment of updated models[6].


   - We briefly investigated a short period of degraded service
   quality on WMF Labs[7]


1. https://phabricator.wikimedia.org/T144784 -- Change default threshold
for Wikidata to high
2. https://phabricator.wikimedia.org/T143518 -- Filter on user contribs has
nowrap, causing issues
3. https://phabricator.wikimedia.org/T128087 -- [Spike] Investigate
HashingVectorizer
4. https://phabricator.wikimedia.org/T145812 -- Implement ~100 most
important hash vector features in editquality models
5. https://phabricator.wikimedia.org/T142953 -- Train on all data, Report
test statistics on cross-validation
6. https://phabricator.wikimedia.org/T144432 -- oresm_model index should
not be unique
7. https://phabricator.wikimedia.org/T145353 -- Investigate short period of
ores-web-03 insanity

Sincerely,
Aaron from the Revision Scoring team

Re: [Wikitech-l] Debian and Ubuntu packages for MediaWiki now available

2016-09-22 Thread Arthur Richards
Woah! This is rad! Awesome work :D

On Thu, Sep 22, 2016 at 8:51 AM Cyken Zeraux  wrote:

> It's cool to see more easy-deployment software being made. I have been
> working on a similar project meant to be like EasyEngine, since there are
> better technologies out there than just Apache and mod_php. This is a step
> in the right direction, though. The biggest barrier to developers and
> users having their own wikis is that installing one properly requires a
> lot of work.
>
> On Thu, Sep 22, 2016 at 10:42 AM, Bryan Davis  wrote:
>
> > On Thu, Sep 22, 2016 at 12:44 AM, Legoktm 
> > wrote:
> > > Hi,
> > >
> > > I'm very excited to announce that Debian and Ubuntu packages for
> > > MediaWiki are now available. These packages will follow the MediaWiki
> > > LTS schedule and currently contain 1.27.1. If you've always wanted to
> > > "sudo apt install mediawiki", then this is for you :)
> > >
> > > For Debian users, you can get the mediawiki package for Jessie from the
> > > official Debian repositories using jessie-backports, and it will be
> > > included in the upcoming Stretch release.
> > >
> > > Ubuntu users will need to enable a PPA for Xenial or Trusty to set up
> > > the package.
> > >
> > > Instructions, links, and help can all be found at
> > > . Please let me
> > > know if you have any questions or feedback.
> > >
> > > Finally, thanks to Luke Faraone, Faidon Liambotis, Moritz Mühlenhoff,
> > > Max Semenik, Chad Horohoe, Antoine Musso, and all of the beta testers
> of
> > > the package for making this a reality.
> >
> > Congratulations Legoktm! This is huge!
> >
> > For those who haven't been following this, Legoktm started working
> > seriously on this project during the Wikimania 2015 hackathon in
> > Mexico City. He has kept the process moving over these many months and
> > worked through lots and lots of blocking issues that would have
> > stopped most people. For me this is just one more example of why
> > Legoktm is awesome and deserving of public praise. :)
> >
> > Bryan
> > --
> > Bryan Davis  Wikimedia Foundation
> > [[m:User:BDavis_(WMF)]]  Sr Software EngineerBoise, ID USA
> > irc: bd808v:415.839.6885 x6855
> >

Re: [Wikitech-l] Debian and Ubuntu packages for MediaWiki now available

2016-09-22 Thread Cyken Zeraux
It's cool to see more easy-deployment software being made. I have been
working on a similar project meant to be like EasyEngine, since there are
better technologies out there than just Apache and mod_php. This is a step
in the right direction, though. The biggest barrier to developers and users
having their own wikis is that installing one properly requires a lot of
work.

On Thu, Sep 22, 2016 at 10:42 AM, Bryan Davis  wrote:

> On Thu, Sep 22, 2016 at 12:44 AM, Legoktm 
> wrote:
> > Hi,
> >
> > I'm very excited to announce that Debian and Ubuntu packages for
> > MediaWiki are now available. These packages will follow the MediaWiki
> > LTS schedule and currently contain 1.27.1. If you've always wanted to
> > "sudo apt install mediawiki", then this is for you :)
> >
> > For Debian users, you can get the mediawiki package for Jessie from the
> > official Debian repositories using jessie-backports, and it will be
> > included in the upcoming Stretch release.
> >
> > Ubuntu users will need to enable a PPA for Xenial or Trusty to set up
> > the package.
> >
> > Instructions, links, and help can all be found at
> > . Please let me
> > know if you have any questions or feedback.
> >
> > Finally, thanks to Luke Faraone, Faidon Liambotis, Moritz Mühlenhoff,
> > Max Semenik, Chad Horohoe, Antoine Musso, and all of the beta testers of
> > the package for making this a reality.
>
> Congratulations Legoktm! This is huge!
>
> For those who haven't been following this, Legoktm started working
> seriously on this project during the Wikimania 2015 hackathon in
> Mexico City. He has kept the process moving over these many months and
> worked through lots and lots of blocking issues that would have
> stopped most people. For me this is just one more example of why
> Legoktm is awesome and deserving of public praise. :)
>
> Bryan
> --
> Bryan Davis  Wikimedia Foundation
> [[m:User:BDavis_(WMF)]]  Sr Software EngineerBoise, ID USA
> irc: bd808v:415.839.6885 x6855
>

Re: [Wikitech-l] Debian and Ubuntu packages for MediaWiki now available

2016-09-22 Thread Bryan Davis
On Thu, Sep 22, 2016 at 12:44 AM, Legoktm  wrote:
> Hi,
>
> I'm very excited to announce that Debian and Ubuntu packages for
> MediaWiki are now available. These packages will follow the MediaWiki
> LTS schedule and currently contain 1.27.1. If you've always wanted to
> "sudo apt install mediawiki", then this is for you :)
>
> For Debian users, you can get the mediawiki package for Jessie from the
> official Debian repositories using jessie-backports, and it will be
> included in the upcoming Stretch release.
>
> Ubuntu users will need to enable a PPA for Xenial or Trusty to set up
> the package.
>
> Instructions, links, and help can all be found at
> . Please let me
> know if you have any questions or feedback.
>
> Finally, thanks to Luke Faraone, Faidon Liambotis, Moritz Mühlenhoff,
> Max Semenik, Chad Horohoe, Antoine Musso, and all of the beta testers of
> the package for making this a reality.

Congratulations Legoktm! This is huge!

For those who haven't been following this, Legoktm started working
seriously on this project during the Wikimania 2015 hackathon in
Mexico City. He has kept the process moving over these many months and
worked through lots and lots of blocking issues that would have
stopped most people. For me this is just one more example of why
Legoktm is awesome and deserving of public praise. :)

Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]]  Sr Software EngineerBoise, ID USA
irc: bd808v:415.839.6885 x6855


Re: [Wikitech-l] [Engineering] Long running tasks/scripts now included on [[wikitech:Deployments]]

2016-09-22 Thread Greg Grossmeier
What Alex said. This is for one-off type things.

--
Sent from my phone, please excuse brevity.

Re: [Wikitech-l] [Engineering] Long running tasks/scripts now included on [[wikitech:Deployments]]

2016-09-22 Thread Jaime Crespo
Let me clarify the reasoning for the idea:

We realized that some schema changes (which used to be scheduled like other
deployments) no longer take 1 hour: they can take 1 month, running
continuously, like https://phabricator.wikimedia.org/T139090 , because it
affects 3 of our largest tables. Also, they no longer require read-only
mode or affect code in any way (unless they are a prerequisite).

On the other side, a schema change combined with high read or write load
from long-running maintenance jobs, like those of the updateCollation
script or any other (those were just examples), could potentially make
lag a worse problem: a single transaction has to store pending changes
during its lifetime, and long-running reads can block and create pileups
due to metadata locking. We want to avoid those, as they have certainly
caused infrastructure issues in the past.
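The metadata-locking pileup described above can be sketched as a MySQL
session timeline (a sketch only; the table and column names here are
invented for illustration, not taken from an actual schema change):

```sql
-- Session 1: a long-running read inside a transaction holds a shared
-- metadata lock on the table for the transaction's whole lifetime.
START TRANSACTION;
SELECT COUNT(*) FROM revision;   -- still running, transaction still open

-- Session 2: the schema change now needs an exclusive metadata lock,
-- so it waits (SHOW PROCESSLIST: "Waiting for table metadata lock").
ALTER TABLE revision ADD COLUMN rev_example INT;

-- Session 3: every later query on the table, even a plain SELECT,
-- queues behind the waiting ALTER -- this is the pileup that
-- coordinating on the deployment calendar is meant to avoid.
SELECT rev_id FROM revision LIMIT 1;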

So, in summary: regular deployments are exclusive of each other, while
long-running maintenance jobs can affect each other. This is a way for me
(and others) to have visibility into those potential negative interactions
and make sure we can coordinate: "You are doing work on enwiki? No problem,
we will just run this task on commons." "You need to do an emergency data
recovery? I will hold off on this other task, which can wait." Even if only
DBAs use it, it is already useful for avoiding incompatible changes at the
same time. But it will be even more useful if everybody uses it!

On Thu, Sep 22, 2016 at 4:27 PM, Alex Monk  wrote:

> I had been assuming that puppetised crons were not really relevant...
>
> On 22 September 2016 at 15:19, Guillaume Lederrey  > wrote:
>
>> Hello!
>>
>> Increasing visibility sounds like a great idea! How far do we want to
>> go in that direction? In particular, I'm thinking of a few of the
>> crons we have for Cirrus. For example, we do have daily crons on
>> terbium that re-generate the suggester indices. Those can run for >
>> 1h.
>>
>> My understanding is that those kind of crons should not be considered
>> scripts, but standard working parts of the system. Adding them will
>> probably generate more noise than useful information. Is this a
>> reasonable understanding?
>>
>> Thanks!
>>
>>Guillaume
>>
>>
>>
>> On Wed, Sep 21, 2016 at 12:29 AM, Greg Grossmeier 
>> wrote:
>> > In an effort to reduce surprises and potential mishaps it is now
>> > required to include any long running tasks in the deployment
>> > calendar[0].
>> >
>> > "Long running tasks" include any script that is run on production 'work
>> > machines' such as terbium that last for longer than ~1 hour. Think:
>> > migration and maintenance scripts.
>> >
>> > This was discussed and proposed in T144661[1].
>> >
>> > Best,
>> >
>> > Greg
>> >
>> > [0] https://wikitech.wikimedia.org/wiki/Deployments
>> > Relevant diff:
>> > https://wikitech.wikimedia.org/w/index.php?diff=850923&oldid=850244
>> > [1] https://phabricator.wikimedia.org/T144661
>> >
>> > --
>> > | Greg GrossmeierGPG: B2FA 27B1 F7EB D327 6B8E |
>> > | Release Team ManagerA18D 1138 8E47 FAC8 1C7D |
>> >
>> > ___
>> > Engineering mailing list
>> > engineer...@lists.wikimedia.org
>> > https://lists.wikimedia.org/mailman/listinfo/engineering
>> >
>>
>>
>>
>> --
>> Guillaume Lederrey
>> Operations Engineer, Discovery
>> Wikimedia Foundation
>> UTC+2 / CEST
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>
>
>
> --
> Alex Monk
> VisualEditor/Editing team
> https://wikimediafoundation.org/wiki/User:Krenair_(WMF)
>
> ___
> Engineering mailing list
> engineer...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/engineering
>
>


-- 
Jaime Crespo


Re: [Wikitech-l] [Engineering] Long running tasks/scripts now included on [[wikitech:Deployments]]

2016-09-22 Thread Guillaume Lederrey
I assumed the same, but better to be explicit about those assumptions :)

On Thu, Sep 22, 2016 at 4:27 PM, Alex Monk  wrote:
> I had been assuming that puppetised crons were not really relevant...
>
> On 22 September 2016 at 15:19, Guillaume Lederrey 
> wrote:
>>
>> Hello!
>>
>> Increasing visibility sounds like a great idea! How far do we want to
>> go in that direction? In particular, I'm thinking of a few of the
>> crons we have for Cirrus. For example, we do have daily crons on
>> terbium that re-generate the suggester indices. Those can run for >
>> 1h.
>>
>> My understanding is that those kind of crons should not be considered
>> scripts, but standard working parts of the system. Adding them will
>> probably generate more noise than useful information. Is this a
>> reasonable understanding?
>>
>> Thanks!
>>
>>Guillaume
>>
>>
>>
>> On Wed, Sep 21, 2016 at 12:29 AM, Greg Grossmeier 
>> wrote:
>> > In an effort to reduce surprises and potential mishaps it is now
>> > required to include any long running tasks in the deployment
>> > calendar[0].
>> >
>> > "Long running tasks" include any script that is run on production 'work
>> > machines' such as terbium that last for longer than ~1 hour. Think:
>> > migration and maintenance scripts.
>> >
>> > This was discussed and proposed in T144661[1].
>> >
>> > Best,
>> >
>> > Greg
>> >
>> > [0] https://wikitech.wikimedia.org/wiki/Deployments
>> > Relevant diff:
>> > https://wikitech.wikimedia.org/w/index.php?diff=850923&oldid=850244
>> > [1] https://phabricator.wikimedia.org/T144661
>> >
>> > --
>> > | Greg GrossmeierGPG: B2FA 27B1 F7EB D327 6B8E |
>> > | Release Team ManagerA18D 1138 8E47 FAC8 1C7D |
>> >
>> > ___
>> > Engineering mailing list
>> > engineer...@lists.wikimedia.org
>> > https://lists.wikimedia.org/mailman/listinfo/engineering
>> >
>>
>>
>>
>> --
>> Guillaume Lederrey
>> Operations Engineer, Discovery
>> Wikimedia Foundation
>> UTC+2 / CEST
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
>
>
> --
> Alex Monk
> VisualEditor/Editing team
> https://wikimediafoundation.org/wiki/User:Krenair_(WMF)
>
> ___
> Engineering mailing list
> engineer...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/engineering
>



-- 
Guillaume Lederrey
Operations Engineer, Discovery
Wikimedia Foundation
UTC+2 / CEST


Re: [Wikitech-l] [Engineering] Long running tasks/scripts now included on [[wikitech:Deployments]]

2016-09-22 Thread Guillaume Lederrey
Hello!

Increasing visibility sounds like a great idea! How far do we want to
go in that direction? In particular, I'm thinking of a few of the
crons we have for Cirrus. For example, we do have daily crons on
terbium that re-generate the suggester indices. Those can run for >
1h.

My understanding is that those kinds of crons should not be considered
scripts, but standard working parts of the system. Adding them would
probably generate more noise than useful information. Is this a
reasonable understanding?

Thanks!

   Guillaume



On Wed, Sep 21, 2016 at 12:29 AM, Greg Grossmeier  wrote:
> In an effort to reduce surprises and potential mishaps it is now
> required to include any long running tasks in the deployment
> calendar[0].
>
> "Long running tasks" include any script that is run on production 'work
> machines' such as terbium that last for longer than ~1 hour. Think:
> migration and maintenance scripts.
>
> This was discussed and proposed in T144661[1].
>
> Best,
>
> Greg
>
> [0] https://wikitech.wikimedia.org/wiki/Deployments
> Relevant diff:
> https://wikitech.wikimedia.org/w/index.php?diff=850923&oldid=850244
> [1] https://phabricator.wikimedia.org/T144661
>
> --
> | Greg GrossmeierGPG: B2FA 27B1 F7EB D327 6B8E |
> | Release Team ManagerA18D 1138 8E47 FAC8 1C7D |
>
> ___
> Engineering mailing list
> engineer...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/engineering
>



-- 
Guillaume Lederrey
Operations Engineer, Discovery
Wikimedia Foundation
UTC+2 / CEST


[Wikitech-l] Recently proposed patchsets by new contributors awaiting code review

2016-09-22 Thread Andre Klapper
Your help is welcome to provide feedback and guidance:

== in "mediawiki/tools/mwdumper": ==

since 2016-09-08:
Major refactoring
https://gerrit.wikimedia.org/r/#/c/309314/

Thanks in advance for your reviews.


Of last week's 5 listed patches, 4 got merged & 1 got reviewed. Thanks
to Amire80, Daniel, FlorianSW, Jdlrobson, MatmaRex, Smalyshev, Tjones!

andre
-- 
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/


Re: [Wikitech-l] REMINDER: No deploys next week (week of Sept 26th)

2016-09-22 Thread Alex Brollo
Hurrah!
Alex

2016-09-20 19:09 GMT+02:00 Greg Grossmeier :

> Due to the Wikimedia Technical Operations Team having their team offsite
> that week and generally being less than normally available there will be
> no non-emergency deploys the week of September 26th (aka: next week).
>
> See also:
> https://wikitech.wikimedia.org/wiki/Deployments#Week_of_September_26th
>
> Normal schedule will resume the following week.
>
> Teaser: The week of October 17th will be a "no train deploys" week, as
> the Wikimedia Release Engineering Team will be at their offsite that
> week. I'll send a reminder about this as well closer to that date.
>
> Best,
>
> Greg
>
> --
> | Greg GrossmeierGPG: B2FA 27B1 F7EB D327 6B8E |
> | Release Team ManagerA18D 1138 8E47 FAC8 1C7D |
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] ORES article quality data as a database table

2016-09-22 Thread David Causse

Thanks Amir!

It's been a long time since I wanted to include wp10 in our search
indices to experiment with this data as a relevance signal.

This is now possible with your dataset, and I've built a test index [1]
which uses the following signals to rank results:

- incoming links
- weekly pageviews
- wp10

The weights for these signals have not been properly tuned yet, but they
can be adjusted at query time with URI query params:

- cirrusIncLinksW: weight for a value that ranges from 0 to 1
- cirrusPageViewsW: weight for a value that ranges from 0 to 1
- cirrusWP10W: weight for a value that ranges from 0 to 5

Examples:

- articles in category 'History_of_Essex' sorted by WP10, best first [2]
- articles in category 'History_of_Essex' sorted by WP10, worst first [3]

I'd love to make this data available in a more convenient way with query
keywords like wp10:0, and then allow playing with other signals like
pageviews.

Concerning internal search ranking, we will soon evaluate how wp10
compares with existing signals (inclinks/pageviews), and I'd like to use
it as a replacement for the naive scoring method we use for autocomplete
searches.

Well... everything is at an early stage, but I believe we can do very
interesting things with wp10 and search. I still don't know exactly
what, nor how :)

Thanks!

[1] http://en-wp-bm25-wp10-relforge.wmflabs.org/wiki/Special:Search
[2] http://en-wp-bm25-wp10-relforge.wmflabs.org/w/index.php?search=incategory%3A%22History_of_Essex%22=Special:Search=Go=10=0=0
[3] http://en-wp-bm25-wp10-relforge.wmflabs.org/w/index.php?search=incategory%3A%22History_of_Essex%22=Special:Search=Go=-10=0=0
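As an illustration of the linear combination these weights imply (a sketch
only: the actual Cirrus rescore function is more involved, and the
normalization of the first two signals is assumed here, not confirmed):

```python
# Sketch of combining per-signal weights into one rescore value. The
# parameter names mirror the query params above (cirrusIncLinksW,
# cirrusPageViewsW, cirrusWP10W); everything else is illustrative.

def rank_score(inc_links, page_views, wp10,
               w_links=1.0, w_views=1.0, w_wp10=1.0):
    """inc_links and page_views are normalized to [0, 1]; wp10 is 0..5."""
    return w_links * inc_links + w_views * page_views + w_wp10 * wp10

# Sorting best-first vs worst-first just flips the sign of the wp10
# weight, as in the two example URLs (cirrusWP10W 10 vs -10).
best_first = rank_score(0.2, 0.5, 4.5, w_wp10=10)
worst_first = rank_score(0.2, 0.5, 4.5, w_wp10=-10)
```

A high-quality article (wp10 near 5) dominates the score when the wp10
weight is large, which is what the best-first example exploits.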


On 21/09/2016 at 11:11, Amir Ladsgroup wrote:

One of ORES's [1] applications is determining article quality: for example,
what would be the best assessment of an article at a given revision?
Users in wikiprojects use ORES data to check whether articles need
re-assessment, e.g. if an article is at "Start" level but is now good
enough to be a "B" article.

As part of Q4 goals, we made a dataset of article quality scores of all
articles in English Wikipedia [2] (Here's the link to download the dataset
[3]) and we are publishing it in figshare as something you can cite [4]
also we are working on publishing monthly data for researchers to track
article quality data change over time. [5]

As a pet project of mine, I always wanted to put these data in a database.
So we can query the database and get much more useful data. For example
quality of articles in category 'History_of_Essex' [6] [7]. The weighted
sum is a measure of quality: a decimal number between 0 (definitely a stub)
and 5 (definitely a featured article). We also have a prediction column,
which is a number in this map [8]; for example, if the prediction is 5, it
means ORES thinks it should be a featured article.

I leave more use cases to your imagination :)

I'm looking for a more permanent place to put this data; please tell me if
it's useful for you.
[1] ORES is not an anti-vandalism tool; it's an infrastructure for using
AI in Wikipedia.
[2] https://phabricator.wikimedia.org/T135684
[3] (117 MBs)
https://datasets.wikimedia.org/public-datasets/enwiki/article_quality/wp10-scores-enwiki-20160820.tsv.bz2
[4] https://phabricator.wikimedia.org/T145332
[5] https://phabricator.wikimedia.org/T145655
[6] https://quarry.wmflabs.org/query/12647
[7] https://quarry.wmflabs.org/query/12662
[8]
https://github.com/wiki-ai/wikiclass/blob/3ff2f6c44c52905c7202515c5c8b525fb1ceb291/wikiclass/utilities/extract_scores.py#L37

Have fun!
Amir
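For illustration, the weighted sum described above can be sketched in
Python as an expectation over the six ordinal enwiki quality classes
(the probabilities below are invented for the example; real scores come
from the published dataset):

```python
# Sketch: treat the six assessment classes as ordinal values 0..5 and
# weight each by ORES's predicted probability for that class.

WP10_CLASSES = ["Stub", "Start", "C", "B", "GA", "FA"]

def weighted_sum(probabilities):
    """probabilities: dict mapping class name -> predicted probability."""
    return sum(i * probabilities[c] for i, c in enumerate(WP10_CLASSES))

def prediction(probabilities):
    """Most probable class, as its ordinal index (0 = Stub .. 5 = FA)."""
    return max(range(len(WP10_CLASSES)),
               key=lambda i: probabilities[WP10_CLASSES[i]])

# Invented example distribution for one article revision:
probs = {"Stub": 0.05, "Start": 0.10, "C": 0.15,
         "B": 0.30, "GA": 0.25, "FA": 0.15}
score = weighted_sum(probs)   # 3.05 on the 0 (stub) .. 5 (featured) scale
pred = prediction(probs)      # index 3, i.e. "B"
```

The weighted sum rewards probability mass on the higher classes even when
the single most likely class (the prediction column) is lower.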
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l




[Wikitech-l] Debian and Ubuntu packages for MediaWiki now available

2016-09-22 Thread Legoktm
Hi,

I'm very excited to announce that Debian and Ubuntu packages for
MediaWiki are now available. These packages will follow the MediaWiki
LTS schedule and currently contain 1.27.1. If you've always wanted to
"sudo apt install mediawiki", then this is for you :)

For Debian users, you can get the mediawiki package for Jessie from the
official Debian repositories using jessie-backports, and it will be
included in the upcoming Stretch release.

Ubuntu users will need to enable a PPA for Xenial or Trusty to set up
the package.
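As a rough sketch of the install steps (assuming the standard backports
workflow; the PPA name below is a placeholder, so use the one given on
the instructions page):

```shell
# Debian Jessie: enable jessie-backports, then install from it.
echo 'deb http://httpredir.debian.org/debian jessie-backports main' | \
    sudo tee /etc/apt/sources.list.d/backports.list
sudo apt-get update
sudo apt-get -t jessie-backports install mediawiki

# Ubuntu Xenial/Trusty: add the PPA first (placeholder name below),
# then install normally.
sudo add-apt-repository ppa:EXAMPLE/mediawiki
sudo apt-get update
sudo apt install mediawiki
```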

Instructions, links, and help can all be found at
. Please let me
know if you have any questions or feedback.

Finally, thanks to Luke Faraone, Faidon Liambotis, Moritz Mühlenhoff,
Max Semenik, Chad Horohoe, Antoine Musso, and all of the beta testers of
the package for making this a reality.

-- Legoktm


[Wikitech-l] MediaWiki-Codesniffer 0.8.0-alpha.1 released

2016-09-22 Thread Legoktm
Hi,

As the subject line hints, we're doing something a little different with
this release. There was a lot of work done on MW-CS over the summer by
our GSoC student, Lethexie.

There are a significant number of changes, especially to do with
documentation comments. So we are releasing this as alpha quality to ask
for more feedback on the changes to the ruleset. You can upgrade in the
same manner as normal - by updating the version number in the
composer.json file.
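Assuming the standard Composer setup for MediaWiki code-style checks, the
upgrade is a one-line version bump in the dev dependency (the version
string is taken from this announcement):

```json
{
    "require-dev": {
        "mediawiki/mediawiki-codesniffer": "0.8.0-alpha.1"
    }
}
```

followed by `composer update mediawiki/mediawiki-codesniffer`.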

Feel free to provide feedback by responding to this thread or by filing
a bug in the MediaWiki-Codesniffer Phabricator project. We would like to
release 0.8.0 by mid-October.

Thanks!
-- Legoktm
