Re: [Wikitech-l] MediaWiki API pageview issue

2020-02-24 Thread bawolff
On Tue, Feb 25, 2020 at 1:27 AM MusikAnimal  wrote:

> Unfortunately there's no proper log of redirect changes (I recently filed <
> https://phabricator.wikimedia.org/T240065> for this). There are change
> tags
> that identify redirect changes
> -- "mw-new-redirect" and "mw-changed-redirect-target", specifically -- but
> I am not sure if this is easily searchable via the action API. Someone on
> this list might know.
>

You can do
https://en.wikipedia.org/w/api.php?action=query&titles=2019%E2%80%9320%20Wuhan%20coronavirus%20outbreak&prop=revisions&rvprop=timestamp|tags|ids|content&rvlimit=max&rvtag=mw-new-redirect&formatversion=2&rvslots=main
or
https://en.wikipedia.org/w/api.php?action=query&titles=2019%E2%80%9320%20Wuhan%20coronavirus%20outbreak&prop=revisions&rvprop=timestamp|tags|ids|content&rvlimit=max&rvtag=mw-changed-redirect-target&formatversion=2&rvslots=main
(You cannot do both in one query; you can only specify one tag at a time.)
Furthermore, it looks like, given a revision id, you would have to determine
where it redirects yourself, which is unfortunate. I suppose you could look
at
https://en.wikipedia.org/w/api.php?action=parse&oldid=941491141&formatversion=2&prop=text|links
(taking the oldid as the revid from the other query) and either try to
parse the HTML, or just assume that if there is only one main-namespace
link, it is the right one.

Also keep in mind, really old revisions won't have those tags.
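For illustration, here is a rough Python sketch of the two steps above: build the tag-filtered revisions query, then pull revision ids and timestamps out of the response. The response literal is a hand-made sample in formatversion=2 shape, not real data, and parameter names should be double-checked against the live action API:

```python
def tagged_revision_params(title, tag):
    """Build action API parameters for prop=revisions filtered by change tag."""
    return {
        "action": "query",
        "titles": title,
        "prop": "revisions",
        "rvprop": "timestamp|tags|ids|content",
        "rvlimit": "max",
        "rvtag": tag,  # only one tag per query
        "rvslots": "main",
        "formatversion": "2",
        "format": "json",
    }

def extract_revisions(response):
    """Return (revid, timestamp) pairs from a formatversion=2 query response."""
    out = []
    for page in response.get("query", {}).get("pages", []):
        for rev in page.get("revisions", []):
            out.append((rev["revid"], rev["timestamp"]))
    return out

# Hand-made sample response, shaped like a formatversion=2 reply.
sample = {
    "query": {
        "pages": [
            {
                "title": "2019-20 Wuhan coronavirus outbreak",
                "revisions": [
                    {"revid": 941491141,
                     "timestamp": "2020-02-18T20:16:12Z",
                     "tags": ["mw-new-redirect"]},
                ],
            }
        ]
    }
}

params = tagged_revision_params("2019-20 Wuhan coronavirus outbreak",
                                "mw-new-redirect")
print(extract_revisions(sample))  # [(941491141, '2020-02-18T20:16:12Z')]
```

You would then feed each revid into the action=parse query above to work out where that revision redirects.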

--
Brian
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki API pageview issue

2020-02-24 Thread MusikAnimal
Unfortunately there's no proper log of redirect changes (I recently filed <
https://phabricator.wikimedia.org/T240065> for this). There are change tags
that identify redirect changes
-- "mw-new-redirect" and "mw-changed-redirect-target", specifically -- but
I am not sure if this is easily searchable via the action API. Someone on
this list might know.

Redirects can be created directly, say as an alternate name or misspelling
of an article (e.g. "Barak Obama" redirects to "Barack Obama"; it was never
an article on its own). Usually when a page is moved a redirect is left
behind at the old location ("20th Century Fox" was recently renamed to
"20th Century Studios"), but sometimes the redirects are suppressed. So you
could focus only on page moves, in which case you could query the page move
log using the logevents API,
specifically with letype=move. From that you could piece together the
pageviews. You wouldn't be including traffic originating from redirects
that were never the target article, but from my experience this is the
minority, since Google and the like usually link to the target and not
redirects.
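The move-log approach could be sketched roughly like this in Python: query list=logevents with letype=move, then read each entry's destination title. The sample response is hand-made; field names such as params.target_title follow the documented log format but should be verified against a live response:

```python
def move_log_params(title):
    """Action API parameters for the page move log of a given title."""
    return {
        "action": "query",
        "list": "logevents",
        "letype": "move",
        "letitle": title,
        "lelimit": "max",
        "formatversion": "2",
        "format": "json",
    }

def extract_moves(response):
    """Return (timestamp, old_title, new_title) tuples from a logevents response."""
    moves = []
    for ev in response.get("query", {}).get("logevents", []):
        moves.append((ev["timestamp"], ev["title"],
                      ev.get("params", {}).get("target_title")))
    return moves

# Hand-made sample, shaped like a formatversion=2 logevents reply.
sample = {
    "query": {
        "logevents": [
            {"type": "move",
             "title": "20th Century Fox",
             "timestamp": "2020-01-17T21:00:00Z",
             "params": {"target_ns": 0, "target_title": "20th Century Studios"}},
        ]
    }
}

print(extract_moves(sample))
# [('2020-01-17T21:00:00Z', '20th Century Fox', '20th Century Studios')]
```

From those (timestamp, old, new) tuples you could then split the pageview date ranges per title.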

There's also a task to make the Toolforge tool go by the page move log
automatically, since it is such
a common need. The same caveats apply, though: if articles Foo and Bar have
been moved back and forth a few times, you might need to check the move
logs of both, not just Foo. It can be quite tricky!

Overall I would say including all redirects is probably your best bet.
Allow me to clarify: the Redirect Views tool does offer date filtering
<https://tools.wmflabs.org/redirectviews/>, just as the main Pageviews tool
does. If you do need automation, you could write a script to query the
redirects API and then the REST API, which is all that tool does.
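A rough sketch of that two-step script, in Python: list a page's redirects via prop=redirects, then build one REST pageviews URL per title. The per-article endpoint shape follows the documented Wikimedia REST API (dates are YYYYMMDD); treat this as an assumption-laden outline, not the tool's actual code:

```python
from urllib.parse import quote

REST = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def redirect_list_params(title):
    """Action API parameters to list redirects pointing at a page."""
    return {
        "action": "query",
        "titles": title,
        "prop": "redirects",
        "rdlimit": "max",
        "formatversion": "2",
        "format": "json",
    }

def pageview_urls(titles, start, end, project="en.wikipedia"):
    """One per-article pageviews URL per title (target page plus its redirects)."""
    return [
        f"{REST}/{project}/all-access/user/{quote(t.replace(' ', '_'), safe='')}"
        f"/daily/{start}/{end}"
        for t in titles
    ]

# "Barak Obama" stands in for a redirect returned by the first query.
urls = pageview_urls(["Barack Obama", "Barak Obama"], "20190701", "20200125")
for u in urls:
    print(u)
```

Summing the daily counts across all of the returned URLs would approximate what Redirect Views displays.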

Hope this helps!

~ MA

On Mon, Feb 24, 2020 at 6:40 PM James Gardner 
wrote:

> Thanks for the clarification of how redirects work, and what we should
> keep in mind when trying to count pageviews. Do you know if there's a way
> to find the date(s) when a page is redirected using the API? We know we can
> get the 'old' page ids of redirected pages using the API, but we're not
> sure if using the creation date of these page ids would be accurate. Also,
> what's the difference between redirects and page moves if there is one?
>
> We may stick to including redirects without trying to avoid overcounting
> as this appears to be a more complicated issue than we thought. We are
> working to collect pageviews within a specific time frame, so relative
> dates aren't quite what we're looking for.
>
> Thanks again!
>
> Jackie, James, Junyi, Kirby
>
> On Mon, Feb 24, 2020 at 10:52 AM MusikAnimal 
> wrote:
>
>> > We attempted to use the wmflabs.org tool, but it only shows data from
>> > a certain date
>>
>> I'm assuming you want relative dates, not exact dates? You can do this by
>> using the range=latest-N URL parameter (where N is the number of days). See
>>  and <
>> https://tools.wmflabs.org/redirectviews/url_structure/> for Redirect
>> Views. This mirrors the pvipdays parameter of the action API.
>>
>> I'm sorry there is no backend for these tools, so if you need automation
>> you'll have to scrape it or re-implement its logic yourself.
>>
>> > In the end, we are trying to get an accurate count of views for a
>> certain page no matter the source.
>>
>> Keep in mind that redirects can change, and historically may not have
>> been the "same" page. For instance, if I create the article Foo, and
>> someone else creates Bar, and some months later Foo is redirected to Bar.
>> To accurately get the views of just Bar, you'll need to somehow exclude the
>> time when Foo was a different article. Page moves can also cause unexpected
>> results (Foo is moved to Baz, Bar is moved to Foo, etc.). Finally, page IDs
>> can change too, say if I delete Foo, then move Bar to Foo. There isn't a
>> foolproof solution, it seems, but simply including redirects is usually
>> enough to give you what you want.
>>
>> ~ MA
>>
>> On Mon, Feb 24, 2020 at 9:18 AM James Gardner via Wikitech-l <
>> wikitech-l@lists.wikimedia.org> wrote:
>>
>>> Hi all,
>>>
>>> Thanks for all the help and advice with this issue, especially with the
>>> wmflabs tool with the redirect view tool. We'll try using that tool to
>>> download the pageview data we need and manually filter by dates to map
>>> redirects to the page. We'll also look into the REST API that Wiki has to
>>> see if it can help us as well.
>>>
>>> Thanks again,
>>>
>>> Jackie, James, Junyi, Kirby
>>>
>>>
>>> On Sun, Feb 23, 2020 at 10:58 PM Gergo Tisza 
>>> wrote:
>>>
>>> > On Sun, Feb 23, 2020 at 4:17 PM James Gardner via Wikitech-l <
>>> > wikitech-l@lists.wikimedia.org> wrote:
>>> >
>>> 

Re: [Wikitech-l] Mediawiki codesniffer usage

2020-02-24 Thread Zoran Dori
Hello,
thank you so much for help! This looks awesome.

Best regards,

Zoran Dori
volunteer, Wikimedia Serbia
s: zoranzoki21.github.io e: zorandori4...@gmail.com

Re: [Wikitech-l] MediaWiki API pageview issue

2020-02-24 Thread James Gardner via Wikitech-l
Thanks for the clarification of how redirects work, and what we should keep
in mind when trying to count pageviews. Do you know if there's a way to
find the date(s) when a page is redirected using the API? We know we can
get the 'old' page ids of redirected pages using the API, but we're not
sure if using the creation date of these page ids would be accurate. Also,
what's the difference between redirects and page moves if there is one?

We may stick to including redirects without trying to avoid overcounting, as
this appears to be a more complicated issue than we thought. We are working
to collect pageviews within a specific time frame, so relative dates aren't
quite what we're looking for.

Thanks again!

Jackie, James, Junyi, Kirby

On Mon, Feb 24, 2020 at 10:52 AM MusikAnimal  wrote:

> > We attempted to use the wmflabs.org tool, but it only shows data from a
> > certain date
>
> I'm assuming you want relative dates, not exact dates? You can do this by
> using the range=latest-N URL parameter (where N is the number of days). See
>  and <
> https://tools.wmflabs.org/redirectviews/url_structure/> for Redirect
> Views. This mirrors the pvipdays parameter of the action API.
>
> I'm sorry there is no backend for these tools, so if you need automation
> you'll have to scrape it or re-implement its logic yourself.
>
> > In the end, we are trying to get an accurate count of views for a certain
> page no matter the source.
>
> Keep in mind that redirects can change, and historically may not have been
> the "same" page. For instance, if I create the article Foo, and someone
> else creates Bar, and some months later Foo is redirected to Bar. To
> accurately get the views of just Bar, you'll need to somehow exclude the
> time when Foo was a different article. Page moves can also cause unexpected
> results (Foo is moved to Baz, Bar is moved to Foo, etc.). Finally, page IDs
> can change too, say if I delete Foo, then move Bar to Foo. There isn't a
> foolproof solution, it seems, but simply including redirects is usually
> enough to give you what you want.
>
> ~ MA
>
> On Mon, Feb 24, 2020 at 9:18 AM James Gardner via Wikitech-l <
> wikitech-l@lists.wikimedia.org> wrote:
>
>> Hi all,
>>
>> Thanks for all the help and advice with this issue, especially with the
>> wmflabs tool with the redirect view tool. We'll try using that tool to
>> download the pageview data we need and manually filter by dates to map
>> redirects to the page. We'll also look into the REST API that Wiki has to
>> see if it can help us as well.
>>
>> Thanks again,
>>
>> Jackie, James, Junyi, Kirby
>>
>>
>> On Sun, Feb 23, 2020 at 10:58 PM Gergo Tisza 
>> wrote:
>>
>> > On Sun, Feb 23, 2020 at 4:17 PM James Gardner via Wikitech-l <
>> > wikitech-l@lists.wikimedia.org> wrote:
>> >
>> >> We attempted to use the wmflabs.org tool, but it only shows data from a
>> >> certain date. (Example link:
>> >>
>> >>
>> >> https://tools.wmflabs.org/pageviews/?project=en.wikipedia.org&platform=all-access&agent=user&start=2019-07-01&end=2020-01-25&pages=2019%E2%80%9320_Hong_Kong_protests%7CChina
>> >> )
>> >>
>> >
>> > There's a redirectview tool (see the "redirects" links at the bottom of
>> > the page you linked) but it can't be filtered by date so it probably
>> can't
>> > help you.
>> >
>> >
>> >> Then we attempted to use the redirects of a page and using the old page
>> >> ids
>> >> to grab the pageview data, but there was no data returned. When we
>> >> attempted to grab data for a page that we knew would have a long past,
>> >> but the parameter of "pvipcontinue" did not appear (
>> >>
>> https://www.mediawiki.org/w/api.php?action=help&modules=query%2Bpageviews
>> >> ).
>> >> (Example:
>> >>
>> >>
>> https://www.mediawiki.org/wiki/Special:ApiSandbox#action=query&format=json&prop=pageviews&titles=MediaWiki&pvipmetric=pageviews&pvipdays=60&pvipcontinue=
>> >> )
>> >>
>> >
>> > That API displays a limited set of metrics and is focused on caching and
>> > being backend-agnostic. There is no way to get old data, pvicontinue is
>> for
>> > fetching data about more pages. If you need something more specific, you
>> > should use the Analytics Query Service (which the other APIs rely on)
>> > directly: https://wikitech.wikimedia.org/wiki/Analytics/AQS/Pageviews
>> >
>> > I think you'll have to piece the data together using the MediaWiki
>> > redirects API and AQS.
>> >
>
>

Re: [Wikitech-l] [Wikimedia Technical Talks] Data and Decision Science at Wikimedia with Kate Zimmerman, 26 February 2020 @ 6PM UTC

2020-02-24 Thread Srishti Sethi
Hello folks,

Just a reminder that this talk will take place Wednesday 26 February 2020
at 6 PM UTC.

Hope to see you there!

Cheers,
Srishti
*Srishti Sethi*
Developer Advocate
Wikimedia Foundation 



On Tue, Feb 18, 2020 at 2:57 PM Sarah R  wrote:

> Hello Everyone,
>
> It's time for Wikimedia Tech Talks 2020 Episode 1! This talk will take
> place on *26 February 2020 at 6 PM UTC*.
>
> This month's talk will be in an interview format. You are invited to send
> questions ahead of time by replying to this email, or you can ask during Q
> & A section of the live talk by asking through IRC or the Youtube
> Livestream.
>
> Title: Data and Decision Science at Wikimedia
>
> Speaker:  Kate Zimmerman,  Head of Product Analytics at Wikimedia
>
> Summary:
>
> How do teams at the Foundation use data to inform decisions?
>
> Sarah R. Rodlund talks with Kate Zimmerman, Head of Product Analytics at
> Wikimedia, about what sorts of data her team uses and how insights from
> their analysis have shaped product decisions.
>
> Kate Zimmerman holds an MS in Psychology & Behavioral Decision Research
> from Carnegie Mellon University and has over 15 years of experience in
> quantitative and experimental methods. Before joining Wikimedia, she built
> data teams from scratch at ModCloth and SmugMug, evolving their data
> capabilities from basic reports to strategic analysis, automated
> dashboards, and advanced modeling.
>
> The link to the Youtube Livestream can be found here:
> https://www.youtube.com/watch?v=J-CRsiwYM9w
>
> During the live talk, you are invited to join the discussion on IRC at
> #wikimedia-office
>
> You can watch past Tech Talks here:
> https://www.mediawiki.org/wiki/Tech_talks
>
> If you are interested in giving your own tech talk, you can learn more
> here:
>
> https://www.mediawiki.org/wiki/Project:Calendar/How_to_schedule_an_event#Tech_talks
>
> Note: This is a public talk. Feel free to distribute through appropriate
> email and social channels!
>
> Many kindnesses,
>
> Sarah R. Rodlund
> Technical Writer, Developer Advocacy
> srodl...@wikimedia.org

Re: [Wikitech-l] New, simple Docker development environment for MediaWiki core

2020-02-24 Thread Physikerwelt
This is great news. I was hoping for something like that for years.
I will see if I can align my provisional MediaWiki docker environment [1]
with your work.
Thanks a lot and keep on going with this great work.

Physikerwelt
[1] https://github.com/physikerwelt/mediawiki-docker


On Mon, Feb 24, 2020 at 5:44 PM Brennen Bearnes 
wrote:

> Hey all,
>
> TL;DR: `docker-compose up` gets you a Docker environment with which
> to develop.
>
> The Engineering Productivity group is happy to announce the
> availability of a new, official Docker environment for MediaWiki
> core. [0] This is a component of our work on improving developer
> productivity, as part of the Wikimedia Foundation's "Platform
> Evolution" [1] multi-year priority, looking to support faster, more
> reliable technical change for our communities. We've been exploring
> options for a year now, and we had a great deal of input,
> particularly at the TechConf 2019, where Kosta Harlan worked
> closely with us to move this forward. [2]
>
> This new environment has been built for simple experimentation,
> development, and testing of proposed changes to MediaWiki core. It
> is designed to be particularly simple and easy to use, and intended
> particularly to be a good option for newbies, be they testers,
> designers, developers, or others who have not yet invested a great
> deal of their time in setting up local environments.
>
> We intend for this environment to become the official, supported,
> and advertised entry point for small-scale development. If you find
> issues, or have suggestions for improvements, we'd love to hear
> from you. [3] We will be adjusting various bits of documentation
> over time to encourage further use, but if you find some out-of-date
> instructions, please do fix them, or flag them for us to fix.
>
> We know that there are a number of people who need a more complex,
> configurable, and powerful development and testing environment,
> even up to being a "Wikimedia production-like" state. This is not
> that environment; we plan to provide a more configurable and thus
> more complex, "heavy-weight" alternative for that use case in the
> future. You may wish to follow our work in Phabricator. [4]
>
> Our huge thanks to pioneering volunteer and staff colleagues who
> have provided support, testing, and advice, and who have explored
> different uses of Vagrant, Docker, and other techniques for
> providing better forms of MediaWiki testing, development, and
> hosting, which have inspired us on how to best provide this.  We
> all owe you a great debt. Thank you.
>
> Again, if you have questions, comments, or concerns, please do file
> a task so that we can help you! [3]
>
> [0] - https://www.mediawiki.org/wiki/Docker
> [1] -
>
> https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Medium-term_plan_2019/Platform_evolution
> [2] - https://phabricator.wikimedia.org/T238224
> [3] - https://phabricator.wikimedia.org/tag/mediawiki-docker/
> [4] - https://phabricator.wikimedia.org/tag/local-charts/
>
> Yours,
>
> --
> Brennen Bearnes (he/him)
> WMF Release Engineering
>

Re: [Wikitech-l] New, simple Docker development environment for MediaWiki core

2020-02-24 Thread Tyler Cipriani
On Mon, Feb 24, 2020 at 9:44 AM Brennen Bearnes  wrote:
> TL;DR: `docker-compose up` gets you a Docker environment with which
> to develop.
>
> The Engineering Productivity group is happy to announce the
> availability of a new, official Docker environment for MediaWiki
> core. [0] This is a component of our work on improving developer
> productivity, as part of the Wikimedia Foundation's "Platform
> Evolution" [1] multi-year priority, looking to support faster, more
> reliable technical change for our communities. We've been exploring
> options for a year now, and we had a great deal of input,
> particularly at the TechConf 2019, where Kosta Harlan worked
> closely with us to move this forward. [2]

Very cool! I'm really happy to see projects that grew out of TechConf
coming to fruition!

Kudos and thanks to everyone involved in this work!

-- Tyler


Re: [Wikitech-l] Mediawiki codesniffer usage

2020-02-24 Thread Mukunda Modell
A pinch of javascript and a dash of comm[1] produced this list of
extensions that are included in the first list but not in the second:

https://phabricator.wikimedia.org/P10500

[1] https://en.wikipedia.org/wiki/Comm


On Thu, Feb 20, 2020 at 6:50 PM James Forrester 
wrote:

> On Thu, 20 Feb 2020 at 16:19, Zoran Dori  wrote:
>
> > Is there a list available anywhere of extensions which don't have
> > mediawiki/mediawiki-codesniffer?
> >
>
> Not easily, but this is a list of all extensions known to gerrit
> <
> https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/+/master
> >
> (860 of them), and this is LibraryUpgrader's list of repos it knows about
> that have CodeSniffer installed
> <
> http://libraryupgrader2.wmflabs.org/library/composer/mediawiki/mediawiki-codesniffer
> >
> (543 extension repos), so anything listed in the first but not the second
> would be a place to start.
>
> Note that a handful of extensions use WMDE's fork of CodeSniffer
> <
> http://libraryupgrader2.wmflabs.org/library/composer/wikibase/wikibase-codesniffer
> >,
> and some repo owners might intentionally not want to align to Wikimedia
> coding standards, for whatever reason.
>
> J.
> --
> *James D. Forrester* (he/him or they/themself)
> Wikimedia Foundation 

Re: [Wikitech-l] MediaWiki API pageview issue

2020-02-24 Thread MusikAnimal
> We attempted to use the wmflabs.org tool, but it only shows data from a
> certain date

I'm assuming you want relative dates, not exact dates? You can do this by
using the range=latest-N URL parameter (where N is the number of days). See
 and <
https://tools.wmflabs.org/redirectviews/url_structure/> for Redirect Views.
This mirrors the pvipdays parameter of the action API.
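A rough illustration of the pvipdays parameter mentioned above, as a Python sketch: the action API's prop=pageviews returns a per-day map covering roughly the last N days. The response literal is hand-made sample data (days with no data come back as null), so verify the exact shape against a live query:

```python
def pageview_params(title, days=20):
    """Action API parameters for prop=pageviews over the last `days` days."""
    return {
        "action": "query",
        "titles": title,
        "prop": "pageviews",
        "pvipdays": days,  # counterpart of range=latest-N in the tools
        "formatversion": "2",
        "format": "json",
    }

def total_views(response):
    """Sum non-null daily counts for every page in the response."""
    total = 0
    for page in response.get("query", {}).get("pages", []):
        for views in (page.get("pageviews") or {}).values():
            total += views or 0
    return total

# Hand-made sample, shaped like a formatversion=2 reply.
sample = {
    "query": {
        "pages": [
            {"title": "MediaWiki",
             "pageviews": {"2020-02-22": 1400, "2020-02-23": None,
                           "2020-02-24": 1600}}
        ]
    }
}

print(total_views(sample))  # 3000
```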

I'm sorry there is no backend for these tools, so if you need automation
you'll have to scrape it or re-implement its logic yourself.

> In the end, we are trying to get an accurate count of views for a certain
page no matter the source.

Keep in mind that redirects can change, and historically may not have been
the "same" page. For instance, if I create the article Foo, and someone
else creates Bar, and some months later Foo is redirected to Bar. To
accurately get the views of just Bar, you'll need to somehow exclude the
time when Foo was a different article. Page moves can also cause unexpected
results (Foo is moved to Baz, Bar is moved to Foo, etc.). Finally, page IDs
can change too, say if I delete Foo, then move Bar to Foo. There isn't a
foolproof solution, it seems, but simply including redirects is usually
enough to give you what you want.

~ MA

On Mon, Feb 24, 2020 at 9:18 AM James Gardner via Wikitech-l <
wikitech-l@lists.wikimedia.org> wrote:

> Hi all,
>
> Thanks for all the help and advice with this issue, especially with the
> wmflabs tool with the redirect view tool. We'll try using that tool to
> download the pageview data we need and manually filter by dates to map
> redirects to the page. We'll also look into the REST API that Wiki has to
> see if it can help us as well.
>
> Thanks again,
>
> Jackie, James, Junyi, Kirby
>
>
> On Sun, Feb 23, 2020 at 10:58 PM Gergo Tisza  wrote:
>
> > On Sun, Feb 23, 2020 at 4:17 PM James Gardner via Wikitech-l <
> > wikitech-l@lists.wikimedia.org> wrote:
> >
> >> We attempted to use the wmflabs.org tool, but it only shows data from a
> >> certain date. (Example link:
> >>
> >>
> >> https://tools.wmflabs.org/pageviews/?project=en.wikipedia.org&platform=all-access&agent=user&start=2019-07-01&end=2020-01-25&pages=2019%E2%80%9320_Hong_Kong_protests%7CChina
> >> )
> >>
> >
> > There's a redirectview tool (see the "redirects" links at the bottom of
> > the page you linked) but it can't be filtered by date so it probably
> can't
> > help you.
> >
> >
> >> Then we attempted to use the redirects of a page and using the old page
> >> ids
> >> to grab the pageview data, but there was no data returned. When we
> >> attempted to grab data for a page that we knew would have a long past,
> >> but the parameter of "pvipcontinue" did not appear (
> >>
> https://www.mediawiki.org/w/api.php?action=help&modules=query%2Bpageviews
> >> ).
> >> (Example:
> >>
> >>
> https://www.mediawiki.org/wiki/Special:ApiSandbox#action=query&format=json&prop=pageviews&titles=MediaWiki&pvipmetric=pageviews&pvipdays=60&pvipcontinue=
> >> )
> >>
> >
> > That API displays a limited set of metrics and is focused on caching and
> > being backend-agnostic. There is no way to get old data, pvicontinue is
> for
> > fetching data about more pages. If you need something more specific, you
> > should use the Analytics Query Service (which the other APIs rely on)
> > directly: https://wikitech.wikimedia.org/wiki/Analytics/AQS/Pageviews
> >
> > I think you'll have to piece the data together using the MediaWiki
> > redirects API and AQS.
> >

[Wikitech-l] New, simple Docker development environment for MediaWiki core

2020-02-24 Thread Brennen Bearnes

Hey all,

TL;DR: `docker-compose up` gets you a Docker environment with which
to develop.

The Engineering Productivity group is happy to announce the
availability of a new, official Docker environment for MediaWiki
core. [0] This is a component of our work on improving developer
productivity, as part of the Wikimedia Foundation's "Platform
Evolution" [1] multi-year priority, looking to support faster, more
reliable technical change for our communities. We've been exploring
options for a year now, and we had a great deal of input,
particularly at the TechConf 2019, where Kosta Harlan worked
closely with us to move this forward. [2]

This new environment has been built for simple experimentation,
development, and testing of proposed changes to MediaWiki core. It
is designed to be particularly simple and easy to use, and intended
particularly to be a good option for newbies, be they testers,
designers, developers, or others who have not yet invested a great
deal of their time in setting up local environments.

We intend for this environment to become the official, supported,
and advertised entry point for small-scale development. If you find
issues, or have suggestions for improvements, we'd love to hear
from you. [3] We will be adjusting various bits of documentation
over time to encourage further use, but if you find some out-of-date
instructions, please do fix them, or flag them for us to fix.

We know that there are a number of people who need a more complex,
configurable, and powerful development and testing environment,
even up to being a "Wikimedia production-like" state. This is not
that environment; we plan to provide a more configurable and thus
more complex, "heavy-weight" alternative for that use case in the
future. You may wish to follow our work in Phabricator. [4]

Our huge thanks to pioneering volunteer and staff colleagues who
have provided support, testing, and advice, and who have explored
different uses of Vagrant, Docker, and other techniques for
providing better forms of MediaWiki testing, development, and
hosting, which have inspired us on how to best provide this.  We
all owe you a great debt. Thank you.

Again, if you have questions, comments, or concerns, please do file
a task so that we can help you! [3]

[0] - https://www.mediawiki.org/wiki/Docker
[1] - 
https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Medium-term_plan_2019/Platform_evolution

[2] - https://phabricator.wikimedia.org/T238224
[3] - https://phabricator.wikimedia.org/tag/mediawiki-docker/
[4] - https://phabricator.wikimedia.org/tag/local-charts/

Yours,

--
Brennen Bearnes (he/him)
WMF Release Engineering


Re: [Wikitech-l] MediaWiki API pageview issue

2020-02-24 Thread James Gardner via Wikitech-l
Hi all,

Thanks for all the help and advice with this issue, especially with the
wmflabs tool with the redirect view tool. We'll try using that tool to
download the pageview data we need and manually filter by dates to map
redirects to the page. We'll also look into the REST API that Wiki has to
see if it can help us as well.

Thanks again,

Jackie, James, Junyi, Kirby


On Sun, Feb 23, 2020 at 10:58 PM Gergo Tisza  wrote:

> On Sun, Feb 23, 2020 at 4:17 PM James Gardner via Wikitech-l <
> wikitech-l@lists.wikimedia.org> wrote:
>
>> We attempted to use the wmflabs.org tool, but it only shows data from a
>> certain date. (Example link:
>>
>> https://tools.wmflabs.org/pageviews/?project=en.wikipedia.org&platform=all-access&agent=user&start=2019-07-01&end=2020-01-25&pages=2019%E2%80%9320_Hong_Kong_protests%7CChina
>> )
>>
>
> There's a redirectview tool (see the "redirects" links at the bottom of
> the page you linked) but it can't be filtered by date so it probably can't
> help you.
>
>
>> Then we attempted to use the redirects of a page, using the old page ids
>> to grab the pageview data, but no data was returned. We also attempted to
>> grab data for a page that we knew would have a long history, but the
>> parameter of "pvipcontinue" did not appear (
>> https://www.mediawiki.org/w/api.php?action=help&modules=query%2Bpageviews
>> ).
>> (Example:
>>
>> https://www.mediawiki.org/wiki/Special:ApiSandbox#action=query&format=json&prop=pageviews&titles=MediaWiki&pvipmetric=pageviews&pvipdays=60&pvipcontinue=
>> )
>>
>
> That API displays a limited set of metrics and is focused on caching and
> being backend-agnostic. There is no way to get old data, pvicontinue is for
> fetching data about more pages. If you need something more specific, you
> should use the Analytics Query Service (which the other APIs rely on)
> directly: https://wikitech.wikimedia.org/wiki/Analytics/AQS/Pageviews
>
> I think you'll have to piece the data together using the MediaWiki
> redirects API and AQS.
>

Re: [Wikitech-l] Commons app Android... and iOS!

2020-02-24 Thread Josephine Lim
Thank you, Sam and Pine! :)

P.S. Pine, I can't read Jawi, but "terima kasih" is correct. ;)

Best regards,
Josephine