Re: [Wikitech-l] Git for idiots

2015-11-11 Thread S Page
On Wed, Nov 11, 2015 at 12:50 AM, Petr Bena <benap...@gmail.com> wrote:

> Ok, I will try to merge all useful stuff in here:
> https://www.mediawiki.org/wiki/Git_for_dummies
>

The problem is these are matters of widely-varying taste and background.
When I tried to clean up in early 2013, git experts didn't even agree on
whether the gerrit remote should be origin, or whether people should use
`git review` at all. (Aside: right now [[Git]] has the awful
community-splitting comment
"To simply *browse & fork our code* you can use the GitHub mirror
<https://github.com/wikimedia>."
followed by the precisely wrong git clone command for someone trying to
make a patch:
   git clone https://...
Sigh.)

I would urge people to judiciously update the pages we have, and only
create very targeted new pages rather than yet another starting point.
https://www.mediawiki.org/wiki/Gerrit/Getting_started seems OK, it's
focused on a new gerrit contributor. Gerrit/Tutorial is huge because this
isn't simple.

So here's more taste and matters of opinion.

* "the minimal knowledge needed in order to push to all major repositories
we use,"  Surely Dummies only use gerrit.  Why explain anything else?
* "VCS" term is not necessary, nobody uses it.
* "master" explanation brings in irrelevant detail. It's "The thing you get
from a remote repo and should never change"
* The `svn commands` are irrelevant. Dummies don't know SVN.
* Try not to use angle brackets, like `git clone <repository>`:
dummies will type them. (I realize the <pre> tag won't let you italicize
:-( )
* git pull --rebase is a bad idea, it leads to surprises.
* The way to avoid merge conflicts is NEVER EVER make an edit without first
creating a branch.
* Then, the guaranteed reliable way to update is `git checkout master; git
pull --ff-only`. That will stop you from making a mistake, and it will stop
you from having your own local master with local changes buried down the
commit log. (See the sketch after this list.)
* The fact that git mergetool shows up so soon indicates this is no longer
a guide for dummies, it's now a guide for troubleshooting. A guide for
dummies should give commands and advice to avoid trouble.
* git diff shows up too soon. Instead, dummies should be running `git
status` all the time. The way pros do this is to change their shell prompt
to show the repo status and branch. Setting this up will *change your life*
[1]. But it's too advanced for a Dummies guide. (I think MW-Vagrant's shell
should do it for you.)

* "If you want to push your branch to origin  git push origin
". Huh? I don't see how someone working on WMF projects can
ever do that.  The whole point of gerrit is you don't own master and should
never ever edit it because that way lies misery.
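To spell out the safe workflow I mean, assuming gerrit is your origin
remote and git-review is installed (a sketch, adjust to taste):

    git checkout master
    git pull --ff-only       # refuses to update rather than surprise-merge
    git checkout -b my-fix   # never edit without a branch
    # ...edit files...
    git add -A
    git commit
    git review               # send the branch to gerrit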

I guess this is actually a guide for tool labs where people control their
own remote repo and often do keep it on github. Fine, so say so: please
please rename it Git/Guide_for_tool_labs_repo_users or whatever to avoid
confusion.

Again, I acknowledge how hard this is.

[1] I used http://briancarper.net/blog/570/git-info-in-your-zsh-prompt ,
but I use the zsh shell so that's no help for 95% of you.
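If you are on zsh, a minimal sketch using the built-in vcs_info (much
simpler than the full setup in [1]) is:

    autoload -Uz vcs_info
    precmd() { vcs_info }
    zstyle ':vcs_info:git:*' formats ' (%b)'
    setopt PROMPT_SUBST
    PROMPT='%~${vcs_info_msg_0_} %# '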


On Tue, Nov 10, 2015 at 8:26 PM, Nick Wilson (Quiddity)
> <nwil...@wikimedia.org> wrote:
> > On Tue, Nov 10, 2015 at 10:54 AM, Legoktm <legoktm.wikipe...@gmail.com>
> wrote:
> >>
> >> Hi,
> >>
> >> On 11/10/2015 09:06 AM, Petr Bena wrote:
> >> > Perhaps it would worth merging and putting to some central location?
> >>
> >> Yes, that sounds like a good idea. I typically recommend
> >> <https://www.mediawiki.org/wiki/User:Wctaiwan/Gerrit_cheatsheet> to
> >> people who are confused with git.
> >>
> >
> > +1, as someone who uses it rarely enough to /always/ have to consult the
> FAQs.
> > I compiled a list at https://www.mediawiki.org/wiki/Git/Tips#See_also
> > and posted at https://www.mediawiki.org/wiki/Topic:Ssg7cjp65lw1oc7y
>

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Skinning tutorial

2015-11-11 Thread S Page
On Tue, Nov 10, 2015 at 6:31 PM, Isarra Yos <zhoris...@gmail.com> wrote:

> On 11/11/15 02:06, S Page wrote:
>
>> I meant the on-wiki three-part skinning thing that's been
>> around for a while and introduces itself in a nutshell as "This page is
>> part 1 of a three-part tutorial".
>>
>
> Oh, I want to murder that. Have I mentioned that? I think I have, but if I
> haven't, I'D LIKE TO MURDER THAT. It doesn't even say how to do the
> important bits, while focussing on bits nobody in their right mind will
> ever even touch, with no real structure or sensible organisation. It's
> ludicrous.
>

I spent two hours adding a screenshot for the elements mentioned in
Skinning Part 1 [1]. Given that part 1 spends so much time describing
elements of a page, a picture seemed worth 1000 words. Is your thesis that
anyone who starts from the Example skin already has these elements and any
skin designer knows the ins and outs of wiki pages, so they don't need to
be mentioned? (
https://www.mediawiki.org/wiki/User:Isarra/How_to_make_a_motherfucking_skin
doesn't mention actions, personal tools, the page subheader, etc.) I'm
dubious.  Even if one had a set of test pages that exercise all these
elements, saying what they are seems useful.

[1] https://www.mediawiki.org/wiki/Manual:Skinning_Part_1

--
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Skinning tutorial

2015-11-10 Thread S Page
(A plug for https://phabricator.wikimedia.org/T114071 "Let's discuss the
skin creation process")

That and Isarra's tech talk inspired me to improve
https://www.mediawiki.org/wiki/Manual:Skinning : part 1 now has a picture
showing some components, part 3 no longer talks about old i18n, etc. I also
dragged https://www.mediawiki.org/wiki/Manual:QuickTemplate into 2015.

The tutorial is pretty good IMO, the problem is it forced repetition: part
1 "Here are some page elements you need to worry about"; part 2 "Let's
output some of those page elements"; part 3 "How to test these page
elements". But maybe that's OK.


On Tue, Nov 10, 2015 at 11:40 AM, Isarra Yos <zhoris...@gmail.com> wrote:

> As a bit of a follow up to the talk I did last week, I wrote up a tutorial
> on-wiki how to make a skin:
> https://www.mediawiki.org/wiki/User:Isarra/How_to_make_a_motherfucking_skin
> The plan is to eventually replace Manual:Skinning with that and some
> subpages with specific info, but if anyone wants to run through now, see if
> it's useful, try it out, see if anything is missing or wrong, I'd
> appreciate the help making it better.
>
> -I
>
> ps - Yes, I may have read motherfuckingwebsite.com a few too many times
> before writing that up. I, uh, sort of apologise.
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Skinning tutorial

2015-11-10 Thread S Page
On Tue, Nov 10, 2015 at 2:39 PM, Isarra Yos <zhoris...@gmail.com> replied:

> On 10/11/15 21:59, S Page wrote:
>
>> The tutorial is pretty good IMO, the problem is it forced repetition
>
>
> This may just be because I did it at three in the morning; it probably
> doesn't need that much. I'll see if I can clean that up a bit. >.>


D'oh, I'm sorry, I meant the on-wiki three-part skinning thing that's been
around for a while and introduces itself in a nutshell as "This page is
part 1 of a three-part tutorial".

Re:
https://www.mediawiki.org/wiki/User:Isarra/How_to_make_a_motherfucking_skin

I have a dream where both extensions/BoilerPlate and skins/Example are a
script that prompts for your extension or skin name, clones it into
YourProject, sets up the example as a non-master remote (so you can track
updates to the skeleton), and does the mindless file renames and search and
replace to YourProjectName. [1]

> Also you can probably delete any random things that say 'composer' or
> 'grunt' or whatever in the filenames. Not really sure why those are in the
> example skin. That's a bit confusing.
Nope, they're gold. It means you can run `npm test` and `composer test` and
automatically have a growing set of basic tests and coding checkers run for
you. See BoilerPlate's README.md [2]; maybe the same instructions should be
copied to skins/Example.

The good advice in your Step 4 and Testing could fit well into Skinning
Part 3.

[1] Legoktm built such a thing for Wikimedia libraries,
https://github.com/wikimedia/generator-wikimedia-php-library#readme
[2] https://github.com/wikimedia/mediawiki-extensions-BoilerPlate#readme

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] documentation files in git and .{media, }wiki extension (bikeshedding!)

2015-11-04 Thread S Page
No bikeshedding? Then .wiki; brevity wins [1], thanks.

On Wed, Nov 4, 2015 at 6:37 AM, Marcin Cieslak <sa...@saper.info> wrote:

> If both "README" and "README.mediawiki" were present


It feels odd to have both. We have this in core for historical reasons,
Krenair added:
   Symlink README.mediawiki to README so Github renders it as wikitext.
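(That is, roughly:

    ln -s README README.mediawiki

so the .mediawiki name is just a pointer at the real file.)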



> ... I'd tend to think
> that this is some imported piece of software and README is the upstream
> README file and README.mediawiki contains MW-specific bits (like
> import/upstreaming instructions, whatever). Something we might start
> to see in vendor/ eventually.
>

Good point, though e.g. README_MediaWiki.wiki would be less confusing if
they have different content.


> Somehow funny we cannot use our own markup where we need to :)
>

We can: again, GitHub will render some MediaWiki syntax in files with the
extension .wiki [2]. Phabricator's Diffusion renders them as plain text,
which is no worse than a file without an extension.
Also, there's a chance [3] we'll gain the ability to transclude doc files
in git into wiki pages for some DRY [4] win, e.g. with the Android README
that Stephen Niedzielski mentioned.

[1]
https://www.mediawiki.org/w/index.php?title=Manual%3ACoding_conventions&type=revision&diff=1931325&oldid=1931314
[2] https://help.github.com/articles/supported-mediawiki-formats/
[3] https://phabricator.wikimedia.org/T91626
[4] https://en.wikipedia.org/wiki/Don't_repeat_yourself

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MW 1.25 new extension registration - PHP constants

2015-11-02 Thread S Page
On Oct 30, 2015 11:06, "Florian Schmidt" wrote:

[Thanks for helping the developer community!]

> I'm wondering, if dependencies like yours are widely used and if
extending the requires section support to extensions would help to solve
your problem (with that you could specify a specific version, a range of
version or any version above a specific one, or all versions, too, so your
VIKIJS_VERSION wouldn't be needed anymore, too).

That sounds like
https://m.mediawiki.org/wiki/Requests_for_comment/Improving_extension_management
, also at https://phabricator.wikimedia.org/T88596 . From the task:

"on dependencies between extensions -- I don't think there is consensus on
whether to specify those dependencies in extension.json or composer.json
(TimStarling, 21:36:55)"

> Would you like to open a task in phabricator to discuss this?

I mentioned T88596 there.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] DISPLAYTITLE

2015-10-27 Thread S Page
On Fri, Oct 23, 2015 at 8:19 AM, Cindy Cicalese <cin...@gmail.com> wrote:

> I recently proposed a MediaWiki core change
> to a) encapsulate access to the page_props table in a class and b) extend
> the usage of a page’s displaytitle property so that it is also used as the
> link text for links to the page.
>

Part b) feels like a Request For Comment because as I understand it this
will change the way links behave. You should write a phabricator ticket for
b) and add the #MediaWiki-RfCs tag to it [1]. As for a), it seems your
gerrit patch is well along and I'm not sure adding one class rises to the
level of an architectural change that would prompt an RFC.

[1] https://www.mediawiki.org/wiki/Requests_for_comment/Process

Cheers,
-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHP CodeSniffer is now voting on MediaWiki core patches

2015-10-15 Thread S Page
If you're intrigued by all the code checking automation available,
Extension:BoilerPlate [1] implements both the `npm test` and `composer
test` entry points, and people have updated it to run jshint, jscs,
banana-checker, jsonlint, php-parallel-lint, and now PHP CodeSniffer.  Its
README [2] explains how to get rolling. So if you start a new extension
from BoilerPlate or copy over its files, your extension or skin benefits
from the same code checks.
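That is, once the files are in place, running the checks locally is just
(assuming Node.js/npm and Composer are installed):

    npm install && npm test
    composer update && composer test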

Then you can copy its Continuous Integration configuration [3] so that
Jenkins runs these tests on your extension or skin on each commit pushed to
gerrit.wikimedia.org.

[1] https://www.mediawiki.org/wiki/Extension:BoilerPlate
[2] https://gerrit.wikimedia.org/r/#/c/237058/4/README.md
[3] https://gerrit.wikimedia.org/r/#/c/226680/

On Sat, Sep 26, 2015 at 4:28 PM, Legoktm <legoktm.wikipe...@gmail.com>
wrote:

>
> Thanks to the hard work of a lot of different people, PHP CodeSniffer is
> now voting on all MediaWiki core patchsets! It checks for basic code
> style issues automatically


-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: [MediaWiki-l] An "advanced search" page for CirrusSearch?

2015-10-13 Thread S Page
On Mon, Oct 12, 2015 at 11:24 PM, Erik Bernhardson <
ebernhard...@wikimedia.org> wrote:

> We added a help link to Special:Search
>

Cool! It's blank on mediawiki.org; it seems it just takes an admin decision
to add MediaWiki:search-summary on mw.org linking to
[[Special:MyLanguage/Help:CirrusSearch]].

I filed https://phabricator.wikimedia.org/T115429 for Special:Search to use
the standard addHelpLink() indicator for this help.

It seems a search expression builder would be hard unless it really knows
the exact syntax and escaping rules that the Cirrus back-end supports, but
a helper that inserts the prefixes could work.

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Port mw-vagrant to Raspberry Pi ( arm )

2015-09-29 Thread S Page
Congratulations on the container, mention it on mw.org somewhere. It seems
with that you've got half the battle won. MediaWiki-vagrant's roles let you
easily add features to a development wiki, but maybe instead of trying to
get all MediaWiki-vagrant's vagrant and puppet machinery running in your
image, you could publish different containers and your recipe for making
more of them. I imagine some of the roles like browsertests would bring an
RPi to its knees, so you would have to blacklist some of them.

The most useful day-to-day MW-vagrant feature IMO is `git-update`; it would
be good to offer that as a standalone script.

(Last time I looked the docker catalog had dozens of MediaWiki containers,
I don't know how people choose one).
On Sep 29, 2015 10:05, "Tony Thomas" <01tonytho...@gmail.com> wrote:

> Hello,
>
> I was looking at some statistics of school students ( < 17 years )
>  participation from my state in Open Source program like Google Code In,
> and it is ~0. The Government here has initiated a project to distribute
> Raspberry Pi for school students[1], and it would be great to have them
> setup a Mediawiki development environment with the Pi so that they can
> contribute.
>
> The Pi's have 1 Gig ram, and I got a docker container of Ubuntu ( arm )
> running smooth. There are few blockers to install MW-Vagrant or the LXC
> container, which are:
>
>1.  The puppet config for mw-vagrant needs a 64-bit Ubuntu 14.04
>container to run inside
>2.  mv-vagrant has a lot of bells and whistles that make it
>really want a lot of ram and CPU
>3.  hhvm is too ram hungry
>
> and lot more. The other option will be to setup a LAMP stack, which would
> need to be automated ( need scripts ). I wanted to know if this porting
> would be feasible, and worth the development hours, and specifically - if
> someone is interested.
>
> [1]
>
> http://gadgets.ndtv.com/others/news/kerala-launches-learn-to-code-pilot-will-distribute-raspberry-pi-kits-662412
>
> Thanks,
> Tony Thomas 
> ThinkFOSS 
>
> *"where there is a wifi, there is a way"*
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Interested in working on a WikiWidget for algorithm visualization

2015-09-28 Thread S Page
Hi. You're getting advice on how to add a new "algorithm visualization"
content type to MediaWiki. But...

On Fri, Sep 11, 2015 at 2:52 PM, Daniel Moisset <dmois...@machinalis.com>
wrote:

>
> I'm one of the developers of thewalnut.io, a platform for authoring
> and
> sharing algorithm visualizations. ... creating a wikiwidget that can
> somehow integrate the content built in the walnut


"somehow" is the part you need to flesh out.

It seems a first step is simply to have algorithms on Wikipedias link to
your external platform. That's just adding a link to the == See also ==
section.

If you want a wiki widget to present a screenshot of the visualization
that, when clicked, opens an external link to "See an interactive
visualization of this algorithm", that is just a
Template:External_Walnut_visualization with parameters for the animated GIF
and the URL of the walnut.io visualization. Using it extensively seems a
per-wiki decision. You can
discuss and promote your idea in each wiki's technical area (e.g. "Village
Pump (technical)" [1]) and relevant WikiProjects [2].
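To be clear, that template and its parameters don't exist yet; a
hypothetical invocation might look like:

    {{External Walnut visualization
    | image = Some-algorithm-animation.gif
    | url   = https://thewalnut.io/visual/...
    }}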

External coordinate linking might be instructive. E.g. click the "37.787°N
122.4°W" in many articles [3], and a "GeoHack" window pops up that links to
interesting mapping services, both free and commercial. They didn't
implement "Offline animated image rendering solution" and such :-). In
general, wiki communities look more favorably on open presentation of
multiple services for data than blessing a single provider.

The third step is embedding the walnut.io interactive visualization of its
data format in the page, which could be a big on-wiki JavaScript gadget but
is probably best implemented as an extension implementing a new content
type. At that level everything that others said about FLOSS and licenses
and fallbacks does apply.

Cheers, sorry if I completely misunderstood what you're proposing... which
is a good reason to write it up as a subpage of your user page on meta-wiki
;-)

[1] https://en.wikipedia.org/wiki/Wikipedia:Village_pump_%28technical%29
[2]
https://en.wikipedia.org/wiki/Wikipedia:WikiProject_Directory/Science,_technology,_and_engineering_WikiProjects
[3] https://en.wikipedia.org/wiki/140_New_Montgomery

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Question: Should wikimedia wikis such as wikipedia be updated with a user frendly desgn

2015-09-09 Thread S Page
(adding Design list)

On Wed, Sep 9, 2015 at 9:06 AM, Magnus Manske <magnusman...@googlemail.com>
wrote:

> Whatever happened to Winter? It's not coming, AFAICT...
>

Yes. See the "Winter" thread on the design mailing list:
https://lists.wikimedia.org/pipermail/design/2015-July/thread.html

There are three avenues for appearance development
* User JS+CSS and gadgets.
* Iterate on Vector as Beta features (sadly "Compact personal bar" and
"Fixed header" are no longer offered as Beta Features due to bugs).
* Prototype and refine new skins.

Regards,


> On Wed, Sep 9, 2015 at 2:58 PM Ricordisamoa <ricordisa...@openmailbox.org>
> wrote:
>
> > There is some work about getting the mobile skin on desktop (T71366
> > <https://phabricator.wikimedia.org/T71366>), and Blueprint
> > <https://www.mediawiki.org/wiki/Skin:Blueprint> powers the Living Style
> > Guide.
> > However, no matter how ancient Vector may look, the little updates it
> > has received over the years make me think it isn't that bad for those
> > who use it.
> >
> > Il 08/09/2015 19:53, Thomas Mulhall ha scritto:
> > >   Hi this is a question but shoulden Wikimedia wikis such as Wikipedia
> > be updated with a user friendly design. Currently vector is coming out of
> > date because now a days you see sites with bruitiful colours not old ones
> > as they were in 2010 when vector came out. We could create another skin
> to
> > replace vector as we did with monobook or update vector with a new look
> > that has bold colours and goes along with mediawiki code and is also
> mobile
> > optimised even though we have the mobilefrontend extension some users may
> > not want to install instead hoping the skin is mobile optimised.
>



-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Rotating pictures idea

2015-09-07 Thread S Page
A popular enough topic that Google found "13 Best jQuery 360 Degree Image
Rotation Plugins" [1] :-)
I don't think there's a ResourceLoader module for this already in
core/resources; maybe some wiki already has a user script.

[1] https://codegeekz.com/best-jquery-360-degree-image-rotation-plugins/

On Fri, Sep 4, 2015 at 11:25 AM, Ole Palnatoke Andersen <palnat...@gmail.com
> wrote:

> I talked to Charlotte SH Jensen at the National Museum in Copenhagen
> yesterday.
>
> We've been talking and doing stuff for several years, so we have a
> huge backlog of ideas that didn't work, but we're still able to find
> new things to do.
>
> One idea that came up yesterday is rotating pictures on Wikipedia. We
> hope to get people to do something at the #hack4dk hackathon in early
> October.
>
> See the idea at
> https://hackdash.org/projects/55e99d6174d6ac1d21451575, and more about
> #hack4dk at http://hack4.dk/
>
>
> Regards,
> Ole Palnatoke Andersen, Wikimedia Danmark
>
> --
> http://palnatoke.org * @palnatoke * +4522934588
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Collaboration team reprioritization

2015-09-02 Thread S Page
1. Regarding Flow and LQT on mediawiki.org:

On Tue, Sep 1, 2015 at 11:21 PM, Federico Leva (Nemo) <nemow...@gmail.com>
wrote:

> Can we reverse the Flow conversion on mediawiki.org now,


Technically, I think that would be challenging. All LiquidThreads threads
carefully redirect to Flow topics, e.g.
https://www.mediawiki.org/wiki/Thread:Project:Current_issues/Adding_a_dev_namespace_for_%22Data_and_developer_hub%22_articles
.
Extension:LiquidThreads is still enabled on mw.org, but
$wgLiquidThreadsFrozen is true.


> so that the wiki stays on the luckiest side i.e. the extension which has
> most users and is most likely to survive in the future?
> (LQT is maintained by its non-Wikimedia users, otherwise it would have
> broken down years ago on all Wikimedia wikis as well.)


We can have a conversation about the future of talk pages on mediawiki.org
, where you can propose unfreezing LQT. I guess
https://www.mediawiki.org/wiki/Project:Current_issues is the place for it,
or a Phab task. I'm going to talk to Danny Horn first. FWIW I far prefer
Flow for feature discussions.


2. IMO, Flow fully achieved two points on
https://www.mediawiki.org/wiki/Flow?oldid=1870919 [1]; and it's more
efficient for me. The recent DWIM ("Do what I mean") improvements while
you're writing a post are a delight. Thanks Collaboration team <3 !

-- 
=S Page  WMF Tech writer


[1] > The main goals for the Flow project are:

   - to make the wiki discussion system **more accessible for new users**
   - to make the wiki discussion system **more efficient for experienced
   users**
- to encourage **meaningful conversations** that support collaboration
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Office Hour about mediawiki.org API pages Tuesday Sept 1

2015-08-28 Thread S Page
Tuesday 2015-09-01T18:00 UTC [1] (11am San Francisco time) we will be
having an IRC office hour [2] about the mw.org API: namespace.

* As Wikimedia adds other APIs such as Wikidata Query Service and RESTBase,
the structure and navigation needs to evolve.
* Also, as you may know we are writing pages to encourage third-party
developers to use Wikimedia data and APIs, and eventually want this Web
APIs hub [3] to become the front door rather than API:Main_page.
** and presented with a different appearance to new developers  [4]

If you have thoughts, opinions, or needs from the API pages, I hope you can
attend. The last bullet point is challenging and controversial, so it helps
to share ideas and approaches.

The Phab task https://phabricator.wikimedia.org/T110108 links to related
tasks. It's also a Phab event https://phabricator.wikimedia.org/E50 and on
the internal WMF Engineering calendar.

See you in irc:freenode.net#wikimedia-office

[1]
http://www.timeanddate.com/worldclock/fixedtime.html?hour=18&min=00&sec=0&day=01&month=09&year=2015
[2] https://meta.wikimedia.org/wiki/IRC_office_hours#Upcoming_office_hours
[3] https://www.mediawiki.org/wiki/API:Web_APIs_hub
[4] http://devhub.wmflabs.org/wiki/API:Web_APIs_hub

-- 
=S Page  WMF Tech writer, and Quim Gil Engineering Community manager
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Issue with Gadget dependencies and ResourceLoader

2015-08-23 Thread S Page
On Thu, Aug 20, 2015 at 7:26 AM, Bartosz Dziewoński <matma@gmail.com>
wrote:

> When you load a script through ResourceLoader, it's not executed in global
> context. This means that global variables you define are actually *not
> global* (they are local to the function your code is wrapped in), unless
> you explicitly assign them as `window` properties.


I added a section on this, "Global variables are not global", to
https://www.mediawiki.org/wiki/ResourceLoader/Migration_guide under
MediaWiki 1.26, with a pointer to
https://www.mediawiki.org/wiki/Manual:Coding_conventions/JavaScript#Globals
, where the latter says "Only mediaWiki
<https://www.mediawiki.org/wiki/RL/DM#MediaWiki> and jQuery
<https://www.mediawiki.org/wiki/RL/DM#jQuery> should be used (in addition
to the browser's native APIs)."

The latter doesn't suggest creating an object with mediaWiki/mw. I added

> You should expose your code's functionality to other clients as functions
> and properties of an object within mediaWiki, e.g. mediaWiki.echo, and
> possibly as documented mw.config configuration variables.


but surely there's a page that talks about this idiom. Are there any
gadgets that add an object within mediaWiki? If we were to rewrite
morebits.js from scratch, wouldn't it be better to create
mediaWiki.moreBits.{quickForm, simpleWindow, ...} rather than
window.MoreBits?
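Concretely, I'm imagining gadgets doing something like this (moreBits here
is hypothetical, just to illustrate the idiom):

    ( function ( mw, $ ) {
        var moreBits = {
            quickForm: function () { /* build a form */ },
            simpleWindow: function () { /* open a dialog */ }
        };
        // One documented object instead of implicit globals:
        mw.moreBits = moreBits;
    }( mediaWiki, jQuery ) );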

Cheers,
-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Geohack tools

2015-08-17 Thread S Page
On Thu, Aug 6, 2015 at 11:01 PM, Pine W <wiki.p...@gmail.com> wrote:

> I just now realized how powerful these tools are when I started clicking
> around.
>
> https://tools.wmflabs.org/geohack/geohack.php?pagename=File%3AWhite-cheeked_Starling_perching_on_a_rock.jpg&params=34.610576_N_135.540542_E_globe:Earth_class:object&language=en


The Wikidata nearby map is pretty incredible, so. many. schools.
At the same magnification the coordinates popup on
https://en.wikipedia.org/wiki/File:White-cheeked_Starling_perching_on_a_rock.jpg
shows fewer items; I assume it's a subset of enwiki articles with
coordinates that are nearby.

--
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Mediawiki-api] Bikeshedding a good name for the api.php API

2015-08-10 Thread S Page
tl;dr: PHP action API

I'm organizing content in the mediawiki.org API namespace,
https://phabricator.wikimedia.org/T105133 , and so back to this bikeshed
from August 2014.

We do now have the extra APIs. https://en.wikipedia.org/api/ names them
  *  PHP action API
  *  REST content API

I don't know who came up with the first name. I like it; it straddles Brad
Jorsch's "seems like action API and api.php are the two contenders".

I'm changing the API navigation accordingly,
https://www.mediawiki.org/wiki/Template:API but the shed isn't going
anywhere :)

FWIW in writing documentation, I've found "the core API" is misleading
because extensions add API modules to it. Is Wikidata part of "the core
API" when only one wiki implements all its wbXXX modules? A lot of API
clients rely on the extracts and pageimages modules, but they're not part
of core.

Cheers,

 it was twelve months ago... 

On Fri, Aug 15, 2014 at 10:00 AM, Tyler Romeo tylerro...@gmail.com wrote:

 Agreed with Aaron. When these proposed additional APIs are actually
 implemented, then we can start arguing about what to call them.

 I know that I personally will continue to call the API the “core web API”
 or sometimes just the “web API”, if it is clear based on the context in
 which I am talking.
 --
 Tyler Romeo
 0x405D34A7C86B42DF

 From: Aaron Halfaker ahalfa...@wikimedia.org
 Cc: MediaWiki API announcements & discussion
 <mediawiki-...@lists.wikimedia.org>
 Subject:  Re: [Wikitech-l] [Mediawiki-api] Bikeshedding a good name for
 the api.php API

 As a heavy user, I generally just refer to the things api.php does as the
 API. or MediaWiki's web API when I'm feeling verbose.

 I'd be confused about the action API since I generally use it to read
 which isn't really action -- even though it corresponds to action=query

 As for the proposed REST API, I don't think that proposed things should
 affect the naming scheme of things we already know and love.

 Also, I think that all bike sheds should match the color of the house to
 (1) denote whose bike shed it is and (2) help tie the yard together like
 furniture in a living room.


 On Fri, Aug 15, 2014 at 3:50 PM, Sumana Harihareswara 
 suma...@wikimedia.org
  wrote:

  I like action API.
 
  Sumana Harihareswara
  Senior Technical Writer
  Wikimedia Foundation
 
 
  On Thu, Aug 14, 2014 at 5:06 PM, Brad Jorsch (Anomie) 
  bjor...@wikimedia.org
   wrote:
 
   Summing up, it seems like action API and api.php are the two
   contenders.
  
   api.php is least likely to be confused with anything (only its own
  entry
   point file). But as a name it's somewhat awkward.
  
   action API might be confused with the Action class and its
 subclasses,
   although that doesn't seem like a big deal.
  
  
   As for the rest:
  
   Just API is already causing confusion. Although it'll certainly
  continue
   to be used in many contexts.
  
   MediaWiki API, Web API, and MediaWiki web API are liable to be
   confused with the proposed REST API, which is also supposed to be
   web-accessible and will theoretically part of MediaWiki (even though
 I'd
   guess it's probably going to be implemented as an -oid). MediaWiki web
   APIs may well grow to encompass the api.php action API, the REST API,
  and
   maybe even stuff like Parsoid.
  
   MediaWiki API and Core API are liable to be confused with the
 various
   hooks and PHP classes used by extensions.
  
   JSON API wouldn't be accurate for well into the future, and would
  likely
   be confused with other JSON-returning APIs such as Parsoid and maybe
  REST.
  
   Classic API makes it sound like there's a full replacement.
  
   All the code name suggestions would be making things less clear, not
  more.
   If it had started out with a code name there would be historical
 inertia,
   but using a code name now would just be silly.


-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Wikitech-ambassadors] [BREAKING CHANGE] Use of document.write no longer supported

2015-08-07 Thread S Page
On Thu, 06 Aug 2015 01:24:03 +0200, Krinkle <krinklem...@gmail.com> wrote:

> TL;DR: Double-check your wiki's site scripts and your personal scripts
> to ensure document.write is no longer used.
> ...
> Check out the migration page for other deprecations and common issues you
> may encounter:
> https://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_(users)


I notice people have added document.write() migration advice to its Good
practices section [1] (thanks! <3). There's other good advice there.

Our JavaScript coding conventions barely mentioned ResourceLoader, so I
added a "use the RL, Luke!" section [2] with a link to those Good
practices.

Related, I added some tips on the somewhat black arts of RL debugging to
Developing with ResourceLoader [3].

[1]
https://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_%28users%29#Good_practices
[2] https://www.mediawiki.org/wiki/Manual:Coding_conventions/JavaScript
[3]
https://www.mediawiki.org/wiki/ResourceLoader/Developing_with_ResourceLoader#Debugging

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] compressed old_text

2015-07-29 Thread S Page
On Wed, Jul 29, 2015 at 3:48 PM, Kent/wp mirror <wpmirror...@gmail.com>
wrote:

> When I build a mirror, I would like to compress the text
> <text>...plaintext...</text> to get:
>
> old_text: ciphertext
> old_flags: utf-8,gzip
>
> I would like this done for every text revision, so as to save both disk
> space...


Maybe https://www.mediawiki.org/wiki/Manual:Reduce_size_of_the_database
will help. maintenance/storage/compressOld.php will compress older
revisions, optionally using gzip, and you can set the parameters to
compress every revision.
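For example, something like this should gzip old revisions (going from
Manual:CompressOld.php; check --help for the exact options in your
version):

    php maintenance/storage/compressOld.php --type=gzip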

Did you set $wgCompressRevisions in your installation before importing? I'm
not sure if that has effect when building a mirror. It feels like it
should, and/or importDump.php should have some option to compress all
revisions imported; you could file a bug in Phabricator.

> and communication bandwidth between web server and browser.


If I understand you correctly, that's a separate issue. MediaWiki doesn't
send compressed page data to the browser, it sends HTML. However, most
browsers send the
  Accept-Encoding: gzip, deflate
HTTP header, and in response most web servers will gzip the HTML of
MediaWiki pages and other web content. To verify, load a page from your
wiki in your browser and look in your web browser's developer tools'
Network tab for the request and response headers; the latter will probably
have
  Content-Encoding: gzip
Or you could do something like `curl -H 'Accept-Encoding: gzip, deflate'
--dump-header - http://localhost/wiki/Main_Page | less` and see what you
get.

> 2) Problem
>
> There is little relevant documentation on https://www.mediawiki.org. So I
> have run a few experiments.
>
> exp1) I pipe the plaintext through gzip, escape for MySQL, and build the
> mirror.


I wouldn't try to do this yourself. If importing with $wgCompressRevisions =
true doesn't do what you want and you don't want to run a compressOld.php
maintenance step afterwards, I would suggest modifying some PHP somewhere,
solely during the import to your mirror, to encourage MediaWiki to
compress every revision.


> Please provide documentation as to how mediawiki handles compressed
> old_text.
> a) How is plaintext compressed?


From looking at core/includes/Revision.php, if PHP's gzdeflate() exists
then MediaWiki will use it to compress the contents of old_text.
http://php.net/manual/en/function.gzdeflate.php has some documentation on
how the function works.


> b) Is the ciphertext escaped for MySQL after compression?

No idea, old_text is a mediumblob storing binary data. As I understand it
escaping applies only to transfer in and out of the DB.

> c) How does mediawiki handle old_flags=utf-8,gzip?
> d) How are the contents of old_text unescaped and decompressed for
> rendering?
> e) Where in the mediawiki code should I be looking to understand this
> better?


As above, PHP's gzdeflate/gzinflate in Revision::compressRevisionText() and
decompressRevisionText() in core/includes/Revision.php
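To summarize in code, here's a hedged paraphrase (not the verbatim core
code) of what those two functions do with the gzip flag:

    // Sketch of Revision::compressRevisionText(); returns the old_flags value.
    function compressText( &$text ) {
        $flags = array( 'utf-8' );           // text is stored as UTF-8
        if ( function_exists( 'gzdeflate' ) ) {
            $text = gzdeflate( $text );      // raw DEFLATE, despite the flag name
            $flags[] = 'gzip';
        }
        return implode( ',', $flags );
    }

    // Sketch of Revision::decompressRevisionText().
    function decompressText( $text, array $flags ) {
        if ( in_array( 'gzip', $flags ) ) {
            $text = gzinflate( $text );
        }
        return $text;
    }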

Hope this helps. I didn't know anything about this 25 minutes ago :)

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Geolocation and WikiVoyage

2015-07-21 Thread S Page
There's an article about getting pages nearby with action=geosearch,
http://devhub.wmflabs.org/wiki/API:Showing_nearby_wiki_information [1]; I
welcome feedback.
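For example, a query like this (list=geosearch comes from the GeoData
extension; gscoord is "lat|lon" and gsradius is in meters) returns pages
near a point:

    https://en.wikipedia.org/w/api.php?action=query&list=geosearch&gscoord=37.7891|-122.4005&gsradius=1000&gslimit=10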

On Jul 21, 2015 10:40, Sylvain Arnouts <sylvain.arno...@wanadoo.fr> wrote:

> I know it's possible with Wikidata API to enter a position (longitude and
> latitude), a range, and have all Wikipedia pages around.

How? I don't see such a feature in the wikidata modules for the MediaWiki
API at https://www.wikidata.org/w/api.php , and I wasn't aware of a
Wikidata query service for querying within a range of coordinate location.

[1] The same content is also at
https://www.mediawiki.org/wiki/API:Showing_nearby_wiki_information
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Data and Developer Hub protoype

2015-07-06 Thread S Page
On Sun, Jul 5, 2015 at 10:41 PM, Brian Wolff bawo...@gmail.com wrote:

 First of all, the links at the beginning, should not go directly to
 the projects in question, they should go to pages explaining how to
 use those projects in question on the outside.


Agreed, T104282 'Create landing pages for the free open knowledge sources
on the Data and Developer Hub.'


 If they wanted to just
 visit the wiki, they would have done that (Perhaps, this was already
 planned, and the current version is still just an early draft with not
 all the pieces in place yet?)

 Second, the existing showcased projects, seem to much like the sort of
 thing someone making a mobile Wikipedia App would want. Most people
 probably don't want article excerpts in their search results (I assume
 anyways). Most people aren't searching through a list of Wikipedia
 articles, unless they are wikipedia or related to wikipedia.

 But we do have one of the largest collections of (mostly) organized
 knowledge available for free (In both senses of the word). This is
 valuable, and quite unique on the internet. We should capitalize on
 this.

 Things like Show a short snippet about this topic from Wikipedia (+
 a link to more information) could be quite useful to many people.


Sure, that's Hovercards. It's a subset of
http://devhub.wmflabs.org/wiki/API:Page_info_in_search_results#Showing_useful_page_information
and I mention it at the end. Should we provide more than sample API calls?
Would a JS module that formats the results be useful?
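For reference, the Hovercards-style data is roughly one request against the
TextExtracts and PageImages modules, e.g.:

    https://en.wikipedia.org/w/api.php?action=query&prop=extracts|pageimages&exintro=&explaintext=&piprop=thumbnail&titles=Michael_Jackson&format=json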

The interesting challenge is how to identify this topic. Will websites
manually link to Michael Jackson (radio commentator)
https://en.wikipedia.org/wiki/Michael_Jackson_%28radio_commentator%29 or
just link Michael Jackson and hope for the best? Should we be
evangelizing wikidata numbers like Q6831566 for external apps?

Another thing sites can do is related content, I just learned about
CirrusSearch's morelike: search operator.
https://en.wikipedia.org/w/api.php?action=query&list=search&srsearch=morelike:Michael_Jackson&srlimit=10&srprop=size&formatversion=2
.  Such page retrieval and searches all tend to query for lead image and
textextract or wikitext description, so articles on them would overlap, but
that's OK.

Commons is another great resource because its information can be
 easily broken up into digestible parts like a single image (Which is
 much harder for a Wikipedia article). I think things like
 https://www.mediawiki.org/wiki/PhotoCommons which would allow a
 website operator to quickly allow their users to add stock photos to
 whatever it is their users do, is a good thing to focus on.

 Wikidata seems almost custom made for the type of user who would like
 to add cusom knowledge to their website.


https://www.mediawiki.org/wiki/Dev.wikimedia.org/Contributing , you can
propose these and if you know of examples doing this, even better.


 The other thing that should definitely be on the dev hub, is probably
 a link to our terms. We should emphasize that you can use your data,
 and we generally don't track you the way a facebook like button does.
 That you don't need an api key or anyone's permission. Of course we
 should also state what you do need to do (Give credit/follow license,
 set a user-agent header)


Yup, T317 'Terms of Use must be prominently featured in the Developer
Hub'. I was thinking of mentioning it in the three articles and adding it
to the footer, but maybe that's not prominent enough.

Great feedback, thanks.
-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Data and Developer Hub protoype

2015-06-26 Thread S Page
Thanks for the feedback (seriously).

On Fri, Jun 26, 2015 at 5:53 AM, MZMcBride z...@mzmcbride.com wrote:

 More to the point: why not just make dev.wikimedia.org a round robin that
 points to either https://meta.wikimedia.org/wiki/Research:Data or
 https://www.mediawiki.org/wiki/How_to_contribute? I think it would be a
 lot more valuable like that.


How to contribute is the golden arch for people who want to contribute to
free open knowledge. dev.wikimedia.org is for people who want to _take_ it;
as I wrote, we hope some of the latter shift to the former. It's tricky to
discern how and when to reel 'em in. After T101441 "Integrate new Developer
hub with mediawiki.org", I think these pages can become the Web API column
under How to contribute, and we can reduce some redundant redundancy.

P.S. If it sounds as though I'm frustrated, it's probably due to the
 limited design resources we have being allocated pretty much exclusively
 to dubious microsites like the Transparency Report and this Data and
 Developer Hub, instead of having design resources devoted to, y'know,
 improving the real sites that receive billions of views per month.


Sure, that's a valid point (though exclusively is an exaggeration). The
absence of data beyond Luis' design matters makes the decision to do yet
another site hard and frustrating. I lack the Eloquence to present it
calmly; I can do levity and nobody dies if we made the wrong call, but I'm
not trivializing it.

Developing a new skin ought to be easier (obviously!). AIUI there isn't a
common approach underpinning Blueprint, Minerva (the mobile web skin),
Winter, and the Fixed Header and Compact Personal Bar beta features, though
the people who worked on them and other skins built up a lot of expertise.
https://www.mediawiki.org/wiki/Requests_for_comment/Redo_skin_framework
seems stalled.

Regards,
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Data and Developer Hub protoype

2015-06-25 Thread S Page
http://devhub.wmflabs.org is a prototype of the Data and developer hub, a
portal and set of articles and links whose goal is to encourage third-party
developers to use Wikimedia data and APIs. Check it out, your feedback is
welcome! You can comment on the talk page of the project page
https://www.mediawiki.org/wiki/dev.wikimedia.org , or file Phabricator
tickets in the project dev.wikimedia.org [1].

Since December 2013 Moiz Syed and others discussed creating a thing to
expose our APIs and data to developers. When S Page moved to WMF tech
writer, he wrote some articles for this on mediawiki.org and with Quim Gil
developed a landing page from the wireframe designs [2].

The prototype is using the Blueprint skin and running on a labs instance,
but the articles are all regular wiki pages on mediawiki.org that we
regularly import to http://devhub.wmflabs.org

Thanks to everyone who participated in the gestation of this idea!
  -- S Page and Quim Gil

== FAQ ==

Q: How can I feature my awesome API or data set?
A: Create a task in the #dev.wikimedia.org and #documentation projects [3]
with Article in the title. You can draft an article yourself, following
the guidelines [4].

Q: Yet another site? Arghh!
A: Agreed, T101441 Integrate new Developer hub with mediawiki.org [5].
It's a separate site for now in order to present a different appearance.

Q: But why a different appearance? Why a separate skin?
Our competition for developer mindshare is sites like
https://developers.google.com/ . We believe looking like a 2000s wiki page
is a *deterrent* to using Wikimedia APIs and data. We hope that many
third-party developers join our communities and eventually contribute to
MediaWiki, but How to contribute to MediaWiki [6] is not the focus,
providing free open knowledge is.

Q: Why the Blueprint skin?
A: The Design team (now Reading Design) developed it for the OOUI Living
Style Guide [7] and it has some nice features: a fixed header, and a
sidebar that gets out of the way and combines page navigation and the TOC
of the current page.

Q: So why not use the Blueprint skin on mediawiki.org?
A: Agreed, T93613 Deploy Blueprint on mediawiki.org as optional and
experimental skin is a blocker for T101441. We appreciate help with it and
its blockers.

Q: I hate the appearance.
A: That's not a question :) You can forget the prototype exists and view
the same content at
https://www.mediawiki.org/wiki/API:Data_and_developer_hub

Q: What is dev.wikimedia.org?
A: http://dev.wikimedia.org will be the well-known shortcut to the landing
page. And dev.wikimedia.org is the project name for this Data and
developer hub.

Q: I thought dev.wikimedia.org was going to integrate source
documentation/replace doc.wikimedia.org/enumerate all Wikimedia software
projects/cure cancer, what happened?
A: One step at a time. For now, its goal is, to repeat, to encourage
third-party developers to use Wikimedia data and APIs.

Q: Why are the pages in the API: namespace?
A: That's temporary, they will probably end up in a dev: namespace on
mediawiki.org that uses the Blueprint skin by default (T369).

Q: Where are the talk pages?
A: It's a bug that the sidebar doesn't have a Discussion link (T103785).
The talk pages on the prototype all redirect to the talk pages for the
original pages on mediawiki.org, and Flow is enabled on them.

[1]
https://phabricator.wikimedia.org/maniphest/task/create/?projects=dev.wikimedia.org
[2] https://www.mediawiki.org/wiki/Dev.wikimedia.org#Structure
[3]
https://phabricator.wikimedia.org/maniphest/task/create/?projects=dev.wikimedia.org,documentation
[4] https://www.mediawiki.org/wiki/dev.wikimedia.org/Contributing
[5] https://phabricator.wikimedia.org/T93613 and its blockers
[6] https://www.mediawiki.org/wiki/How_to_contribute (a fine general entry
point)
[7] http://livingstyleguide.wmflabs.org/
-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] Modernizing our content platform: Kick-off meeting on Tuesday

2015-06-24 Thread S Page
On Tue, Jun 23, 2015 at 10:01 PM, Dan Garry dga...@wikimedia.org wrote:

 My takeaway from this was that there are strong arguments both for and
 against keeping the representation of an article as a single blob of
 wikitext/HTML.


In the Architecture focus discussion, Krinkle, tgr, and others expressed a
more optimistic idea, that the wikitext of an article is a sea of prose
with isles of non-prose content, which could come from structured data [1].

I wasn't properly aware that the <graph> [2] and <templatedata> [3] parser
tags are examples of this already. Their content is highly structured and
VisualEditor or dedicated code can provide a specialized editor for it. If
you edit the source of a wiki page containing them and garble their
content, you get a syntax warning or failure.

Wiki pages need more of these, *drumroll* Structured Content Blobs™ (or
sPage Components? anything but widget). Are they necessarily parser tags,
or is a parser function like {{#graph: *some parameters*}} equivalent?

To be concrete, does this mean the way forward for specifying lead images
[5] is a parser tag
<leadimage>{
imagepage:   File:Einstein_1921_by_F_Schmutzer_-_restoration.jpg,
focalarea: {rect: [0.20, 0.20, 0.12, 0.12]}
}
</leadimage>
in wikitext, with a MediaWiki API to add this to a document and a WYSIWYG
property editor in VisualEditor for humans?

AIUI, the *Architecture focus 2015* document discusses this under
Generalized transclusion [4] and comments:
Over the next months, the MediaWiki developer community and staff should
investigate how the different transclusion mechanism used with wikitext
content can be unified and extended to work with non-wikitext content.

Exciting stuff.

[1]
https://tools.wmflabs.org/meetbot/wikimedia-office/2015/wikimedia-office.2015-06-24-21.04.log.html
starting at 21:21:05
[2] e.g. https://www.mediawiki.org/wiki/Extension:Graph/Demo/Map?action=edit
[3] e.g. https://www.mediawiki.org/wiki/Template:Phabricator?action=edit
[4]
https://www.mediawiki.org/wiki/Architecture_focus_2015#General_architectural_concerns
[5] https://phabricator.wikimedia.org/T91683

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] which title to use in skin data?

2015-06-23 Thread S Page
The Blueprint skin uses data['title'], which doesn't work well [1] if a
page uses

{{DISPLAYTITLE:<span style="blah blah">{{FULLPAGENAME}}</span>}}

to hide its title as suggested [2].

Manual:Skinning [3] doesn't explain where data['*somekey*'] comes from or
what the keys are. No problem, in a skin if you var_dump( $this->data )
there are a dozen other title-like keys to choose from. But how do skin
developers know which one to use? titletxt, titletext, titleprefixeddbkey,
thispage, ...  Some are set(), some are setRef(), is that significant?

It seems the best you can do is read the source code of
includes/skins/SkinTemplate.php and work back through its getOutput(),
getTitle(), and getRelevantTitle() to OutputPage and the maze of title
class methods to figure out which one does what you want. Too hard for me.

Thanks for any suggestions, I'll try to improve the documentation.

[1] https://phabricator.wikimedia.org/T103454
[2]
https://www.mediawiki.org/wiki/Manual:FAQ#How_do_I_hide_the_main_page_title.3F
[3] https://www.mediawiki.org/wiki/Manual:Skinning

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] API BREAKING CHANGE: Default continuation mode for action=query will change at the end of this month

2015-06-18 Thread S Page
On Thu, Jun 18, 2015 at 9:26 AM, Brian Gerstle bgers...@wikimedia.org
wrote:

 I guess it comes down to is this: if we're going to continue supporting
 old behavior, they should be accessible via the same old requests.  *This
 removes the need for existing clients to be updated in the first place*.
 If we eventually want to delete the old code keeping the old behavior
 separated from the new will make it clear  explicit what's being dropped
 and what to use instead. ...

 Lastly, changing the default behavior to make things sane for new
 developers is, IMO, a bad trade-off


That seems the crux of it. Because the MediaWiki API isn't versioned (and
there are good reasons for it), the recommended best practices form of an
API request evolves, something like

api.php?continue=&formatversion=2&utf8=&*your_actual_params*

and over time the best practices boilerplate at the front gets longer
unless we change a default and break old clients. Examples and
documentation should show best practices; T103015 is to use formatversion=2
and https in all example API requests. (We should have used continue= in
examples for the last year, it's too late now.)

The above is actually a real URL and shows three different approaches:
1. As the e-mail subject says we're going to make continue= the default in
a few weeks so you won't need to add it (but clients MUST add rawcontinue=
to get the old behavior).
2. formatversion=2 is the future but won't be the default for a while.
3. If you request formatversion=2 then results default to utf8, so you
don't need to specify utf8.  (Note formatversion=2 only applies to
format=json or php.)

Which approach to take is a judgement call, I'm interject-opinion-reluctant
:)



 because they'll eventually get tripped by us pulling the rug out from
 under their feet by *breaking backwards compatibility stable APIs*.


Or, over time the best practices boilerplate endlessly expands:
responselayout=clean reporterrors=schema facebookoverlordmode= ... Does
that make our API user-hostile? IMO it just makes it wordy.



 Those sorts of changes should be reserved for experimental or even beta
 APIs.  Continuing queries seems like a stable—and pervasive—part of the API.


Cheers,
-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Overriding Jenkins versus make-work patches

2015-06-17 Thread S Page
#1 sounds right, Jenkins serves us, not vice versa. The core change's
commit message should reference the two required extension changes.
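For example, a hedged sketch with hypothetical subjects and Change-Ids:

    Remove the deprecated FooBar hook

    The matching extension updates are I1a2b3c... (Extension:Foo) and
    I4d5e6f... (Extension:Bar); merge them immediately after this change.

    Change-Id: I7890ab...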
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Feedback requested on our search APIs

2015-06-08 Thread S Page
On Mon, Jun 8, 2015 at 4:16 PM, Brian Wolff bawo...@gmail.com wrote:

 The search api (by which I mean query=search in api.php) is somewhat
 poorly documented. You have to dig to find
 https://www.mediawiki.org/wiki/Help:CirrusSearch .


I recently added https://www.mediawiki.org/wiki/API:Search_and_discovery
which clarifies the connection with Help:CirrusSearch, and mentions other
kinds of searching like geosearch.


 I would much prefer
 that the relavent documentation was including in the normal api.php
 auto-generated help.


https://gerrit.wikimedia.org/r/216899 changes the
'apihelp-query+search-param-search' message in
https://www.mediawiki.org/wiki/Special:ApiHelp/query+search to:
*srsearch*

Search for page titles and page content that match this value. You can use
the search string to invoke special wiki search features, depending on what
its search backend implements.
But API query search can only use CirrusSearch features if it's installed.
I think Extension:CirrusSearch could handle the 'APIGetAllowedParams' hook
to modify this help text. If I understand correctly, it might be easier
to interpose WMF-specific help text that links to mw:Help:CirrusSearch in a
'wikimedia-apihelp-query+search-param-search' key in
extensions/WikimediaMessages/i18n/wikimediaoverrides/en.json ; I tried it
locally and it didn't work.



 Even better would be if that api allowed users to
 specify the options using normal url parameters, (as a separate
 options from using operators in the search string). Its also not
 entirely the most clear from the api that the search options differ
 depending on which extensions you have installed.


What do you mean? Beyond special terms in srsearch I'm not aware of any
changes to query+search's sr parameters depending on extensions.


 Additionally, from the help page, its not entirely clear about some of
 the limitations. e.g. You can't do incategory:Foo OR intitle:bar.
 regexes on intitle don't seem to work over the whole title, only word
 level tokens (I think, maybe? I'm a bit unclear on how the regex
 operator works).


Yes it's not a full reference.

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Open Help Conference Sprints

2015-06-02 Thread S Page
Quim Gil wrote

 Hi, anybody interested in participating at the

 Open Help Conference  Sprints
 September 26-30
 Cincinnati, Ohio, USA
 http://conf.openhelp.cc/


 Lately we are focusing on users of our APIs, datasets, tools,
 infrastructure... Would it make sense to organize something at that event?


I'm moderately interested in going to see how other projects deal with
versioning documentation, auto-generating documentation, incorporating
sample code in documentation and vice-versa, etc.

If anyone working on mediawiki.org or wikitech documentation (documentation
of MediaWiki , APIs, developing, labs, tool labs, etc.)  plans to attend
that doubles my interest level, we could productively sprint on any of
those areas.

A Wikimedia expedition attended a couple of years ago with a Wikipedia 
 user help focus, and they were happy about the event.

 http://blog.wikimedia.org/2013/07/03/wikipedians-open-help-conference/


From the blog post I just learned the 2013 visit produced
https://www.mediawiki.org/wiki/Starter_kit , an alternative to "How to
contribute".

On Wed, May 27, 2015 at 7:48 PM, hardik juneja hardikjuneja...@gmail.com
wrote:

 This event seems to be a really nice opportunity to showcase the
 recently built REST API (rest.wikimedia.org).
 I was thinking something like an introductory session on the REST API, but
 it depends on the kind of audience we are targeting, because the API docs
 are self-explanatory and are well received so far.


As I understand it, this is a hands-on conference about  writing better
help documentation. It's great that
https://en.wikipedia.org/api/rest_v1/page/html/Sheep
returns the fast cached HTML of the wiki page, and RESTBase's
Swagger-generated doc is nifty, but how do you see that relating to the
conference?

Cheers,
--
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] API returns empty strings for boolean fields?

2015-05-29 Thread S Page
On Thu, May 28, 2015 at 2:58 PM, Brian Gerstle bgers...@wikimedia.org
wrote:


 mainpage field is omitted when querying Barack Obama:
 https://en.wikipedia.org/wiki/Special:ApiSandbox#action=mobileview&format=json&page=Barack_Obama&prop=pageprops&pageprops=mainpage

  {
      "mobileview": {
          "pageprops": [],
          "sections": []
      }
  }


API results often omit return parameters that have their default value in
order to save space.
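
So under the old convention a client checks presence, not value; a minimal
PHP sketch (key names taken from the example below):

  // formatversion=1: true is a present empty string, false is absent
  $isMainPage = isset( $data['mobileview']['mainpage'] );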

Legoktm wrote:

If you pass formatversion=2 in the request, you'll get proper booleans (as
 well as other improvements).


Well, after API modules are changed to return booleans (which are turned
into empty strings for backwards compatibility if you don't specify
formatversion=2). Legoktm fixed ApiMobileView.php with
https://gerrit.wikimedia.org/r/#/c/214636/ last night:
$resultObj->addValue( null, $moduleName,
    // was: array( 'mainpage' => '' )
    array( 'mainpage' => true )
);
so specifying formatversion=2 now gets you the boolean behavior on beta
labs, and soon on enwiki.
http://en.wikipedia.beta.wmflabs.org/w/api.php?action=mobileview&format=jsonfm&page=Main_Page&pageprops=mainpage&formatversion=2
works, returning:

{
    "mobileview": {
        "normalizedtitle": "Main Page",
        "mainpage": true,
        "sections": [...

If you're writing an extension for the WMF cluster or 1.26 only, you should
always use formatversion=2 (with the proviso that some API modules may
change). It gives you cleaner return values and defaults to utf8 instead
of ASCII with unicode escape sequences. More at
https://www.mediawiki.org/wiki/API:JSON_version_2#Using_the_new_JSON_results_format

Cheers,
-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] Simplifying the WMF deployment cadence

2015-05-27 Thread S Page
Benito, Grossmeier! He made the trains run on time [1]

 Tuesday: New branch cut, deployed to test wikis
and mediawiki.org as before, I assume.

[1] Or not, http://www.transportmyths.co.uk/mussolini.htm

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] FYI: HTMLForm is now available in OOUI format

2015-05-26 Thread S Page
On Mon, May 25, 2015 at 4:55 AM, James Forrester jforres...@wikimedia.org
wrote:

 To convert your code ...
- $form = HTMLForm::factory( 'ooui', … );
 … and everything Should Just Work™.


I mentioned this on the HTMLForm wiki page [1].

I tried converting the Example code from the HTMLForm wiki page, and the
result wasn't perfect [2]. Besides awaiting the OOjs UI implementation of
HTMLMultiSelectField and HTMLRadioField, presumably developers ought to
change their design, e.g. not use the 'section' border lines. I want to document
how to do the conversion [3].

Excellent stuff.

[1] https://www.mediawiki.org/wiki/HTMLForm
[2] https://phabricator.wikimedia.org/F169381
[3] https://phabricator.wikimedia.org/T100401

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] First impression with Differential

2015-05-22 Thread S Page
On Thu, May 21, 2015 at 12:05 PM, Stas Malyshev smalys...@wikimedia.org
wrote:


  "Test Plan is required."

  Sounds like a good practice to me. Worst case scenario, type "I didn't
  test this patch at all."

 I think it's not a good idea, as it trains people to do the opposite
 thing of what it's intended to train people for. I.e. mandatory, but
 put 'whatever' there IMHO is worse than non-mandatory but supported by
 common consensus. If we're setting up a new system, we shouldn't set it up
 so we constantly work around it.


I don't agree, or I don't understand: "Test Plan:" is fantastic and I've
wanted it in commit messages for years. Thinking about how to test is
essential to writing good code, and a reminder to express your thought
process is invaluable, whether it's "qunit tests now pass", or "it should
no longer break when you visit the special page", or "obvious fix".
Expecting reviewers to be Nostradamus and infer what the engineer might
have thought about testing, or whether the engineer tested at all, is dumb.
If someone writes "Test Plan: whatever", they'll get -1s until they respect
reviewers enough to write "Test Plan: trivial untested fix in an area that
lacks tests".

In the last week my instance crashed on two checkins that were +2'd but
never run; both useful improvements to MediaWiki-Vagrant roles that looked
completely legit. I appreciate people writing needed code and others
reviewing it promptly, but typing "Test Plan: I didn't test this patch at
all" might have saved me 20 minutes of wondering where I screwed up,
re-provisioning, hunting down log files, etc. (Both are fixed BTW.)

Can/will we add a "Test Plan:" line to the default commit message template
https://git-scm.com/book/en/v2/Customizing-Git-Git-Configuration#idp25045696
just to get developers in the mood and encourage a standard format?
Independent of Differential and separately from whether we make it
mandatory. It seems developers have to do
  git config --global commit.template ~/path/to/better/git_commit.message
, but we could put the file in a MediaWiki project like core.
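
Something as minimal as this sketch would do (git strips the # lines when
you commit; the file name is invented):

  # git_commit.message
  Test Plan:
  Bug: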

Regards,
--
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Community project ideas

2015-05-19 Thread S Page
On Tue, May 19, 2015 at 6:52 AM, MZMcBride z...@mzmcbride.com wrote:

 What is Community Tech? How does it differ from the work the rest of the
 engineering and product team is doing?


Yes, everything WMF does has a community aspect (except facilities?), so
it's tricky to know when to highlight it. And "community" is an open-ended
pluralistic term, like "User" or "Open". It's reasonable to want teams to
be more specific; give them a little more time as we work through the reorg.

I'm forming an Open Community Core Engagement team, dedicated to
experimenting on wiki users. -- joke, I kid

Peace,
-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [WikimediaMobile] Tracking Antifeature?

2015-05-18 Thread S Page
On Thu, May 14, 2015 at 6:13 PM, Dmitry Brant dbr...@wikimedia.org wrote:

 The mobile apps (Android and iOS) do indeed collect anonymous usage
 statistics [0].


That FAQ says "The app automatically collects certain sanitized user
statistics, but this can be disabled by going to the main app menu (top
left), selecting the More option, and unchecking the Send usage reports
checkbox."

Presumably the Wikipedia Android app license [5] lets F-droid compile a
version of the Android app with the tracking code disabled or removed.
IANAL so can't tell from the license whether the resulting work has to be
renamed à la Debian Iceweasel [6].

Can someone with an F-droid wiki account or who has a contact in that org
update the Wikipedia app page [7]? The Wikipedia Android app license [5] is
not GPLv2 but Apache 2.0, and the Site and Issues links are redirects.

[5] https://github.com/wikimedia/apps-android-wikipedia/blob/master/COPYING
[6]
https://en.wikipedia.org/wiki/Mozilla_Corporation_software_rebranded_by_the_Debian_project
[7] https://f-droid.org/wiki/page/org.wikipedia

For more information on what the usage reports contain, as well as other
 topics such as app permissions, please refer to our FAQ [1].

 [0]:

 https://github.com/wikimedia/apps-android-wikipedia/blob/b316b11ba9f70d3adc8fc80825f931590f4e476a/wikipedia/src/main/java/org/wikipedia/analytics/EventLoggingEvent.java#L69-L73
 [1]:
 https://www.mediawiki.org/wiki/Wikimedia_Apps/FAQ#Offline_reading_and_data


 On Thu, May 14, 2015 at 8:50 PM, Bryan Davis bd...@wikimedia.org wrote:

  On Thu, May 14, 2015 at 5:40 PM, Ricordisamoa
  ricordisa...@openmailbox.org wrote:
   FYI
   https://f-droid.org/wiki/index.php?title=org.wikipediadiff=54674
   https://f-droid.org/wiki/page/Antifeature:Tracking
 
  Maybe for the ACRA [0] integration that emails crash reports to OTRS
  [1]? I think it asks if you want to send the report though.
 
  [0]: http://www.acra.ch/
  [1]:
 
 https://github.com/wikimedia/apps-android-wikipedia/blob/b316b11ba9f70d3adc8fc80825f931590f4e476a/wikipedia/src/main/java/org/wikipedia/WikipediaApp.java#L50-L56
 
  Bryan
  --
  Bryan Davis  Wikimedia Foundationbd...@wikimedia.org
  [[m:User:BDavis_(WMF)]]  Sr Software EngineerBoise, ID USA
  irc: bd808v:415.839.6885 x6855
 
  ___
  Mobile-l mailing list
  mobil...@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/mobile-l
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] API result formatting changes (being announced a little late), for developers

2015-05-04 Thread S Page
On Sat, May 2, 2015 at 10:39 PM, MZMcBride z...@mzmcbride.com wrote:

 My reading of https://phabricator.wikimedia.org/T76728 and
 https://gerrit.wikimedia.org/r/182858 is that this change is likely
 already live on Wikimedia wikis. It also looks like this change will be
 part of MediaWiki 1.25 and later.


Are you sure?  https://gerrit.wikimedia.org/r/182858 was merged April 15.

formatversion is in 1.25, but RELEASE-NOTES-1.25 says it's experimental:
* Many modules have changed result data formats. While this shouldn't affect
  clients not using the experimental formatversion=2015, code using
  ApiResult::getResultData() and not using ApiResult::transformForBC() may
  need updating.

Brad, is it OK to use formatversion=2 in extensions running on 1.25?

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] API result formatting changes (being announced a little late), for developers

2015-05-01 Thread S Page
On Fri, May 1, 2015 at 7:53 AM, Brad Jorsch (Anomie) bjor...@wikimedia.org
wrote:

 The general theme is that the ApiResult arrays now have more metadata,
 which is used to apply a backwards-compatible transformation for clients
 that need it

How does ApiResult know if a client needs this transformation? Do you mean
clients requesting in formats other than json?

Is there anything API client developers should check in their code? It
seems all the potential update work falls on API module implementors.


 Relevant changes include:
 ...
- Actual booleans should now be added to ApiResult, and will be
automatically converted to the old convention (empty-string for true and
absent for false) when needed for backwards compatibility. Code that was
violating the old convention will need to use the new
ApiResult::META_BC_BOOLS metadata property to prevent this conversion.
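
If I follow, module code would then look something like this sketch:

  // Real boolean in; BC output turns it into ''/absent automatically:
  $result->addValue( null, $moduleName, array( 'mainpage' => true ) );
  // To keep a key boolean even in BC output, tag it in the same array:
  // array( 'mainpage' => true, ApiResult::META_BC_BOOLS => array( 'mainpage' ) )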


Is this the cause of the EventLogging outage [1], [2]?
It seems EventLogging was just naively returning a JSON schema as its
results; could other API modules that return JSON structures containing
boolean values also break clients?



 If developers need assistance with API issues or code review for API
 modules, please do reach out to me.


Let me know off-list how I can help update the docs. To start, I think this
change description should go on the wiki.

[1]
https://wikitech.wikimedia.org/wiki/Incident_documentation/20150428-EventLogging
[2] https://gerrit.wikimedia.org/r/#/c/207297/

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] GSoC and Outreachy results out!

2015-04-28 Thread S Page
Congratulations and welcome folks!

After setting off on the yellow brick road of MediaWiki development [1] you
may find yourself in a dark forest of 12-year-old inaccurate wiki pages.
You have a great opportunity to point out each twig you stumbled over as
you go, and thereby help future developers who don't have a mentor
strolling along.

   - If something in the documentation is obviously wrong, Be Bold and fix
   it,
   - If something doesn't seem right, start a topic on its talk page and
   e-mail me.
   - If there's a missing piece, file a Phabricator ticket tagged
   #documentation
   
https://phabricator.wikimedia.org/maniphest/task/create/?projects=documentation
   .

Thanks in advance, have fun, see you in Oz^w IRC. I'm spagewmf.

It's also extremely impressive to see volunteers from the community step up
 to serve as mentors

+1 !

[1] https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker


On Mon, Apr 27, 2015 at 1:00 PM, Niharika Kohli nko...@wikimedia.org
wrote:

 Hello everyone!

 After a long and intense selection process, Google and GNOME Foundation
 have finally announced the list of GSoC and Outreachy selects for the
 current round. I'm pleased to announce we have 9 interns for GSoC and 1 for
 Outreachy!

 Congratulations to everyone selected and to the mentors who have put in
 much hard work in selecting worthy candidates. Also thanks to those who put
 in hard work but could not be selected. We had limited projects and had to
 make some hard decisions. We hope you'll keep contributing to Mediawiki and
 try again for the next GSoC/Outreachy round.

 We have a general guide to being a successful intern at

 https://www.mediawiki.org/wiki/Outreach_programs/Life_of_a_successful_project
 It's
 still under development and we'll be adding more details to it as we move
 forward in the program. The main pages for the respective programs are at
 https://www.mediawiki.org/wiki/Google_Summer_of_Code_2015 and
 https://www.mediawiki.org/wiki/Outreachy/Round_10

 The students selected are:

 Outreachy:

 1. Dibya Singh (Phenix303) from India
 Project: One-stop Translation Search
 Mentors: Nemo_bis and Nikerabbit

 Google Summer of Code:

 2. Frédéric Bolduc (ferdbold) from Quebec
 Project: GraphData extension for VE
 Mentors: Mooeypoo and Mvolz

 3. Ankita Kumari (ankita-ks) from India
 Project: Unified language proofing tools integration framework
 Mentors: Aharoni and Eranroz

 4. Jiarong Wei (VcamX) from Hangzhou, Zhejiang
 Project: Implement OAuth support for Pywikibot
 Mentors: Jayvdb and Halfak

 5. Tina Johnson (tinajohnson) from India
 Project: Newsletter Mediawiki Extension
 Mentors: Quim Gil and Tony Thomas

 6. Vivek Ghaisas (polybuildr) from India
 Project: Extension to identify and delete spam pages
 Mentors: Yaron Koren and Jan

 7. Sumit Asthana (codezee) from India
 Project: Wikivoyage Pagebanner Extension
 Mentors: Nicolas Raoul and Jdlrobson

 8. Alexander Jones (happy5214) from United States
 Project: Implement Flow support in Pywikibot
 Mentors: Jayvdb and Mattflaschen

 9. Jan Lebert (sitic) from Germany
 Project: An enhanced cross-wiki watchlist
 Mentors: Yuvipanda, Legoktm

 10. Sarvesh Gupta (s1991) from India
 Project: Allow contributors to update their own details in tech metrics
 Mentors: Alvaro, Daniel

 Let's all welcome them into the Wikimedia family!

 Thanks,
 Niharika.
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Designing a MediaWiki Skin

2015-04-27 Thread S Page
On Thu, Apr 23, 2015 at 1:44 PM, Jack Phoenix j...@countervandalism.net
wrote:


 A lot of the on-wiki
 documentation at MediaWiki.org is outdated, lacking or likely both.

...
 You'll need:
 *good explanation follows*


That's pretty much what https://www.mediawiki.org/wiki/Manual:Skinning
says. I made a few improvements. If you see anything outdated or lacking
there, Be Bold. Would it help if we developed a BoilerPlate skin like the
BoilerPlate extension?

On Fri, Apr 24, 2015 at 11:00 AM, Bartosz Dziewoński matma@gmail.com
wrote:

For example, to get rid of everything related to MediaWiki UI (replace
 'yourskinname' with the name of your skin):

 $wgResourceModuleSkinStyles['yourskinname'] = array(
     'mediawiki.ui' => array(),
     'mediawiki.ui.checkbox' => array(),
     'mediawiki.ui.radio' => array(),
     'mediawiki.ui.anchor' => array(),
     'mediawiki.ui.button' => array(),
     'mediawiki.ui.input' => array(),
     'mediawiki.ui.icon' => array(),
     'mediawiki.ui.text' => array(),
 );


I added this to
https://www.mediawiki.org/wiki/Manual:$wgResourceModuleSkinStyles and
mention this global in Manual:Skinning.

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] odd namespace number

2015-04-02 Thread S Page
On Thu, Apr 2, 2015 at 3:01 AM, Amir E. Aharoni 
amir.ahar...@mail.huji.ac.il wrote:

 Flow has a new namespace: Topic. It has an even namespace number by
 default: 2600.

 Talk pages till now have had odd numbers, and some scripts relied on that.


And that is still the case. For example,
https://www.mediawiki.org/wiki/Talk:Search is a Flow-enabled talk page, and
its wgNamespaceNumber is 1.

So it's still a reasonable assumption that the talk page for a page is in
the odd namespace N+1.
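
(In code terms, the parity convention scripts rely on is roughly this
sketch:

  $talk = MWNamespace::getTalk( $ns );       // odd for ordinary pages
  $subject = MWNamespace::getSubject( $ns ); // the even partner

and NS_TOPIC sits outside that pairing entirely.)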
Now the topics that appear on every Flow board page are in this special
Topic: namespace, and that is not odd but 2600. But this is not the talk
namespace for the regular pages in namespace 2599; it's a specialized
namespace like Graph or Workflow. A Flow topic like
https://www.mediawiki.org/wiki/Topic:Savgw0ltqlin3m69 is not the talk page
of anything.

I don't think there's a technical difficulty in making NS_TOPIC odd, but
it's implying things that aren't so and I'm not sure what the win is.



 Also, a program that wants to handle
 Flow posts will probably need APIs and queries that are different from
 those that are needed for handling old-style wiki pages in any case.


Yes, some documentation at [2] and [3]. There's been discussion whether
to make existing naive operations like "add a new section to this talk
page" or "add a comment or category to this talk page" Do The Right Thing
on a talk page that's actually a Flow board. See [1] and its phab tasks;
the consensus is tools need to understand they're dealing with a Flow board
to make the right choices. Anyway, neither of those actions makes sense on
an individual Flow topic.



 But still, I'll ask to be on the safe side: Does anybody know how prevalent
 that is? Is anybody else familiar with scripts, gadgets or extensions that
 rely on this?


I don't know. It depends how tools discover a page in the Topic namespace
and decide to operate on it.


 (No, this is not an April fools joke. Really, it isn't. Yes, I admit that
 it sounds a lot like xkcd 1172. And like WP:BEANS. But no, it's not an
 April fools joke.)


We have some time to figure this out, all talk pages are being converted to
Flow boards on May 8th.  (Am I too late? :-) )

[1]
https://www.mediawiki.org/wiki/Flow/Architecture/API#Should_existing_API_calls_that_interact_with_a_page_.22just_work.22_on_a_Flow_board.3F
[2] https://www.mediawiki.org/wiki/Flow/Architecture/API#Use_cases
[3] https://www.mediawiki.org/wiki/Extension:Flow/API

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Org changes in WMF's Platform Engineering group

2015-03-24 Thread S Page
A reorg that makes sense without TLA buzzword decoding \o/
(Are you sure you didn't leave out the MediaWiki Platform Community API
Core Features Development team  :) )

Best wishes and high hopes for y'all.
-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki-Vagrant now has support for install in Linux Containers (LXC)

2015-03-11 Thread S Page
On Mon, Mar 9, 2015 at 3:11 PM, Brion Vibber bvib...@wikimedia.org wrote:

 vagrant with the lxc provider inside Ubuntu 14.10 in a Parallels VM ...has
 the plus that you can use a stock VM if you're going to run Linux anyway.


But MediaWiki-Vagrant uses a stock VM. Its Vagrantfile loads a
trusty-server-cloudimg-amd64-vagrant-disk1.box which contains a vmdk that
as far as I know is much like other Ubuntu VMs.


 This is a plus for me as I use VMs for a lot of testing and
 prefer Parallels for its much better/faster graphics support.


Did you try Parallels instead of VirtualBox? The Parallels provider for
Vagrant is a plugin officially supported by Parallels. The plugin allows
Vagrant to power Parallels Desktop for Mac based virtual machines
http://parallels.github.io/vagrant-parallels/

LXC on a Linux host should be faster than a Vagrant VM, but it's hard to see
how running Vagrant as an LXC in a VM on a Mac can be faster than running
Vagrant as its own VM. 8-)

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Announcing service-runner, a startup module / supervisor for node services

2015-02-23 Thread S Page
On Mon, Feb 23, 2015 at 1:48 PM, Gabriel Wicke gwi...@wikimedia.org wrote:

 Service-runner [1] is a small module that we moved out of restbase. It
 generalizes some simple start-up, monitoring and supervision facilities
 ...


I'm surprised there isn't something like this already in nodejs that you
get for free when you use forever[6] to run a node command. Did you
consider forever-service [7] ?  It sounds similar:


1. Make an universal service installer across various Linux distros
and other OS.
2. Automatically configure other useful things such as Logrotation
scripts, port monitoring scripts etc.
3. Graceful shutdown of services as default behaviour.

 It's _great_ that https://github.com/wikimedia/service-runner#see-also
mentions similar packages, I made a pull request to add forever-service to
the README though I didn't compare its features.

For small (third party) installs with limited memory, we also added the
 capability to cleanly run multiple services in a single process.


Yay. Should MediaWiki-Vagrant use this, or does it only benefit when you're
running more -oids than just Parsoid?

I'm so excitoid

[6] https://github.com/foreverjs/forever
[7] https://github.com/zapty/forever-service
--
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Video Uploads to Commons

2015-02-06 Thread S Page
On Wed, Feb 4, 2015 at 1:28 AM, Nkansah Rexford seanmav...@gmail.com
wrote:

 Its very interesting for me to know similar tools exist in the system. I
 wish a tool like https://tools.wmflabs.org/videoconvert/ was listed on the
 ways to upload videos on commons page. I would have used it right away.


I added a section Online conversion tools to
https://commons.wikimedia.org/wiki/Help:Converting_video

--
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Video Uploads to Commons

2015-02-06 Thread S Page
On Fri, Feb 6, 2015 at 12:07 PM, Nkansah Rexford nkansahrexf...@gmail.com
wrote:

 Is it possible to run/host a python based application on wmflabs?


Sure, there are a bunch there.
https://wikitech.wikimedia.org/wiki/Help:Tool_Labs#Quick_start
(get a labs account, create a new tools account or maybe join the existing
videoconvert tool account).

-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Converting debug logging to PSR-3

2015-01-30 Thread S Page
On Fri, Jan 30, 2015 at 12:00 PM, Gergo Tisza gti...@wikimedia.org wrote:

  On Thu, Jan 29, 2015 at 11:45 PM, Bryan Davis bd...@wikimedia.org
 wrote:
  The wfDebug* methods are not being deprecated officially yet but it
  would be great if people started treating them like they were
  deprecated when writing new classes.

 So why not actually deprecate them?


As Bryan said I think we should, for new classes in core. They're unlikely
to need backporting.
https://www.mediawiki.org/wiki/Manual:How_to_debug#Logging seems to be the
place to deprecate, plus the various $wg pages.
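
For new code the PSR-3 route is, as I understand it, roughly this sketch
('myextension' is a made-up channel name):

  use MediaWiki\Logger\LoggerFactory;

  $logger = LoggerFactory::getInstance( 'myextension' );
  $logger->debug( 'Loaded {count} widgets', array( 'count' => $count ) );

instead of wfDebug().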

For extensions, our compatibility guidelines [1] say both
* "trunk extensions only have to support current trunk MediaWiki"
* and "don't break support for old MediaWiki versions unless the
compatibility code is causing actual quantifiable problems."
So an extension like Flow which depends on fixes to core in 1.25 might as
well switch. Whatever extension developers decide, the Extension: page must
express the reality of the MW version(s) it requires to work.

For sample code, it's tricky. Should mw.org have code that shows developers
how to write extensions that run on latest git or on Download MediaWiki
1.24.1, or both?  (And even for the legacy LTS releases 1.23.8 and
1.19.23.)

[1]
https://www.mediawiki.org/wiki/Backward_compatibility#Trunk_extensions.27_compatibility_with_old_MediaWiki_versions
[2] https://www.mediawiki.org/wiki/Extension_registration
-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Brion's role change within WMF

2015-01-20 Thread S Page
On Tue, Jan 20, 2015 at 9:22 AM, Tei oscar.vi...@gmail.com wrote:

 a open door to things that can freak us again.

^ world's greatest job title for Brion's business card

Robla wrote

 Brion,let me know how I can help!

Ditto.

Great news indeed.
-- 
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread S Page

 One of the bigger questions I have about the potential shift to requiring
 services is the fate of shared hosting deployments of MediaWiki.


Seems like an opportunity: "Deploy your MediaWiki-Vagrant instance to our
cloud infrastructure cheap." It's not $2/month, but Digital Ocean can host
a 1GB VM for $10/month. Maybe https://atlas.hashicorp.com/ will host
vagrant boxes cheaper.

Docker allows many Linux containers on the same physical system which
should lead to lower pricing. Some MediaWiki dockerfiles install Parsoid
and nodejs as well as MediaWiki and PHP, e.g. CannyComputing's
https://github.com/CannyComputing/frontend-mediawiki , or you'd have a
separate container for each service. I can't figure out the pricing.

--
=S Page  WMF Tech writer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bugzilla-Phabricator migration (almost) completed

2014-11-24 Thread S Page
phantastic! I suggest phuket https://en.wikipedia.org/wiki/Phuket_Province
for the devtools team party/recovery/planning offsite.

So far the Phabricator T task number of every BZ URL in my browser
autocompletion is its bug number + 2000. Is this a happy coincidence?

On Sun, Nov 23, 2014 at 11:41 PM, Quim Gil q...@wikimedia.org wrote:

 It has been really exciting to deploy a Phabricator instance
 with 75k tasks, the biggest Maniphest container we are aware of.


That explains the new Mercedes purchased by our sales rep at Facebook :)

And well, this is only the beginning. Right now we will focus in the most
 urgent post-Bugzilla-migration tasks, while starting to prepare the RT
 migration. Your tasks, comments, and tokens are welcome. Check the current
 backlog at
 https://phabricator.wikimedia.org/maniphest/query/y2CdmZwKv3oZ/#R


I'll be converting some of Flow's tasks in Trello to Phabricator, using
string and glue or whatever. I invite anyone else attempting project
migration to https://phabricator.wikimedia.org/T36 and the team practices
mailing list https://lists.wikimedia.org/mailman/listinfo/teampractices

--
=S Page  Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [EE] Is simplicity is the key to success of Echo and Watchlist?

2014-09-08 Thread S Page
On Fri, Sep 5, 2014 at 3:33 AM, Michał Łazowik mlazo...@me.com wrote:

 Wiadomość napisana przez Danny Horn dh...@wikimedia.org w dniu 5 wrz
 2014, o godz. 02:00:

  Rather than going directly to the topic page, the bundled new topic
  notifications will take you to the board, sorted by newly-created topics.
  The plus, obviously, is that you get fewer notifications; the minus is
 that
  you may need to scroll down on the board to make sure you don't miss
  something.

 Maybe mark unread messages somehow? Some vertical line on the right/left
 maybe?


You mean like the way the "UserName and three others responded"
notification of new posts on a topic highlights the new posts? (try it e.g.
https://www.mediawiki.org/wiki/Topic:S12prb6trf24nhkz?fromnotif=1#flow-post-s17n6dd2f6sfe7he
)

The "3 new topics on Talk:Sandbox" notification could do something similar,
showing a vertical line next to the title bars of new posts.

We could add a "Highlight all posts newer than this" item to each post's
action menu, but it feels a bit esoteric. User JS or a gadget could easily do it.

Regards,

--
=S Page  Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Template RFC

2014-08-22 Thread S Page
On Fri, Aug 22, 2014 at 11:34 AM, Jon Robson jdlrob...@gmail.com wrote:


 1) We need some JavaScript API for using templates (in current form
 all we have is a standard way to ship templates from the server to the
 frontend).

 Well, Mantle does have a basic template.add(), get(), and render() API. It
seems OK.

Before we know the API is right I would like to see proof-of-concept
template compilation. Handlebars templates can/could be compiled in advance
at commit-time, during a build process, on the server at request time, or
on the client. Developers should not have to change code for this.  (
https://bugzilla.wikimedia.org/show_bug.cgi?id=64735 for handlebars.)


 Trevor is working on a template widget for oojs which will
 make this possible

Great, though I don't understand what this is. Is this a specific widget
that uses a specific template, or a platform widget that handles
templating for other OOUI widgets?


 2) We need a standard template language
 3) Better namespacing for templates - we identified that we will need
 to find better ways of uniquely identifying templates [3].

 We questioned whether point 2 should be blocking, as we recognised
 that even if we decide to standardise on Knockoff in future, it should
 be trivial to write a script that converts Hogan/Handlebars template
 language to Knockoff. The Mantle ResourceLoader module in current form
 is template agnostic and currently works with both Hogan and
 Handlebars, so in theory should make it easy to transition from one
 template language to another.

 On this basis, I think the next step would be for the oojs development
 team to get a template widget up that can be used by Mantle.


Great, again I'm confused :)

FWIW the pressing need of several teams is a dialog component with the
Agora appearance to replace homebrew and jQuery UI dialogs. As I
understand it, besides VE of course, Media Viewer has an OOUI dialog in its
"Use this file" pop-up.

-- 
=S Page  Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bikeshedding a good name for the api.php API

2014-08-08 Thread S Page
tl;dr: It already is the MediaWiki web API.  A codename as well is fine.

Searching mediawiki.org, web API is almost as popular as MediaWiki API,
and some of the uses are correctly for extension APIs and Parsoid's API.
So the MediaWiki web API is a descriptive name that requires minimal
changes.

That doesn't preclude us giving it a cool codename:
   "The MediaWiki web API, codename Unicorns Are Go, ..."
   "Bots use Jimbo, the MediaWiki web API, to..."
Another possibility is Starbright, an homage to Messrs. Starling and Brion,
and 80s Madonna. Whatever we use should have awesome ASCII art in
http://en.wikipedia.org/w/api.php

Some inline responses below.

Tim Starling wrote:

Note that Nemo bis changed the name from "MediaWiki API" to "WebAPI"
 on the basis of disambiguation, in this revision:

I don't like smash words, WebAPI already sounds dated, and Mozilla uses it
https://developer.mozilla.org/en-US/docs/WebAPI for advanced browser
features so developers might think it describes ResourceLoader modules like
mediaWiki.Api, mediaWiki.loader, etc.

https://www.mediawiki.org/w/index.php?title=API:Main_page&diff=644948&oldid=642646

 I have previously suggested web API as a term to distinguish it from
 the set of PHP classes and hooks used by extensions. API stands for
 application programmer interface, and traditionally refers to
 functions and classes -- using the term for a non-RPC HTTP interface
 is really rather awkward.

 Neither MediaWiki API nor Web API distinguishes it from the
 proposed REST API.

Using MediaWiki REST API for the latter works.  Or cool codenames for
both.


On Thu, Aug 7, 2014 at 3:47 AM, Erik Moeller e...@wikimedia.org wrote:

 Core seems a reasonable qualifier, though, no? Seems like the
 content API and a lot of other proposed interfaces are by definition
 outside the core. So why not MW core API or just core API for short?


But extensions also provide this API. api.php?action=echomarkread&all=true
http://en.wikipedia.beta.wmflabs.org/w/api.php?action=echomarkread&all=true
isn't part of core but is the same MediaWiki web API.  Also see my
signature below :)
-- 
=S Page  Core features team (not MediaWiki core or platform) engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Roadmap and deployment highlights - week of July 21st

2014-07-22 Thread S Page
On Sat, Jul 19, 2014 at 4:11 PM, Minh Nguyen m...@1ec5.org wrote:

 On 2014-07-18 13:05, Greg Grossmeier wrote:


 ** https://www.mediawiki.org/wiki/MediaWiki_1.24/wmf14
 ** Topic: namespace added (for Flow)


 Is it expected that the Topic: namespace will be localized into languages
 other than English?

Yes.


 [1] I ask because the Vietnamese translation would be chủ đề and the
 Vietnamese Wikipedia already has a Chủ đề: namespace (corresponding to
 Portal: at the English Wikipedia).


The localized name for the Topic namespace will have to be different from
the localized name for the Portal namespace. So it will be a bit
inconsistent with the use of Chủ đề for topics in the rest of Flow.
However the Topic: name is not highly visible. The actual topic page titles
are user-unfriendly UUIDs like Topic:Ry6o4jowf78f34gp,
https://www.mediawiki.org/wiki/Topic:Ry6o4jowf78f34gp so Flow tries to
display whatever users gave as the topic title in the page heading,
notifications, recent changes, watchlists, etc. rather than the page title.

Thanks for helping to localize Flow!

--
=S Page WMF Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTML templating progress; Knockout Components curly brace syntax

2014-07-09 Thread S Page
Both handlebars (JS) and lightncandy (handlebars reimplemented in PHP)
support pre-compilation.  Are the times in
https://www.mediawiki.org/wiki/Requests_for_comment/HTML_templating_library#Performance
for pre-compilation?

Flow's handlebars templates are pre-compiled into PHP for server-side
templating. We haven't yet tried pre-compiling them in JS. Ideally the API
in Mantle[1] to render a template wouldn't change.
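
As a sketch of that precompilation (from the lightncandy README, if I
recall it right; file names invented):

  // Build time: compile the Handlebars source into plain PHP once
  $phpStr = LightnCandy::compile( file_get_contents( 'topic.handlebars' ) );
  file_put_contents( 'compiled/topic.php', '<?php ' . $phpStr . '?>' );

  // Request time: include the compiled renderer and call it
  $renderer = include 'compiled/topic.php';
  echo $renderer( array( 'title' => 'Hello' ) );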

...  if compilation & caching happens dynamically in ResourceLoader


Would be nice. The Handlebars compiler (into JS) and the Knockoff/TAssembly
compiler (into JSON) are both Node.js programs.

[1] https://www.mediawiki.org/wiki/Extension:Mantle#Developer_features
--
=S Page  Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Front-End Standardization Chat

2014-06-25 Thread S Page
Developing a better way to generate interactive HTML with consistent UX is
important. I support this work, along with 
https://www.mediawiki.org/wiki/Requests_for_comment/HTML_templating_library.
But...

On Wed, Jun 25, 2014 at 11:53 AM, Erik Moeller e...@wikimedia.org wrote:

 This proposal will only address parts of UX standardization. It does not
 currently focus on look and feel itself, though it would lay the
 groundwork for more CSS standardization as well.


No it won't (!), at least not for a year or two. We already have the
under-resourced haphazard efforts to apply and refine a consistent UX
(Agora) to the HTML generated by random PHP, special pages, HTMLForm,
jQuery UI, various templating systems, and OOjs UI. This proposal adds
another, better, approach to generating that HTML, but until we drop the
other approaches it's increased the work (cue https://xkcd.com/927/  :) ).

2) The proposal on the table is to implement this new skin framework, port
 existing skins in MW core, and port it to mobile as a skin

Which is good but doesn't help other projects and their inconsistent UX.
The supposition is that because this new framework is in core it'll be a
tool available to all developers that will deliver UX consistency. I
propose a simple use case to prove that supposition:

* Extensions and core right now need a replacement for jquery.dialog() that
will render a modal dialog with Agora styling.

If this effort produces such a dialog with the requisite browser support (a
thorny issue as Erik says) and doesn't require rewriting the rest of the
extension's UX code, we'll know this is on the right track for more than
just skins and a mobile convergence. And there will be much rejoicing :)

(Maybe OOjs UI already provides that dialog, I think MultimediaViewer's
share dialog uses it.)

Regards,
-- 
=S Page  WMF Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] LiquidThreads - how do we kill it?

2014-06-17 Thread S Page
On Mon, Jun 9, 2014 at 11:33 AM, Brad Jorsch (Anomie) bjor...@wikimedia.org
 wrote:


 I personally find the topic history page[4] to be horrendous, both ugly
 and nearly unusable.

We're going to revise topic history.


 Yes, I'm probably atypical in that I like reading wikitext diffs for
 discussion pages.

You can view the diff between revisions of a post, but editing posts is
somewhat rare. Generating a diff of an entire topic with new posts,
updates, hide operations, etc. seems challenging.

-- 
=S Page  Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] How to create login / signup links cleanly?

2014-06-16 Thread S Page
If I want to show anonymous users a message including You can _login_ or
_create an account_  with returnto support, what's the best way to do it?


Background:

All over MediaWiki we display messages to anonymous users with _login_ or
_create an account_ links in them. In most cases we want the user to
easily get back to what they were doing, so Special:UserLogin supports
returnto and returntoquery parameters which after login or signup make it
display a link to go back to the page the user was on, or silently redirect
back to it.

The display of login/signup links seems haphazard:

1) Many messages don't bother with returnto, and just have wiki links to
the special page.  E.g. anontalkpagetext has:

please [[Special:UserLogin/signup|create an account]] or
[[Special:UserLogin|log in]] ...

2) Some messages try to handle returnto with complicated magic word
processing, e.g. https://en.wikipedia.org/wiki/MediaWiki:Anoneditwarning
has:

If you '''[{{fullurl:Special:UserLogin|returnto={{FULLPAGENAMEE}}}} log
in]''' or
'''[{{fullurl:Special:UserLogin/signup|campaign=anoneditwarning&returnto={{FULLPAGENAMEE}}}}
create an account]''' ...

This is fiddly, the magic words don't work in JavaScript, and the
resulting HTML requires you put class="plainlinks" on the enclosing tag to
disable the display of the external link icon. On the plus side it's
generated entirely in the message string, and wiki admins can tweak
parameters, like the signup campaign parameter in this example.

3) Often code builds the login/signup links in PHP and passes them to the
message.

3a) OutputPage->showPermissionsPage() builds a login URL with
Linker::linkKnown. It doesn't provide a sign up link, and the code to build
the login link can't be reused. Which may be why
WatchAction->checkCanExecute() and SpecialPage::requireLogin copy and paste
the code.

3b) SkinTemplate.php generates the Login / Create account URLs at the top
right of most skins using Skin::makeSpecialUrl. But you can't pass this to
a message because it may not start with //myserver.com and if it doesn't
then it won't work in a [$1 link text] link.
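
What I'd like is for every caller to be able to do something like this
sketch (the message key 'whatever-notice' is invented):

  $loginLink = Linker::linkKnown(
      SpecialPage::getTitleFor( 'Userlogin' ),
      $this->msg( 'loginreqlink' )->escaped(),
      array(),
      array( 'returnto' => $this->getTitle()->getPrefixedDBkey() )
  );
  $out->addHTML( $this->msg( 'whatever-notice' )->rawParams( $loginLink )->parse() );

without each caller copy-pasting its own version.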

Thanks for any advice,

-- 
=S Page  Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Requests with data URIs appended to proper URLs

2014-06-06 Thread S Page
On Thu, Jun 5, 2014 at 3:44 PM, Matthew Flaschen mflasc...@wikimedia.org
wrote:


 I did a quick check, and Jigsaw (the W3C's validator) does complain about
 our data URLs on that page:

 http://jigsaw.w3.org/css-validator/validator?uri=https%
 3A%2F%2Fbits.wikimedia.org%2Fes.wikipedia.org%2Fload.php%
 3Fdebug%3Dfalse%26lang%3Des%26modules%3Dext.gadget.a-commons-directo%
 252Cimagenesinfobox%252CrefToolbar%7Cext.uls.nojs%7Cext.visualEditor.
 viewPageTarget.noscript%7Cext.wikihiero%7Cmediawiki.legacy.
 commonPrint%252Cshared%7Cmediawiki.skinning.interface%7Cmediawiki.ui.
 button%7Cskins.vector.styles%26only%3Dstyles%26skin%
 3Dvector%26*profile=css3usermedium=allwarning=1vextwarning=lang=en

 -
  .filehistory a img, #file img:hover

 Value Error : background url(data:image/png;base64,iVBO[blahblah]) is an
 incorrect URL url(data:image/png;base64,iVBO[blahblah]) repeat


The Jigsaw CSS validator complains about any data URL inside url() unless
it's in quotes. The snippet

.filehistory a img, #file img:hover {
  background: white url('data:image/png;base64,iVBORw0KGgoNSUhEUgAAABAQCAA6mKC9GElEQVQYV2N4DwX/oYBhgARgDJjEAAkAAEC99wFuu0VFAElFTkSuQmCC') repeat;
  background: white url(//bits.wikimedia.org/static-1.24wmf7/skins/common/images/Checker-16x16.png?2014-05-29T15:05:00Z) repeat!ie;
}

passes with the added quotes.

Stackoverflow [1] thinks this is a bug in Jigsaw, but regardless, why would
the CSS generate bogus requests in a cross-section of browsers?
"Some less forgiving browsers" doesn't normally include 60% Firefox 29 and
31% Chrome 35.

If it's only es and pt, I wonder if it's something else in the
bits.wikimedia response that makes the browser try to interpret the charset
in the data URI as other than charset=US-ASCII . I don't know of a charset
that would not interpret these ASCII characters as ASCII.

Besides the other theories advanced, might it be sporadic ResourceLoader
mis-minification?

What's the HTTP Referer for these requests? Can we tell if it's coming from
an external link rel=stylesheet to the CSS, or from a mw.loader.implement()
piece of JavaScript inserting the CSS?

[1] 
http://stackoverflow.com/questions/15481088/are-unquoted-data-uris-valid-in-css


-- 
=S Page  Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] LiquidThreads - how do we kill it?

2014-06-06 Thread S Page
On Thu, Jun 5, 2014 at 4:24 PM, Juergen Fenn schneeschme...@googlemail.com
wrote:

 You might like to
 know, though, that on German Wikipedia most discussions about Flow
 seem to focus on how to turn it off or how to keep it out of the
 project altogether. Switching to Flow would require a community
 consensus anyway. So could you please consider a global switch for
 communities that would rather like to disable these new features
 completely.


Technically, the Flow extension first has to be installed on a wiki, and
then Flow is enabled on particular talk pages or classes of talk pages on
that wiki (currently 18 pages on production wikis). After a mis-step on
meta-wiki, we seek consensus on both steps. Currently both are PHP changes;
the Flow team is figuring out how the second will work in the future. We
have no plans to install Flow on German Wikipedia let alone on any
particular talk page, so your discussions can focus on other issues.

The tone of your message made me want to cry, quit my job, and punch the
wall in frustration  :(  But I appreciate you being open about your dislike
and suspicion of Flow, and can only hope that over time Flow's growing
feature set leads users to *want* it enabled on the talk pages they visit.

--
=S Page  Core Features (Flow) engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTML templating decision soon?

2014-06-03 Thread S Page
On Jun 3, 2014 2:56 PM, Sumana Harihareswara suma...@wikimedia.org wrote:
 Some folks were testing [Knockoff]
 out and need to report back to the list with their verdicts.

Who? Tell us more!

 In the
 meantime, some developers (such as the Mobile and Flow teams) have
 short-term needs that can't wait up for Knockoff to become a complete
 solution, and so are working out interim standardizations outside of
 this mailing list so that they can move forward while Knockoff work
 continues. (Not sure what all of them are.)

MobileFrontend has been using Hogan JS templating since January 2013.

Flow recently chose handlebars JS templating because it has a working
fast PHP re-implementation (lightncandy) to support no-JavaScript
clients.

For client-side templating you need ResourceLoader to supply the
templates to the client. Jon Robson has developed the Mantle
extension[1] that implements
* a ResourceLoaderTemplateModule that does this
* JS functions that abstract getting a template, compiling and caching
it, and rendering it
* specific implementations of these functions for the handlebars and
hogan JS libraries.

MobileFrontend and Flow will start using this shared code in
production in the next few weeks or so.

In order for Flow to share templates between front-end JS and server
PHP, Flow has had to write helper functions in both JS and PHP[2].
Some like message i18n, human-friendly timestamps, escaping, etc. are
more generic than others.

These experiences in generalized JS template support and developing
helper functions across JS and PHP could inform Knockoff development.
So far the Flow team is doing well with handlebars/lightncandy
templating but we're not advocating it over Knockout/Knockoff. The
reactive model-view updating of Knockout (in JavaScript) is an
attractive additional feature missing from Hogan and handlebars
templating; again, Flow couldn't wait.

 Should I be saying Knockoff or Knockout?
From the RFC page, Gabriel Wicke & Matthew Walker's Knockoff
templates are KnockoutJS compatible. AIUI, GW & MW have a JS compiler
that compiles them into their Knockoff/TAssembly intermediate
representation, and their goal is to render templates in the latter
format from both PHP and JavaScript. In JavaScript you'd still load
the Knockout JS for its reactive model-view updates.

Hope this helps. No slight intended to any others working on GWMW MW KO code :)

[1] https://www.mediawiki.org/wiki/Extension:Mantle#Templates
[2] https://www.mediawiki.org/wiki/Flow/Architecture/Templating

--
=S Page  WikiMedia Features engineer

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] VisualEditor template editing interface

2014-04-28 Thread S Page
On Tue, Apr 22, 2014 at 11:28 AM, Trevor Parscal tpars...@wikimedia.orgwrote:

 I don't quite understand your question [linking templatedata easily to
 Wikidata], sorry. I'm talking about adding fields to TemplateData which
 VisualEditor uses to build a user interface around templates. TemplateData
 is defined on-wiki as a JSON blob.


It confused me too at first, since Wikidata is envisioned as a way to
supply data to infobox templates. Maybe call it a TemplateSpec.

I was also confused by its [Manage template documentation] button, I
thought at first clicking it would let you edit the /doc subpage of a
template, but it brings up the interface for editing the templatedata
block. The block does replace a lot of hand-written template
documentation; might templatedata be internationalized some day?

It's a really cool feature! It saves me opening "Templates used on this
page" to open a template in a new tab just to remind me of its parameters.

-- 
=S Page  Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Abandoning -1 code reviews automatically?

2014-04-15 Thread S Page
On Sun, Apr 13, 2014 at 1:37 PM, Marcin Cieslak sa...@saper.info wrote:

 I agree that -1 is practically a death penalty to a change.
 But that's not a positive development, because
 even a mild -1's completety discourages anybody to post
 a positive review (I wonder how many +1 or neutral
 comments were posted *after* some of the WMF reviewers
 posted a -1).


That seems dramatic and hasn't been my experience. A -1 does mean others
will be less likely to review, but it doesn't discourage positive comments
about the change. Often people will -1 a beneficial change because they see
how it can be improved further, but the answer is to respond to their
comment with "I've filed bug N for that suggestion", and invite them to
remove the -1.

-- 
=S Page  Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Free fonts and Windows users

2014-04-08 Thread S Page
In https://gerrit.wikimedia.org/r/#/c/124475/ (go back to sans-serif)
Legoktm claims "There was a consensus that listing only non-free fonts was
not acceptable"; that's not my recollection. Was a bug ever filed?

Kaldari valiantly tried to put non-free fonts first; that caused bug
63512. Now as I understand it, we're back to:
* Mac users get Helvetica Neue
* Windows users get Arial unless they have Helvetica Neue (unlikely) or
Helvetica (I can't reproduce bug 63662)
* Linux users get whatever F/OSS font fontconfig supplies for the
well-known string Helvetica; I get Nimbus Sans L
* Android users ?? (Nobody responded.)

quoting Isarra Yos


 Given that no objective and verifiable issues with this were ever provided
 ... Why? All this effort, and for what?


BECAUSE DESIGN. (I begged and pleaded with the talented designers who work
next to me to put something emphatic in the Typography refresh FAQ.) It's a
better design. It makes MediaWiki web sites look better for millions of our
users by mentioning proprietary fonts that 90+% of them have. That's not
objective and verifiable, it just is. Is it worth mentioning non-free
fonts? People disagree. But I'm saddened by the implicit and overt
hostility towards the art of design here ("its debatable whether this
actually represents progress", "it seems like things have shifted more to
managers at WMF make the decisions", etc.).

Regards,
-- 
=S Page  Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTML templating systems MediaWiki - is this summary right?

2014-04-02 Thread S Page
On Wed, Apr 2, 2014 at 11:09 AM, Sumana Harihareswara suma...@wikimedia.org
 wrote:

 TL;DR: who's testing out the non-Knockout approaches?


Besides those listed at [1]:
The Flow discussion system needs to render templates on both the client and
server[2]. The Flow team is going to use handlebars.js and its lightncandy
PHP implementation; we wanted to try KnockOff/TAssembly but the timing
isn't right. We will be ripping off :) MobileFrontend's integration of
Hogan.js client-side templates.

(Gabriel Wicke wrote "I know that for example handlebars is used in a few
teams right now." -- who else?)


 oojs - https://www.mediawiki.org/wiki/OOjs_UI -- could use this
 toolkit with one of the other template approaches, or maybe this is
 enough by itself!


As I understand it, OOjs UI is more a rich widget library than a
templating system. You would compose a page out of widgets that render what
you want, and yes you could use OOjs UI with a templating engine (it
operates on jQuery elements).


 Currently used inside VisualEditor and I am not sure
 whether any other MediaWiki extensions or teams are using it?


The Multimedia team is using OOjs UI for the "About this file" dialog in
the Media Viewer[3] (currently a beta feature). They haven't styled it to
use Agora controls.

Mobile is using VisualEditor with the beginnings of an Agora theme.

Hope this helps, corrections welcome.

[1]
https://www.mediawiki.org/wiki/Requests_for_comment/HTML_templating_library#Existing_implementations_in_MediaWiki_extensions
[2] https://www.mediawiki.org/wiki/Flow/Epic_Front-End#Templating
[3] https://www.mediawiki.org/wiki/Multimedia/About_Media_Viewer

-- 
=S Page  Features engineer on the Flow team
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] MediaWiki-Vagrant can run MediaWiki under HHVM!

2014-01-23 Thread S Page
On Sun, Jan 19, 2014 at 1:38 PM, Ori Livneh o...@wikimedia.org wrote:


 To switch MediaWiki from PHP to HHVM, simply run 'vagrant enable-role
 hhvm', followed by 'vagrant provision'


I did this and `vagrant provision` failed with

err: /Stage[main]/Hhvm/Package[hhvm-nightly]/ensure: change from purged to
present failed: Execution of '/usr/bin/apt-get -q -y -o
DPkg::Options::=--force-confold install hhvm-nightly' returned 100: Reading
package lists...
Building dependency tree...
Reading state information...
Some packages could not be installed. This may mean that you have
requested an impossible situation or if you are using the unstable
distribution that some required packages have not yet been created
or been moved out of Incoming.
The following information may help to resolve the situation:

The following packages have unmet dependencies:
 hhvm-nightly : Depends: libmemcached6 but it is not going to be installed
E: Unable to correct problems, you have held broken packages.

 (you have held broken packages in your arms, so poetic :-) )

The issue might be an old version of php5-memcached using libmemcached10,
which conflicts with libmemcached6.  I ran `sudo apt-get install php5-memcached`
which updated to libmemcached11 and then `vagrant provision` worked fine. I
didn't file a bug.

(I normally run `sudo unattended-upgrade` to update MW-Vagrant with
security updates but don't otherwise upgrade packages, which AIUI is
similar to how puppet updates labs instances.  I added "Update system
software?" to
https://www.mediawiki.org/wiki/MediaWiki-Vagrant#How_do_I.3F , Be Bold.)

if you aren't sure which interpreter is running, you can simply check under
 "Installed software" (or localized equivalent) in Special:Version.
 HHVM appears as '5.4.999-hiphop (srv)'


My Special:Version says PHP 5.3.10-1ubuntu3.9 (fpm-fcgi) :-(
I've tried killing php5-fpm, restarting apache2, `sudo /etc/init.d/hhvm
start`, etc.  More details in
https://bugzilla.wikimedia.org/show_bug.cgi?id=60384

--
=S Page  Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikipedia mobile app rewrite early test builds available

2013-11-22 Thread S Page
On Wed, Nov 20, 2013 at 6:33 PM, Brion Vibber bvib...@wikimedia.org wrote:


 Note that there's separate work going to revitalize the Wikipedia app for
 Firefox OS, which will remain HTML5-based as that's how native Firefox
 apps work!


I was going to ask if this could be a Wikipedia app for Firefox in general,
since most of Mozilla's WebAPIs are available on Firefox for Android and
desktop Firefox[1]. But the current version already is!,
https://marketplace.firefox.com/app/wikipedia installs on desktop Firefox.

Since all the second-tier phone OSes are saying "write HTML apps for us",
with some app manifest munging might the Firefox app work on them too? At
what point does a Web app become a modern HTML5 web site, or is it vice
versa?  :)

[1] https://wiki.mozilla.org/WebAPI#APIs

--
=S Page  Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Design] Should MediaWiki CSS prefer non-free fonts?

2013-10-29 Thread S Page
On Tue, Oct 29, 2013 at 4:07 PM, Matthew Flaschen

I set up http://jsfiddle.net/UPBUH/ as a quick testing ground.

Nice, I tweaked it to make http://jsfiddle.net/UPBUH/7/ ; you can
see what your browser + O.S. picks for each font name in the font
stack.

 When I check the Fonts tab of Firefox's Web Console (not Firebug), it
 shows Nimbus Sans L Bold system.  Used as:  Nimbus Sans L.

I get the same.
FWIW Chromium does something different for me: it's matching Helvetica but
not using Nimbus Sans L, something more like Liberation Sans.


  we're saying we want to use that free font (because it's free, and fits
 our intended design well).

If it's as good or better than Helvetica Neue, I think everyone agrees the
free font should come first. Yo, designers...? I think Quim goes further to
argue our sans-serif font list should be
   Nimbus Sans L, Liberation Sans, sans-serif
I have no idea what that font stack does on Windows/Mac/iOS/Android. It's
the last thing in http://jsfiddle.net/UPBUH/7/ , can people report back?

Kaldari:
Nimbus Sans L is the files n019003l.{afm,pfb,pfm} in
http://packages.ubuntu.com/saucy/all/gsfonts/download , the command to
extract one file is
   $ dpkg --fsys-tarfile /path/to/gsfonts_blahblah.deb | tar xOf - \
       ./usr/share/fonts/type1/gsfonts/n019003l.pfb > /tmp/NimbusSansL.pfb
and I sent the files to you on a 3 1/2 floppy ☺

Cheers,
-- 
=S Page  Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Should MediaWiki CSS prefer non-free fonts?

2013-10-28 Thread S Page
On Sun, Oct 27, 2013 at 11:11 PM, Faidon Liambotis fai...@wikimedia.orgwrote:

 On Mon, Oct 28, 2013 at 01:32:30PM +1100, Tim Starling wrote:

 Yes, we should prefer to use free software. We should also strive to
 ensure that our support for users on non-free platforms is optimal, as
 long as that doesn't negatively impact on users of free platforms. So
 I don't think it is a problem to specify non-free fonts in font lists.


 It's a bit more complicated than that. Linux distros ship with fontconfig
 (which is used by Cairo, which in turn is used by at least Firefox).
 Fontconfig aliases fonts via a set of rules and the default rules map
 popular non-free fonts to their free metric equivalents, or generics. e.g.
 $ fc-match Helvetica
 n019003l.pfb: Nimbus Sans L Regular
 ...

 This effectively means that, for Linux, having the free fonts at the end
 of the CSS font selection is probably[1]  a no-op: the browser will never
 fallback via the CSS, but match the first font on the list to an equivalent
 found on the system via fontconfig's fallback mechanisms.

Almost. fontconfig will use the first font in the font stack that has a
positive match. "Helvetica Neue" doesn't mean anything (so alone it would
give DejaVu Sans), but the following "Helvetica" has an alias to Nimbus
Sans L with binding=same in /etc/fonts/* , so Firefox uses that.


 It will be an educated guess and possibly do the right thing but it won't
 be what the web designer intended.


For the 2012 Login and Create account form redesign, the web designer
(Munaf Assaf and others) intended Helvetica Neue for text and Georgia for
some numbers. fc-match lets free software get close to that intended look.
The right thing happens! (The Login and Create account forms looked good on
my Ubuntu for the time when they specified a font stack.[*]) Free OSes
sometimes improve their supplied fonts and matching rules, so it's possible
they'll later ship something that matches even better. For example Google's
new Roboto is a nice Helvetica Neue. Brave users can make the decision
themselves by hacking /etc/fonts/*.

This basically strengthens your point: free fonts should be first in the
 list.


Only if the free font looks better.

[1]: I say probably, because I vaguely remember the interactions between
 Firefox  fontconfig to be complicated. Maybe they're being smarter --
 someone should test :)

Firefox works this way. It seems my Chromium prefers Nimbus Sans L even for
'sans serif'; it could be my setup, or
https://code.google.com/p/chromium/issues/detail?id=242046  I would love to
know what Android tablets do.

[*] The local improvement to fonts on those forms made them inconsistent
with the rest of MediaWiki, so their font stack was removed. The VectorBeta
feature applies better typography everywhere. It's really nice IMO.

--
=S Page  Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Which MariaDB engine does Wikimedia use?

2013-10-16 Thread S Page
On Wed, Oct 16, 2013 at 11:53 AM, David Gerard dger...@gmail.com wrote:

 Does Wikimedia use InnoDB, Maria or something else as the database
 engine for its MariaDB servers?


mediawiki-config (where most of http://noc.wikimedia.org/conf/
comes from) gives no indication that we override it, so AFAICT the
MediaWiki default in includes/DefaultSettings.php line 1541
   $wgDBTableOptions = 'ENGINE=InnoDB';
applies.
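
A third-party wiki wanting different table options could override that in
LocalSettings.php. A minimal sketch, not WMF's actual configuration:

    // Only affects tables created after this point; existing tables
    // keep the engine they were created with.
    $wgDBTableOptions = 'ENGINE=InnoDB, DEFAULT CHARSET=binary';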

WMF uses puppet to configure servers, so actual DB config settings come
from puppet manifests and roles, such as

http://git.wikimedia.org/blob/operations%2Fpuppet.git/18f719ef3d297878e198a864f5bb31dd9cb047af/manifests%2Frole%2Fcoredb.pp


-- 
=S Page WMF Features engineer
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] ContentHandler hook deprecations

2013-07-23 Thread S Page
On Tue, Jul 23, 2013 at 12:31 PM, Ori Livneh o...@wikimedia.org wrote:

 I see now that setting $wgDevelopmentWarnings = true will manifest usage of
 deprecated methods in PHP's error log, ...



 I'll do my bit by enabling this setting on MediaWiki-Vagrant :)


I did my bit by mentioning it in
https://www.mediawiki.org/wiki/Manual:How_to_debug#PHP_errors :)

-- 
=S Page  software engineer on Editor Engagement Experiments
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Wmfall] Announcement: C. Scott Ananian joins Wikimedia as Senior Features Engineer

2013-07-11 Thread S Page
Nell + OLPC/Litl nth-gen + (Wikipedia + WikiBooks) + WP:Zero =  the Young
Lady's Illustrated Primer: a Propædeutic Enchiridion for us thetes!

( https://en.wikipedia.org/wiki/The_Diamond_Age )

Welcome C. Scott and thanks for all your past, present, and future
contributions.

-- 
=S Page  software engineer on Editor Engagement Experiments
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal regarding a distinct Extension infobox status color for status unmaintained

2013-07-04 Thread S Page
On Wed, Jul 3, 2013 at 10:35 PM, Thomas Gries m...@tgries.de wrote:

 Am 04.07.2013 04:19, schrieb Matthew Flaschen:
  Someone proposed to have a _distinct color for status unmaintained._
  Status unmaintained is currently signalled with gray color, the same
  as for status unknown
  This makes sense to me.  Go for it.

 We as users cannot change the colors, because these are defined as
 classes ext-status-<extensionstatus>
 and are _not_ defined in https://www.mediawiki.org/wiki/Template:Extension .

They're defined in https://www.mediawiki.org/wiki/MediaWiki:Gadget-site.css

 Can someone perform a global wikitext source search on mediawiki on
 which pages the string ext-status- is used ?

I don't think you need to worry about someone using
class=ext-status-unmaintained elsewhere.

 00 BF FF (DeepBlueSky) may be a suitable background colour for
 unmaintained, see
 https://en.wikipedia.org/wiki/Web_colors#X11_color_names .

DeepSkyBlue is pretty cheery on my screen, would sir prefer something
in a decaying brown or green? I added ext-status-unmaintained to
Gadget-site.css and made it #8B4513 (SaddleBrown).  Whatever works.

BTW the template puts unmaintained extensions in Category:Not_LTS_ready .

Cheers,
--
=S Page  software engineer on Editor Engagement Experiments

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Semantic MediaWiki at the Danish National Library

2013-06-24 Thread S Page
I like Josef Assad's explanation of SMW in place of the big load of
do-not-care on its Wikipedia page:

   Semantic Mediawiki allows you to attach properties to wiki pages. And to
 query the pages based on these properties.


It doesn't quite capture the elephant (e.g. you add these properties with
wiki markup around existing links and values on the page), but it's short
and sweet.

How soon will this EAR equivalent come in a cardboard box to sell for
amounts of money so vast that the base license includes the very souls of
[at least one] fairly recent management school grads?
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki errors on Wikimedia wikis

2013-06-20 Thread S Page
On Wed, Jun 19, 2013 at 1:36 PM, Brian Wolff bawo...@gmail.com wrote:


 Is there any *public* list of which exceptions/errors they are. Seeing
 how many isn't all that helpful unless we know which ones. (yeah yeah
 I know, there's concerns about data leakage with backtraces, but just
 the exception names w/o backtrace should be safe (?))


Maybe, e.g. the current one I see if I tail -f
fluorine:/a/mw-logs/fatals.log is:

[20-Jun-2013 18:54:45] Fatal error: Call to a member function getCode() on
a non-object at
/usr/local/apache/common-local/php-1.22wmf8/includes/GlobalFunctions.php on
line 1288

Seems OK to display, but meanwhile in exceptions.log:

2013-06-20 18:30:45 mw1076 bswiki: [6d110124] /wiki/[redacted]   Exception
from line 3303 of
/usr/local/apache/common-local/php-1.22wmf7/includes/User.php:
User::addToDatabase: hit a key conflict attempting to insert user
'[redacted]', but it was not present in select!

So the exception/error alone can reveal stuff. And I guess it could hint at
an exploit (I hope neither of those do :-/ ).

If there's a problem on a WMF site, unless it's reproducible on a stock
test wiki, I think it'll need someone with access to the fluorine logs
machine.  For those that have access, 
https://wikitech.wikimedia.org/wiki/How_to_deploy_code#Test_and_monitor_your_live_code
and https://wikitech.wikimedia.org/wiki/Logs have advice about monitoring
logs and graphs.

--
=S Page engineer on Editor Engagement Experiments
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki errors on Wikimedia wikis

2013-06-20 Thread S Page
On Thu, Jun 20, 2013 at 12:06 PM, S Page sp...@wikimedia.org wrote:

 On Wed, Jun 19, 2013 at 1:36 PM, Brian Wolff bawo...@gmail.com wrote:

  just the exception names w/o backtrace should be safe (?))



 [20-Jun-2013 18:54:45] Fatal error: Call to a member function getCode() on
 a non-object at
 /usr/local/apache/common-local/php-1.22wmf8/includes/GlobalFunctions.php on
 line 1288


And that error alone is useless without the stack trace identifying the
caller. (BTW it's bug 49897.)

-- 
=S Page  software engineer on E3
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Announcing the Wikimedia technical search tool

2013-06-18 Thread S Page
On Sun, Mar 10, 2013 at 7:50 AM, Waldir Pimenta wal...@email.com wrote:

  Test it here: http://hexm.de/mw-search


Nice.  Is there a way to pass a query string to it, e.g.
http://hexm.de/mw-search?q=%s
? Then we could store this as a bookmarklet with keyword 'ts'[1] and type
`ts 49604` to search. It works if you do a custom search for foo and
replace the q=foo in the long www.google.com/cse URL with q=%s.

Nemo commented:

 gitblit is more robust and faster than gitweb, so it allows crawling by
 search engines.


It's working but gitblit pages have generic title tags and no meta
description or keywords, so the results don't show the title of a patch.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35624
suggests how HTML pages should be structured (though Google is deliberately
vague to hinder search result spammers) and
https://developers.google.com/custom-search/docs/structured_data talks
about rich snippets available to custom search (I've never tried it).


[1] like the essential jump to Wikipedia page 'w' bookmarklet
https://en.wikipedia.org/wiki/%s . Why search when you can go direct.

-- 
=S Page  software engineer on E3
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Search documentation

2013-06-17 Thread S Page

   https://www.mediawiki.org/wiki/Requests_for_comment/CirrusSearch


This made me realize I have a poor sense of the features in search, so I
read the documentation.

http://www.mediawiki.org/wiki/Help:Searching is bare-bones, and
https://en.wikipedia.org/wiki/Help:Searching disagrees. Perhaps the former
describes the default MediaWiki search features, while WMF has enabled more
and different features on its wikis (such as intitle: and incategory:
searches).

* mw help doesn't mention the * suffix to search for partial matches, or
the ~ prefix. Both work on my unmodified local wiki.

* enwiki says "Hello dolly" in quotes gives different results; mw directly
contradicts this. Even on my local wiki, quotes make a difference.

* enwiki disagrees with itself about what a dash in front of a word does.

I fixed mw search's explanation of the two-button search box (no longer the
default) but I don't know the details on the above.

--
=S Page  software engineer on Editor Engagement Experiments
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] migrating hooks doc to doxygen?

2013-06-08 Thread S Page
This is great. I think building HTML from source files is the way to go for
dry reference material like this.

You need links both ways so people know the other format is available.  The
hooks.txt should say "The documentation at
https://doc.wikimedia.org/mw-hooks/hooks_mainpage.html is regularly
generated from the master branch of the gerrit repository."

If mediawiki.org's extension template linked hooks in use to this doc
instead of mediawiki.org/Hooks:xyz pages then we could retire the latter
pages and have less stuff to maintain.
Except...

* It's sometimes useful that the Hooks pages are searchable along with the
rest of the documentation on mediawiki.org.

* The mw.org hooks pages have additional information. E.g. comparing
https://doc.wikimedia.org/mw-hooks/page_hooks_list.html#UserCreateForm and
http://www.mediawiki.org/wiki/Manual:Hooks/UserCreateForm , the mw.org page
has
** version = 1.6.0
** source = SpecialUserlogin.php
** These categorize it as [Hooks added in MediaWiki 1.6.0]
and [MediaWiki hooks included in SpecialUserlogin.php]

If we add this information to hooks.txt maybe there's a way doxygen can
show the information and produce tables similar to the categories.

- The hooks are documented in a separate file (still docs/hooks.txt),
 when we might want to have the doc near the wfRunHooks() call.


Hmm. On the one hand if they're all in one place it's easier to do
cargo-cult pattern matching when adding new hook documentation.  But if the
doc is in the PHP near the wfRunHooks call it's more likely people will
update it when making changes.
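
Something like this, say (a sketch; the @hook tag is hypothetical, and
today the text lives in docs/hooks.txt rather than at the call site):

    /**
     * @hook UserCreateForm
     * @param QuickTemplate $template The account creation form template;
     *  handlers may modify it before it renders.
     */
    wfRunHooks( 'UserCreateForm', array( &$template ) );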

Now that we have mw.hook() in JavaScript we have a second universe of hooks
to think about :)

-- 
=S Page  software engineer on Editor Engagement Experiments
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Doxygen tags - a proposal

2013-06-08 Thread S Page
Perhaps Ori is pointing out that doxygen (and jsduck) require needless
verbiage. The tools aren't smart enough to infer obvious information from
source on their own (or maybe they are but you're not sure and you see
other comments using these symbols so you copy and paste), so you wind up
repeating information in doc generation syntax.

And to what end?  I view doxygen as an external test that we're being
consistent in comments (quick, is it @param String $mystr  or @param $myStr
{string}  ?) but I never actually refer to the generated output. Does
anyone? Until someone builds a browser bridge that automatically scrolls
the right HTML into view as you move around in your editor and
automatically regenerates the HTML as you type, I don't see my habits
changing.
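
FWIW, as I understand MediaWiki's conventions it's lowercase type first,
e.g. (illustrative function, not real code):

    /**
     * Truncate a string.
     * @param string $myStr The string to truncate
     * @param int $limit Maximum length
     * @return string
     */
    function truncate( $myStr, $limit ) {
        return substr( $myStr, 0, $limit );
    }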

If web search engines could understand the generated documentation and
ranked it higher in search results it would be more useful and used more.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New git-review lets you configure 'origin' as the gerrit remote

2013-06-07 Thread S Page
This is great! Origin and gerrit remotes disagreeing could potentially
cause havoc.  saper and I went through some tutorials telling people to use
`git clone -o gerrit` when starting development, but this is much cleaner.
I added it to
https://www.mediawiki.org/wiki/Gerrit/Tutorial#Prepare_to_work_with_gerrit


For existing repos you wrote:

 You'll need to perform an additional step to migrate existing repositories.
 In each repository, run the following commands:

   git remote set-url origin $(git config --get remote.gerrit.url)
   git remote rm gerrit
   git review -s


If you don't do this, your first `git review` in each project after adding
git-review.conf will do the `git review -s` step for you.

If you have local branches tracking a gerrit remote (because you followed
the git advice/lore to always begin feature work with
   git checkout -b my_new_feature -t gerrit/master
) then they'll now be disconnected, and you'll
have empty sections for them in the project's .git/config.

When you try to update your local master branch, git nicely tells you how
to fix this:

MyProjRepo% git pull --ff-only
There is no tracking information for the current branch.
Please specify which branch you want to merge with.
See git-pull(1) for details

    git pull <remote> <branch>

If you wish to set tracking information for this branch you can do so with:

    git branch --set-upstream-to=origin/<branch> master

So do this for your local branches tracking the remote master on gerrit:

MyProjRepo% git branch --set-upstream-to origin/master master
MyProjRepo% git branch --set-upstream-to origin/master my_new_feature

etc.

--
=S Page
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Just noticed GitBlit

2013-06-06 Thread S Page
On Wed, Jun 5, 2013 at 9:49 PM, Tyler Romeo tylerro...@gmail.com wrote:

 So when did we get this new repo browser? Looks pretty nice, but I should
 note that all on-wiki gitweb links have now been broken.


I updated mediawiki.org's {{git file}} and {{git summary}}.  I left
templates like {{gitweb}} unchanged because extension boilerplate like
{{WikimediaGitCheckout}} uses it for "download this version/that snapshot",
and gitblit doesn't have an equivalent. But if
https://gerrit.wikimedia.org/r/gitweb is shut down then the templates
should be modified.

IMO {{WikimediaGitCheckout}} and templates like it add a lot of verbiage to
Extension pages, and should just output a link to a
git.wikimedia.org/summary/ URL and a link to "Help downloading an
extension".  But I'm not familiar with its history.

-- 
=S Page  software engineer on E3
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Quo Vadis, Vagrant

2013-06-03 Thread S Page
I added this to
https://www.mediawiki.org/wiki/MediaWiki-Vagrant/Customizing , please
add to it if you have real-world experience.

On Fri, May 31, 2013 at 4:41 PM, Ori Livneh o...@wikimedia.org wrote:
 On Fri, May 31, 2013 at 4:16 PM, Yuri Astrakhan 
 yastrak...@wikimedia.orgwrote:

 I so wish there was a way not to mess with files under git's control. Can
 that option be exposed through the local user config file?


 Yep. Create 'Vagrantfile-extra.rb' in the same directory as the Vagrantfile
 and add these lines:

 Vagrant.configure('2') do |config|
   config.vm.provider :virtualbox do |vb|
     vb.gui = true
   end
 end


--
=S Page  software engineer on E3

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Git for idiots

2013-05-10 Thread S Page
 I think that it might be a good idea to add another tutorial for
complete newbies.

Note Mediawiki.org doesn't have a Git tutorial. There are tons of those
on the web.  Thanks to recent work (by Quim and others I think) we have
three fairly rational pages,
https://www.mediawiki.org/wiki/Gerrit/Getting_started
https://www.mediawiki.org/wiki/Gerrit/Tutorial
https://www.mediawiki.org/wiki/Gerrit/Advanced_usage

Improve those. I'm certain more pages won't help.  Git+Gerrit is
fundamentally hard and complicated with lots of steps and commands, so the
tutorial is going to be long with lots of sections. Additional pages
writing down "Stuff I found difficult" before and after going through the
tutorial just add to the confusion.

Petr's document is useful for the dwindling band of people familiar with
svn, and I'm not sure why it mentions git push (I never use it, I use git
review with gerrit).

A big problem with the documents is inconsistent setup.  They don't even
agree whether the remote should be called origin, gerrit, or review,
because the experts who add to them have different opinions.

-- 
=S Page  software engineer on E3
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Coding style: Language construct spacing

2013-05-09 Thread S Page
 we all copied it...

I learned the idiom require_once( 'path.php' ); from

* My wiki's LocalSettings.php , generated by
includes/installer/LocalSettingsGenerator.php
* The LocalSettings.php of labs instances, in puppet labs-localsettings
* The installation instructions for every extension on mediawiki.org

All should change.

php.net has a comment agreeing with this change: "it will prevent your
peers from giving you a hard time and a trivial conversation about what
require really is"  :-)
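
Concretely, the change under discussion is just dropping the parentheses
(extension path is illustrative):

    // Before: parenthesized, so require_once looks like a function call
    require_once( "$IP/extensions/Cite/Cite.php" );

    // After: require_once is a keyword taking an expression
    require_once "$IP/extensions/Cite/Cite.php";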


On Wed, May 8, 2013 at 5:26 PM, Krinkle krinklem...@gmail.com wrote:

 Hi,

 Since there appears to have been a little bit of trivia around fixing
 these phpcs warnings, I'll open a thread instead.

 Both in javascript and PHP there are various keywords that can be used
 as if they are functions. In my opinion this is a misuse of the
 language and only causes confusion.

 I'm referring to code like this (both javascript and php):

 delete( mw.legacy );

 new( mw.Title );

 typeof( mw );

 echo( $foo . $bar );

 print( $foo . $bar );

 return( $foo . $bar );

 … and, wait for it..

 require_once( $foo . $bar );

 I think most experienced javascript developers know by now that using
 delete or new like it is a function is just silly and looks like
 you don't know what you're doing.

 To give a bit of background, here's why these work at all (they aren't
 implemented as both keywords and functions, just keywords). Though I'm
 sure the implementation details differ between PHP and javascript, the
 end result is the same: Keywords are given expressions which are then
 evaluated and the result is used as value. Since expressions can be
 wrapped in parenthesis for readability (or logic grouping), and since
 whitespace is insignificant to the interpreter, it is possible to do
 `return("test")`, which really is just `return ("test")` and
 eventually `return "test"`.

 I'm obviously biased, but I think the same goes for require_once
 (and include, require etc.). Right now this is causing quite a few
 warnings in our php-checkstyle report.

 I didn't disable that rule because it appears (using our code base as
 status quo) that we do want this. There's 0 warnings I could find in
 our code base that violate this, except for when the keyword is
 include|require(_once)?

 The check style sniffer does not (and imho should not) have a separate
 rule per keyword. Either you use constructs like this, or you don't.

 But let's not have some weird exception just because someone didn't
 understand it[1] and we all copied it and want to keep it for no
 rational reason.

 Because that would mean we have to either hack the sniffer to exclude
 this somehow, or we need to disable the rule, thus not catching the
 ones we do use.

 See pending change in gerrit that does a quick pass of (most of) these
 in mediawiki/core:

 https://gerrit.wikimedia.org/r/62753


 -- Krinkle

 [1] Or whatever the reason is the author originally wrote it like
 this. Perhaps PHP was different back then, or perhaps there was a
 different coding style.

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
=S Page  software engineer on E3
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Who uses MobileFrontend?

2013-05-02 Thread S Page
WikiApiary (the _Check usage and version matrix_ link on the
extension's page) suggests the following wikis use MobileFrontend
beyond all of the WMF sites.  Oddly, this list doesn't include
Uncyclopedia, though its page
http://wikiapiary.com/wiki/Uncyclopedia#tab=Extensions lists
MobileFrontend.

15Mpedia 0.7.0
Amahi Wiki 0.7.0
BattleTechWiki
BattleTechWiki
Bulbapedia 0.7.0
Deimos.fr 0.7.0
Document Foundation Wiki 0.7.0
Espiral 0.7.0
Future Of Mankind 0.7.0
Future Of Mankind 0.7.0
IAMMediaWiki 0.7.0
IAMMediaWiki 0.7.0
Laub-Home.de Wiki 0.7.0
MarketsWiki 0.7.0
...
Nablawiki 0.7.0
OpenStack 0.7.0
Slacky.eu 0.7.0
The Business Model Project 0.7.0
Traadita Wiki by Jan  Co. 0.7.0
Wiki Aventurica 0.7.0

On Tue, Apr 30, 2013 at 12:36 PM, Max Semenik maxsem.w...@gmail.com wrote:
 Hi, as one of WMF developers who work on MobileFrontend[1], I'd love to
 know how third parties use this extension. How does your caching
 works? How are you detecting mobile devices? Do you have any problems
 with running it? Finally, just tell us if you tried it at all:)


 
 [1] https://www.mediawiki.org/wiki/Extension:MobileFrontend

 --
 Best regards,
   Max Semenik ([[User:MaxSem]])


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l



-- 
=S Page  software engineer on E3

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] Testing and monitoring deployed code

2013-04-30 Thread S Page
On Tue, Apr 23, 2013 at 3:23 PM, Ori Livneh o...@wikimedia.org wrote:

 I'll add [ganglia exception graphing] to the post-deployment instructions if 
 people find it useful.

Seasoned deployment professionals, please edit the updated
https://wikitech.wikimedia.org/wiki/How_to_deploy_code#Test_and_monitor_your_live_code

I think we can express best practices better than "stay on IRC in case
people yell 'Fire!'" (though that's a good start :)

Cheers,
--
=S Page  software engineer on Editor Engagement Experiments

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Clicktracking being phased out

2013-04-12 Thread S Page
On Fri, Apr 12, 2013 at 11:39 AM, Yury Katkov katkov.ju...@gmail.com wrote:

 Yes, [ClickTracking] is one of the dependencies of ArticleFeedback.

Note the latest version of AFT drops the dependency.  Earlier versions
of AFT (and AFTv5, WikiLove, the Vector extension, and a few others)
have ResourceLoader modules that depend on ClickTracking modules, but
their actual invocations of ClickTracking functionality are usually
buried inside
    if ( bucketing && bucketCookie && trackOn && UNLIKELY ) {
    // invoke $.getBuckets(), $.trackAction(), etc.
    }

so I suspect all wikis using ClickTracking have it installed to meet
the requirements of some other extension but aren't set up to track
anything. If you are using it, speak up!  Most commercial wikis use
Google Analytics and other third-party tools.

  By the way, I
 think this project is a-we-so-me because it allows you
 to check who use the extension:
 http://wikiapiary.com/wiki/Extension:Click_Tracking

Very useful, thanks.

--
=S Page  software engineer on Editor Engagement Experiments

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New git-review version revives year-old bug

2013-04-11 Thread S Page
I added the workaround and background to
https://www.mediawiki.org/wiki/Gerrit/resolve_conflict , which also
covers gerrit rebase failures.

On Mon, Apr 8, 2013 at 2:54 PM, Roan Kattouw roan.katt...@gmail.com wrote:
 About a year ago, we were struggling with a git-review bug that caused
 lots of bogus warnings to appear. When running git review, you'd get
 a warning saying you're about to submit multiple commits, followed by
 a list of lots of other people's commits that have already been
 merged. I fixed this in https://review.openstack.org/#/c/6741/ last
 year.

 This bug is now back in the latest release of git-review that came out
 over the weekend. I complained about this at
 https://review.openstack.org/#/c/20450/ , which is the change that
 reintroduced the broken behavior. We are suffering from it
 disproportionately because we have defaultrebase=0 set on most of our
 projects, and the bug only triggers when rebasing is disabled (using
 either that setting or the -R flag).

 The workaround is the same as last year: if git-review complains and
 you see bogus commits in the list, respond no to abort, run git
 fetch gerrit, then rerun git-review. This will ensure git-review has
 an up-to-date view of the remote master.

 Roan

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l



-- 
=S Page  software engineer on E3

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit actively discourages discussion

2013-04-08 Thread S Page
On Mon, Apr 1, 2013 at 8:31 PM, Jeremy Baron jer...@tuxmachine.com wrote:
 Another big one is (1 comment) which gives no information about the
 comment or where it was made and no easy way to jump straight to it.
 (a link or AJAX or whatever).

This one drives me crazy, especially after I've responded to existing
comments on Patch Set N. When someone new comments on the same patch
set, how can I find the new comments?  The best I can do is visit
every file in the patch set that has comments and Ctrl+F/Ctrl+G to
search for the new reviewer's name.

It's already filed as https://bugzilla.wikimedia.org/show_bug.cgi?id=46792

--
=S Page  software engineer on E3

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Mediawiki-api] Provisional API extension for CAPTCHA on action=createaccount

2013-03-25 Thread S Page
On Thu, Mar 14, 2013 at 3:55 PM, Brion Vibber bvib...@wikimedia.org wrote:

 MediaWiki core: https://gerrit.wikimedia.org/r/53793
 ConfirmEdit ext: https://gerrit.wikimedia.org/r/53794

 So far I've tested it with the default 'math captcha' mode, with this test
 rig: https://github.com/brion/mw-createaccount-test

This is great to see.

Using your test rig or Special:APISandbox, the API response warns about
"Unrecognized parameters: 'wpCaptchaId', 'wpCaptchaWord'" when I get
the captcha wrong.

It seems if the user gets the captcha wrong, there's no explicit
indication like captcha-createaccount-fail ('Incorrect or missing
confirmation code.'). Instead the API reports a generic Failure
result, and the UI presents a new captcha.

ConfirmEdit has a getMessage() to provide action-specific text like
fancycaptcha-createaccount. Perhaps the API should pass that back as
well. Otherwise the UI has to know the details of the captcha in use
so it can get a message for it.

The current CreateAccount form submission to Special:UserLogin reports
many form errors like username exists, password wrong, etc. before it
runs the AbortNewAccount hook where ConfirmEdit checks the captcha.
But APICreateAccount runs the APICreateAccountBeforeCreate hook early,
before it dummies up a login form and calls the same validation. So
users will go through the frustration of getting the captcha right
before being told their username isn't available or their password
isn't long enough.

There's also the weirdness that ApiCreateAccount winds up checking the
CAPTCHA twice. AIUI, here's the program flow:

ApiCreateAccount()
    Runs the APICreateAccountBeforeCreate hook (captcha may abort)
    Creates a login form and calls $loginForm->addNewaccountInternal()
addNewaccountInternal()
    Does a bunch of form validation
    Runs the AbortNewAccount hook (captcha may abort; TitleBlacklist,
    AntiSpoof, etc. may also abort)

If ApiCreateAccount() could tell there was a captcha failure within
addNewaccountInternal and could ask the captcha to addCaptchaAPI() to
the result, then we wouldn't need the new APICreateAccountBeforeCreate
hook.

It would be nice if captcha was always checked on its own hook instead
of sharing a hook with other extensions. That would let a future
validation API run the username past TitleBlacklist and AntiSpoof
without getting shot down by the captcha.

Cheers,
--
=S Page  software engineer on E3

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Deployment Highlights - 2013-03-15 (NOW WITH LIGTNING DEPLOYS!)

2013-03-18 Thread S Page
On Fri, Mar 15, 2013 at 9:38 PM, MZMcBride z...@mzmcbride.com wrote:

 Looks pretty good. My only tweak was changing the IRC channel to
 #wikimedia-tech [from #wikimedia-deployments]. It's an old and established
 channel and seems like a perfect fit for this. :-)


How to deploy[1] tells us to "Join #wikimedia-operations and
#wikimedia-tech on Freenode and be available before and after all changes",
so I added #wikimedia-operations. It's where bots show and log deploy
messages, and it's where the people who'll have to clean up the wreckage
hang out.

Feel better, Greg.

[1] https://wikitech.wikimedia.org/wiki/How_to_deploy_code

--
=S Page  software engineer on E3
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] 40 million links to Wikipedia

2013-03-11 Thread S Page
On Sat, Mar 9, 2013 at 6:32 PM, Peter Kaminski kamin...@istori.com wrote:
 Here's big data dataset from Google Research and UMass IESL, 40 million
 links to Wikipedia pages where the anchor text of the link closely matches
 the title of the target Wikipedia page, from 10 million web pages, for the
 purposes of contextualized disambiguation:

I wonder how many disambiguation links to Wikipedia fail to
disambiguate. People assume Wikipedia has a link and it never lets you
down, but it's often the wrong thing. E.g. "_John McLaughlin_ formed
Mahavishnu Orchestra" (links to a disambiguation page) or "gerrit is
written in _Java_" (links to the island, not the language).  _John
Howard_ will no longer link to the Australian politician if someone
more famous comes along.

"how to find out if different web pages are talking about the same
person or other entity"
Wikidata removes all doubt, http://www.wikidata.org/wiki/Q164757 ! I
assume that other knowledge projects have noticed these entities, and
that Q numbers are becoming a lingua franca.  I'm reserving Q42666789
for my talented sure-to-be famous offspring. :-)

Google clearly enjoys the fruits of Wikipedians' hard work.

--
=S Page  software engineer on Editor Engagement Experiments

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] deployment of the first phase of Wikidata on enwp

2013-02-11 Thread S Page
On Mon, Feb 11, 2013 at 10:49 AM, Antoine Musso hashar+...@free.fr wrote:

  http://www.wikidata.org/wiki/Q7186

 From there our awesome community can fill information about Marie
 Curie such as the interwiki links but also her place of birth, spouse
 or image ...

I assume the [ add source ] is the equivalent of {{citation needed}}
for facts. I couldn't find any help or guidance on filling it in,
though the FAQ says "Some pieces are not yet working, however,
including the sources interface."

 That project is going to rock'n roll once it is deployed everywhere. I
 cant wait for the English community to start using it :-]

Yes, so many facts!

--
=S Page  software engineer on E3

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Corporate needs are different (RE: How can we help Corporations use MW?)

2013-02-11 Thread S Page
On Thu, Feb 7, 2013 at 1:31 PM, Daniel Barrett d...@vistaprint.com wrote:
 ...
 1. A desire for a department to have their own space on the wiki.

I assume you looked at enabling subpages in the main namespace?[1]
That way Human Resources/Payroll/Show_me_the_money gets a nice
breadcrumb up to Payroll and Human Resources landing pages.  You can
encourage people to create subpages rather than making yet another
top-level page by putting [Create page] forms on landing pages that
use a local template[2] and prepend the local hierarchy.

 I'm not talking about access control, but (1) customized look  feel, and (2) 
 ability to narrow searches to find articles only within that space.

(1) Code could infer the subpage hierarchy and apply CSS from a
corresponding CSS hierarchy; a rough sketch follows below.

(2) Add prefix: to searches to narrow them to subpages; you can make a
form for it[3].  Also Special:PrefixIndex can be helpful, e.g. just
listing all subpages of the current landing page:
  == Subpages of {{FULLPAGENAME}}==
  {{Special:PrefixIndex/{{FULLPAGENAME}}/}}
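
Back to (1), a rough untested sketch: MediaWiki's OutputPageBodyAttributes
hook could turn the top-level component of the title into a body class, so
Human Resources/Payroll/... pages can be themed from CSS (the class prefix
is illustrative):

    $wgHooks['OutputPageBodyAttributes'][] = function ( $out, $skin, &$bodyAttrs ) {
        // Derive a department class from the first subpage component.
        $parts = explode( '/', $out->getTitle()->getText() );
        $bodyAttrs['class'] .= ' dept-' . Sanitizer::escapeClass( $parts[0] );
        return true;
    };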


Cheers,

[1] http://www.mediawiki.org/wiki/Manual:$wgNamespacesWithSubpages

[2] something like
<inputbox>
type=create
preload=Template:Human Resources meeting
buttonlabel=Create a new page for a Human Resources meeting
default=Human Resources/Meetings/{{CURRENTYEAR}}-{{CURRENTMONTH}}-{{CURRENTDAY}}
width=40
bgcolor=#f0f0ff
</inputbox>

[3] http://en.wikipedia.org/wiki/Template:Search_box and similar.

--
=S Page  software engineer on Editor Engagement Experiments

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] varying ResourceLoader module dependencies

2013-01-27 Thread S Page
How can an extension have a soft requirement on a module, so that it
only adds that module to its ResourceLoader dependencies if the
extension providing that module is loaded?

For example, there's no sense in depending on ext.eventLog if the wiki
doesn't load Extension:EventLogging, but if it's available then you
want the module sent to the browser along with your code. (Obviously
your JS would check to see if the module's functions are available
before invoking them.)

I think your extension can't change dependencies when specifying its
module in $wgResourceModules[]; it has to do it later. Maybe in a
ResourceLoaderRegisterModules hook (although RL runs that before it
registers the static wgResourceModules array), or possibly in a setup
function you add to $wgExtensionFunctions.

Or, in each onHook handler where you call addModule(), you could test
there and add the optional module.
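
Here's a minimal sketch of the $wgExtensionFunctions idea (untested, and
'ext.myExtension' is an illustrative module name):

    // By the time extension functions run, every extension's setup file
    // has been included, so we can see what modules were registered.
    $wgExtensionFunctions[] = function () {
        global $wgResourceModules;
        if ( isset( $wgResourceModules['ext.eventLogging'] ) ) {
            $wgResourceModules['ext.myExtension']['dependencies'][] =
                'ext.eventLogging';
        }
    };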

Thanks for any insight, I'll add it to mediawiki.org.


--
=S Page  software engineer on Editor Engagement Experiments

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] varying ResourceLoader module dependencies

2013-01-27 Thread S Page
I mean: how can an extension's ResourceLoader module have a soft
dependency on a module provided by another extension?

If ext.eventLogging is available on this wiki, I want to use its
functionality in my extension's JS. But my JS can live without it.
Extensions with similar soft dependencies on ClickTracking express it
as a hard requirement, which makes them harder to use on other wikis.

Thanks.


On Sun, Jan 27, 2013 at 5:07 PM, Brian Wolff bawo...@gmail.com wrote:
 I'm confused. You want your extension to load the module only if the
 extension is loaded? If your extension isnt loaded than what is loading
 your module.

 If you only want to load the module for certain configurations, that sounds
 like a job for $wgExtensionFunctions
 -bawolff
 On 2013-01-27 8:58 PM, S Page sp...@wikimedia.org wrote:

 How can an extension have a soft requirement on a module, so that it
 only adds that module to its ResourceLoader dependencies if the
 extension providing that module is loaded?

 For example, there's no sense in depending on ext.eventLog if the wiki
 doesn't load Extension:EventLogging, but if it's available then you
 want the module sent to the browser along with your code. (Obviously
 your JS would check to see if the module's functions are available
 before invoking them.)

 I think your extension can't change dependencies when specifying its
 module in $wgResourceModules[], it has to do it later. Maybe in a
 ResourceLoaderRegisterModules hook (although RL runs that before it
 registers the static wgResourceModules array), or possibly in a setup
 function you add to $wgExtensionFunctions

 Or, in each onHook handler where you call addModule(), you could test
 there and add the optional module.

 Thanks for any insight, I'll add it to mediawiki.org.


 --
 =S Page  software engineer on Editor Engagement Experiments

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Advertising changes to MediaWiki messages in core

2012-12-04 Thread S Page
Bug 42215[1] discussed this particular change, and I sought the advice
of the mediawiki-i18n list about it (the consensus was a new message
rather than fiddling with the semantics of the old one).  But I don't
think mediawiki-i18n list is involved in per-wiki message changes, I
assume it's something an admin does.

If a wiki hasn't overridden the welcomecreation message then there's
nothing to do except check to see if the default messages have been
translated into its language (55 & 64 translations so far).[3]

Steven Walling swall...@wikimedia.org wrote:
 We can go look and see which wikis have written
 custom content, and inform them they need to migrate it to the new message.

I wrote a little script last night to check for the existence of the
old and new messages, and it seems there are 205 wikis that did
override welcomecreation (!!?), so ideally someone should find the
[[Wikipedia:MediaWiki messages]] or [[Village pump (technical)]] page
for those 205 wikis, or otherwise contact their admins.  The list of
wiki messages is at
https://www.mediawiki.org/wiki/User:S_Page_%28WMF%29/welcomecreation_messages
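
A hypothetical reconstruction of such a check for a single wiki (not the
actual script), using the action API: an overridden message exists as a
page in the MediaWiki namespace.

    $api = 'https://en.wikipedia.org/w/api.php?action=query&format=json'
        . '&titles=' . urlencode(
            'MediaWiki:Welcomecreation|MediaWiki:Welcomecreation-msg' );
    $result = json_decode( file_get_contents( $api ), true );
    foreach ( $result['query']['pages'] as $page ) {
        // Pages that don't exist come back with a 'missing' key.
        $state = isset( $page['missing'] ) ? 'default' : 'overridden';
        echo "{$page['title']}: $state\n";
    }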

Matthew Flaschen mflasc...@wikimedia.org wrote:
 Yeah, if there's a required action item (you *have* to do this) for each
 wiki, no matter how brief the action, I think it's generally worth the
 blast.

My instructions are in the parent of the list of wikis,
https://www.mediawiki.org/wiki/User:S_Page_%28WMF%29

I'm curious if any of those 205 wikis are doing inventive "Welcome
aboard, now read a tutorial/say hi/fix a page/play in the sandbox"
things. It's an area our team (Editor Engagement Experiments) cares
about.

Thanks for your insights.

[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=42215
[2] 
https://translatewiki.net/w/i.php?title=Special:Translations&message=MediaWiki:Welcomeuser
[3] Translatewiki suggests
* the new "Welcome, $1!" title message welcomeuser has been translated into 64
languages[4]
* the new "Your account has been created. \ Do not forget to change
your preferences" message has been translated into 55 languages[5].

[4] 
https://translatewiki.net/w/i.php?title=Special%3ATranslations&message=Welcomecreation&namespace=8
[5] 
https://translatewiki.net/w/i.php?title=Special%3ATranslations&message=Welcomecreation-msg&namespace=8

--
=S Page  software engineer on E3

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Customization of welcomecreation on different wikis

2012-12-04 Thread S Page
On Tue, Dec 4, 2012 at 12:17 PM, Amir E. Aharoni
amir.ahar...@mail.huji.ac.il wrote:
 2012/12/4 S Page sp...@wikimedia.org:

 https://www.mediawiki.org/wiki/User:S_Page_%28WMF%29/welcomecreation_messages

 This made me feel very \o/.

 Research about current customizations of messages on projects is
 probably the number one thing that I'd like to see happen in the
 Wikimedia dev community.

Glad to help, and it's not hard to do as a one-off job.  I understand
these are the sorts of jobs that toolserver runs but I've no
experience with it.  Maybe there's a way for translatewiki to show "for
language X, here are the Wikipedias, Wikibooks, Wikiquotes, etc. that
have customized this message (as of last run at 2012-12-03 19:50
UTC)".

 The most inventive is probably the Hebrew Wikipedia (
 https://he.wikipedia.org/wiki/MediaWiki:Welcomecreation ), which has a
 lot of graphics. It has three big buttons ...

And in this case it doesn't look like anyone has copied that
customized message to
he.wikipedia.org/wiki/MediaWiki:Welcomecreation-msg.  1.21wmf5 will
launch on wikis tomorrow (today) December 4th.

 Most customized welcome pages say a few words that are specific to the
 project, such as Welcome to the Russian Wikiversity. Sign up to your
 school.

The first part could be handled by using {{SITENAME}} in the default message.

--
=S Page  software engineer on E3

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Jenkins now lints javascript!

2012-11-21 Thread S Page
I have vim set up so pressing
  :J<Tab><Enter>
runs jshint within vim, and I can step through the error lines.  It's a
huge time saver.

The relevant lines from my ~/.vimrc:

" Use pathogen in order to use vim-jshint
" per https://github.com/tpope/vim-pathogen :
call pathogen#infect()
" The above means my `git clone https://github.com/walm/jshint.vim.git`
" in ~/.vim/bundle is loaded.

Timo, should we be using .jshintrc per directory or lines like
/*jshint multistr:true */
at the top of files?  Or both?

grunt.js build, love it

-- 
=S Page  software engineer on E3
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

