Re: [Wikitech-l] What is "revision slot" and why is it necessary to specify it now?

2020-04-06 Thread Brad Jorsch (Anomie)
On Mon, Apr 6, 2020 at 1:06 PM Jeroen De Dauw 
wrote:

> Related question: is it possible to edit existing slots and add new ones
> via the API?
>

Not yet. That's tracked as T208801
<https://phabricator.wikimedia.org/T208801>.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] What is "revision slot" and why is it necessary to specify it now?

2020-04-06 Thread Brad Jorsch (Anomie)
On Mon, Apr 6, 2020 at 10:51 AM Petr Bena  wrote:

> Or is that this part of result which indicates they are being
> supported?
>
> ...
>
> "limit": 500,
> "deprecatedvalues": [
> "parsetree"
> ]
> },
> {
> "index": 2,
> >>>>"name": "slots",  <<<<
> "type": [
> "main"
> ],
>
> ...
>

Yes, that's the part. Further up in the data structure you can see that it
indicates all parameters for the module are prefixed by "rv".
```

"group": "prop",
>>>>"prefix": "rv", <<<<
"source": "MediaWiki",

```
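
Putting the two together, a rough Python sketch of a slot-aware content
request, assuming the requests library (the endpoint and page title are
placeholders):

```
import requests

# The module's "rv" prefix turns the "slots" parameter from the paraminfo
# output into "rvslots"; "main" is the slot every page has.
resp = requests.get("https://www.mediawiki.org/w/api.php", params={
    "action": "query",
    "prop": "revisions",
    "titles": "MediaWiki",
    "rvprop": "content",
    "rvslots": "main",
    "format": "json",
    "formatversion": "2",
}).json()

page = resp["query"]["pages"][0]
content = page["revisions"][0]["slots"]["main"]["content"]
```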

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] What is "revision slot" and why is it necessary to specify it now?

2020-04-06 Thread Brad Jorsch (Anomie)
On Sun, Apr 5, 2020 at 5:24 PM Daniel Kinzler 
wrote:

> Am 05.04.20 um 23:18 schrieb Petr Bena:
> > But still, I am curious what is the recommended approach for someone
> > who wants to develop their application "properly" in a way that it's
> > backward compatible? First query the MediaWiki version via API, and
> > based on that decide how to call APIs? I can't think of any other way.
>
> Yes, that's indeed the only way if the client is to be entirely generic.
> Though
> deprecated API behavior tends to stay supported for quite a while,  much
> longer
> than deprecated PHP methods. So checking the MediaWiki version first would
> only
> be needed if you wanted your code to work with a wide range of MediaWiki
> versions.
>

Rather than checking the MediaWiki version, I'd generally recommend
checking against the action=paraminfo endpoint for the module in question
for something like this. If e.g.
https://www.mediawiki.org/w/api.php?action=paraminfo&modules=query+revisions
reports that rvslots is available then use it, otherwise use back-compat
code.
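
For illustration, a rough Python sketch of that capability check, assuming
the requests library (the endpoint is a placeholder):

```
import requests

API = "https://www.mediawiki.org/w/api.php"

def supports_rvslots(session):
    # Parameter names in paraminfo output are unprefixed, so look for
    # "slots" rather than "rvslots".
    data = session.get(API, params={
        "action": "paraminfo",
        "modules": "query+revisions",
        "format": "json",
        "formatversion": "2",
    }).json()
    params = data["paraminfo"]["modules"][0]["parameters"]
    return any(p["name"] == "slots" for p in params)

session = requests.Session()
if supports_rvslots(session):
    extra = {"rvslots": "main"}  # slot-aware request
else:
    extra = {}                   # back-compat path for older wikis
```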

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

[Wikitech-l] BREAKING CHANGES to Action API parameter validation

2020-02-04 Thread Brad Jorsch (Anomie)
This notice is being sent to wikitech-l for the benefit of technical
subscribers who aren't subscribed to mediawiki-api-announce, due to concern
that these changes may affect many API clients. For notification of all breaking
changes to the Action API, please subscribe at
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce.

Error codes for parameter validation errors are changing. Among others,
"noX" becomes "missingparam" and "unknown_X" becomes "badvalue". See the
full announcement at
https://lists.wikimedia.org/pipermail/mediawiki-api-announce/2020-February/000151.html
for details.

Various unusual values for integer-type parameters will no longer be
accepted, basically anything that isn't an optional ASCII sign ('+' or '-')
followed by ASCII digits. See
https://lists.wikimedia.org/pipermail/mediawiki-api-announce/2020-February/000152.html
for details.
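
A rough Python sketch of the new acceptance rule as described above (the
server-side validation may differ in details):

```
import re

def is_valid_api_integer(value):
    # An optional ASCII sign followed by ASCII digits; values like "1e3",
    # " 5", or non-ASCII digits would now be rejected.
    return re.fullmatch(r"[+-]?[0-9]+", value) is not None
```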

Both of these changes will most likely go out to Wikimedia wikis with
1.35.0-wmf.19. See https://www.mediawiki.org/wiki/MediaWiki_1.35/Roadmap
for a schedule.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Reason for actionthrottledtext API blocking?

2020-01-13 Thread Brad Jorsch (Anomie)
On Mon, Jan 13, 2020 at 9:57 AM Baskauf, Steven James <
steve.bask...@vanderbilt.edu> wrote:

> 1. The bot account doesn't have a bot flag.
> https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/DanmicholoBot
> implies that the lack of a bot flag might be the problem. But the script is
> not an autonomous bot and is still under development, so I didn't think a
> bot flag was necessary under those circumstances.
>

This is relevant only in that the bot group includes the "noratelimit"
right, which bypasses rate limiting in many cases.


> 2. I'm not running the script from paws.wmflabs.org .  Is it a problem
> that the API calls are coming from an unknown IP address? Does that matter?
> 3. I'm not using Pywikibot.  But that shouldn't be necessary.
>

Neither of these should be relevant.

On the other hand, a relevant factor is that only certain actions are
subject to the rate limits, and different actions have different limits.
The other user you discussed it with may have been using different actions
that aren't limited, or that have limits he wasn't hitting.
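
One way to see which limits and rights apply to your own account is
meta=userinfo; a rough Python sketch, assuming the requests library (the
endpoint is a placeholder, and the session must be logged in to reflect
your account's limits rather than those for IP users):

```
import requests

session = requests.Session()  # assumed to be authenticated
resp = session.get("https://www.wikidata.org/w/api.php", params={
    "action": "query",
    "meta": "userinfo",
    "uiprop": "ratelimits|rights",
    "format": "json",
}).json()
userinfo = resp["query"]["userinfo"]
print(userinfo.get("ratelimits"))                    # the limits that apply
print("noratelimit" in userinfo.get("rights", []))   # whether you bypass them
```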


> The other thing that is different about what I'm doing and what is being
> done by the other user who is not encountering this problem is that I'm
> authenticating directly by establishing a session when the script starts
> (lines 347-349).  The other user apparently somehow doesn't have to
> authenticate when using Pywikibot as long as he's logged in to
> paws.wmflabs.org .  But I haven't dug in to find out how Pywikibot works,
> so I don't really understand how that is possible or whether that's
> important in establishing that the bot is legitimate and not a vandal/spam
> bot.
>

He may be using OAuth. See
https://www.mediawiki.org/wiki/OAuth/For_Developers and
https://www.mediawiki.org/wiki/OAuth/Owner-only_consumers for some details.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Composer does not know mediawiki/mediawiki on dry-runs

2019-11-15 Thread Brad Jorsch (Anomie)
On Fri, Nov 15, 2019 at 11:06 AM Lucas Werkmeister <
lucas.werkmeis...@wikimedia.de> wrote:

> That was six years ago, predating extension registration – I would assume
> that nowadays extensions are supposed to declare their compatible MediaWiki
> versions in extension.json
> <
> https://www.mediawiki.org/wiki/Manual:Extension_registration#Requirements_(dependencies)
> >
> (details
> <https://www.mediawiki.org/wiki/Manual:Extension.json/Schema#requires>).
>

It also predated T467 RfC: Extension management with Composer
<https://phabricator.wikimedia.org/T467>, which rejected the proposal to
manage extensions via composer.

I think you assume correctly.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

[Wikitech-l] Pre-MCR database fields are no longer being written on group 0 (and soon on the rest of the Wikimedia wikis too)

2019-11-12 Thread Brad Jorsch (Anomie)
We've reached the point in deployment of the Multi-Content Revisions
changes where we're turning off the writing of data to deprecated fields.
In particular, rev_text_id and ar_text_id will be 0 in all revisions, and
rev_content_model, rev_content_format, and corresponding fields in the
archive table will always be null.

This has been the case on testwiki since October 29, and on the remaining
group 0 wikis since a few minutes ago. Assuming no issues turn up, we plan
to roll it out to group 1 on Thursday and group 2 early next week.

If you find any MediaWiki bugs that seem as if they might be related to
this change, please file them in Phabricator and tag #Core-Platform-Team.

Thanks!

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Dropping PHPUnit 4 support

2019-10-07 Thread Brad Jorsch (Anomie)
On Sun, Oct 6, 2019 at 4:35 AM Daimona  wrote:

> And in turn, this happens because MediaWikiUnitTestCase validates the
> directory structure, but some repos do not comply.


That is not the only reason a test might use PHPUnit\Framework\TestCase.
Another is that libraries (and proto-libraries in core's includes/libs/
directory) intended for use independent of MediaWiki can't use a
MediaWiki-specific test case base class.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Book referencing proposal

2019-09-30 Thread Brad Jorsch (Anomie)
On Mon, Sep 30, 2019 at 11:56 AM Ryan Kaldari 
wrote:

> While we're at it, can we add a separate  tag,
> so that we can add footnotes with their own references (rather than having
> to resort to templates)? See https://phabricator.wikimedia.org/T7265
> (originally
> requested in 2006!).
>

The bug you linked there was correctly marked as a duplicate of T8271. You
seem to have gotten it confused with T3310
<https://phabricator.wikimedia.org/T3310>, which is a separate issue with
the parser in general.

Also, there's no actual need for templates for the current workaround; you
can use {{#tag:ref}} directly to nest them.
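For example, something like {{#tag:ref|Footnote text.<ref>Nested
citation.</ref>|group=note}} nests a reference inside a footnote (the group
parameter here is just an illustration).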

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] TechCom Radar 2019-09-18

2019-09-20 Thread Brad Jorsch (Anomie)
On Fri, Sep 20, 2019 at 7:21 AM Daniel Kinzler 
wrote:

> The purpose of the Frontend Architecture Working Group is to propose an
> architecture
> for a more modern front-end to MediaWiki, enabling a richer user
> experience.
> Efforts to modernize the user interface have often struggled with
> limitations
> imposed by MediaWiki core and the overall system architecture.


When I read things like this, I worry that "limitations" that people want
to get rid of include things like "basic functionality if the browser lacks
JS/CSS (or has them disabled)" and "basic functionality on a fairly generic
LAMP webhost, without running a bunch of bespoke services (even via
containers)."

Note that neither of those limitations precludes requiring more advanced
technology for "a richer user experience", but IMO we should carefully
consider the tradeoff each time we lock some functionality behind a
"richer" wall.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Help merge patch (Interwiki)

2019-07-22 Thread Brad Jorsch (Anomie)
On Sat, Jul 20, 2019 at 1:05 PM Martin Urbanec 
wrote:

> my method is to add a few of the people who +2'ed recent patches on the
> extension/component I was working on and, if that doesn't help, ping them on
> IRC to either merge or advise who I should ask.


I usually do that too when poking at an extension that doesn't have clear
maintainers (and when people I usually work with don't seem appropriate
reviewers). Plus I also tend to add the people who wrote the patches if
they seem like they've been around for a while.

But in both cases I do that with the caveat that I try to ignore "global
cleanup" commits. We have people who do valuable work replacing deprecated
code, fixing minor code-style sniff errors, and so on, but who don't
actually maintain all the code they touch in the process, and I don't want
to bombard them with spurious review requests.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Dealing with composer dependencies in early MediaWiki initialization

2019-07-19 Thread Brad Jorsch (Anomie)
On Fri, Jul 19, 2019 at 1:09 AM Kunal Mehta  wrote:

> So in the patch I added an optional --server parameter to the CLI
> installer, with it defaulting to <http://localhost> if none is
> provided. Does that seem acceptable enough? I'm not sure what other
> behavior would be sensible.
>

The other options I could think of would be to make --server a required
parameter to the CLI installer, or to let the CLI installer generate a
LocalSettings.php that does not result in a usable wiki (since it will give
the error that $wgServer needs to be set in LocalSettings.php).

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Issues with Cross-Origin Resource Sharing (CORS) JQuery Request

2019-05-06 Thread Brad Jorsch (Anomie)
On Mon, May 6, 2019 at 7:14 AM Egbe Eugene  wrote:

> After looking at [1]Manual:CORS and trying to perform a request with JQuery
> from an external application, I still get the error message saying "Request
> from origin has been blocked by CORS policy: No
> 'Access-Control-Allow-Origin' header is present on the requested resource.
>
> This is from a simple GET request to get imageinfo from Commons.
>

Without seeing the actual code you tried, I can only guess.

If you set the `origin` parameter to match the Origin header a browser
sends from your external site, and your external site is not listed in
$wgCrossSiteAJAXdomains,[1][2] the attempt to use CORS will be rejected. If
you inspect the response received, you should see a header
"MediaWiki-CORS-Rejection: Origin mismatch".

If you didn't set the `origin` parameter to so match, but just copied the
example at Manual:CORS, you should have received an HTTP 403 with a message
"'origin' parameter does not match Origin header".

If you set the `origin` parameter to "*" (that's the single character
U+002A) and set withCredentials = false in jQuery's xhrFields, it should
work from any remote site. But since cookies are neither being sent nor
used, the response will be served to you as an IP user. The code for that
could look something like this:

$.ajax( {
    url: 'https://en.wikipedia.org/w/api.php',
    data: {
        action: 'query',
        meta: 'userinfo',
        format: 'json',
        origin: '*'
    },
    xhrFields: {
        withCredentials: false
    },
    dataType: 'json'
} ).done( function ( data ) {
    console.log( 'Foreign user ' + data.query.userinfo.name +
        ' (ID ' + data.query.userinfo.id + ')' );
} );

It looks like https://www.mediawiki.org/wiki/Manual:CORS could use updating
to include the origin=* option, and perhaps to make it clearer that
logged-in accesses only work from whitelisted sites.

[1]: Docs: https://www.mediawiki.org/wiki/Manual:$wgCrossSiteAJAXdomains
[2]: Config:
https://gerrit.wikimedia.org/r/plugins/gitiles/operations/mediawiki-config/+/6cdae859db1611ffba7f6507faf8c54c6d38d217/wmf-config/CommonSettings.php#631

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Fetching edit notices via API

2019-04-29 Thread Brad Jorsch (Anomie)
On Sun, Apr 28, 2019 at 10:03 PM Bryan Davis  wrote:

> On Sun, Apr 28, 2019 at 5:02 PM Huji Lee  wrote:
> >
> > Hi all,
> > Is there an API call that would allow you to retrieve the edit notice
> for a
> > page?
>
> I did not find a specific Action API that does this, but the edit
> notices themselves are found at predictable URLs per
> <https://www.mediawiki.org/wiki/Manual:Interface/Edit_notice>:
> * MediaWiki:Editnotice-<N> (where N is the namespace id)
> * MediaWiki:Editnotice-<N>-<title> (where any '/' in the title is
> replaced with '-')
>
> You could also find all editnotice messages for a wiki using a prefix
> search generator:
>

This is your best bet. Note that, in namespaces with subpages, a page at
"Project:Foo/Bar/Baz" would display any editnotices at
MediaWiki:Editnotice-4-Foo and MediaWiki:Editnotice-4-Foo-Bar in addition
to MediaWiki:Editnotice-4-Foo-Bar-Baz.
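
One way to do the prefix search Bryan mentioned, as a rough Python sketch
(the endpoint is a placeholder; namespace 8 is the MediaWiki namespace):

```
import requests

resp = requests.get("https://en.wikipedia.org/w/api.php", params={
    "action": "query",
    "list": "allpages",
    "apnamespace": 8,           # MediaWiki namespace
    "apprefix": "Editnotice-",
    "aplimit": "max",
    "format": "json",
    "formatversion": "2",
}).json()
for page in resp["query"]["allpages"]:
    print(page["title"])
```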

There's also the 'TitleGetEditNotices' hook, which extensions can use to
add arbitrary other messages as edit notices. For example, GlobalUserPage
uses this to add a notice when editing the page used as a global user page
for a user, and TitleBlacklist uses it to add a notice when the tboverride
right is allowing the user to create/edit a blacklisted page.

There's also https://phabricator.wikimedia.org/T45683 about this question.


BTW, some incidental history: The per-page edit notices were removed in
2009 (r48276 <https://www.mediawiki.org/wiki/Special:Code/MediaWiki/48276>),
but for some unspecified reason only for namespaces without subpages, and
those were added back in 2011 (r97686
<https://www.mediawiki.org/wiki/Special:Code/MediaWiki/97686>) because the
inconsistency was seen as a bug rather than an incomplete removal. That
removal is why enwiki doesn't use the MediaWiki:Editnotice-<N>-<title>
style notices, instead having every MediaWiki:Editnotice-<N> invoke a
template that transcludes subpages of Template:Editnotices/
<https://en.wikipedia.org/wiki/Special:PrefixIndex/Template:Editnotices/>
(plus that system has evolved to become a bit more flexible).

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] mwoauth.errors.OAuthException in Flask

2019-04-09 Thread Brad Jorsch (Anomie)
On Tue, Apr 9, 2019 at 10:17 AM David Barratt 
wrote:

> Unfortunately (afaik) there is no way to test the workflow without a
> getting a real consumer.
>

That shouldn't be too much of a problem, since the owner can use the
consumer for testing while it's still in the "proposed" state.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] mwoauth.errors.OAuthException in Flask

2019-04-09 Thread Brad Jorsch (Anomie)
On Tue, Apr 9, 2019 at 6:00 AM Egbe Eugene  wrote:

> *MediaWiki response lacks token information: {b'Consumer is owner-only,
> <a class': [b'"external"
> href="https://www.mediawiki.org/wiki/Help:OAuth/Errors#E010">E010</a>']}*
>
> I am not sure why i got no response token.


You got no response token because you got an error response instead. As
stated at the link in the error message, owner-only consumers are
pre-authorized so the client should be configured with the token key and
secret directly instead of using the authorization endpoints.

I don't know how to do that using Flask; if you need help with that you'd
probably do better to ask in a forum specific to Flask.
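
Outside of Flask, a minimal Python sketch with requests-oauthlib could look
like this (the four credential strings are placeholders for the values
shown when the owner-only consumer was registered):

```
import requests
from requests_oauthlib import OAuth1

# Owner-only consumers are pre-authorized, so the access token and secret
# are configured directly; no authorization dance is needed.
auth = OAuth1(
    "consumer_key",
    client_secret="consumer_secret",
    resource_owner_key="access_token",
    resource_owner_secret="access_secret",
)
resp = requests.get(
    "https://meta.wikimedia.org/w/api.php",
    params={"action": "query", "meta": "userinfo", "format": "json"},
    auth=auth,
)
print(resp.json()["query"]["userinfo"])
```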

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

[Wikitech-l] User/actor storage changes

2019-04-03 Thread Brad Jorsch (Anomie)
As you may have been aware, we've been working on changing how MediaWiki
stores "actors", meaning the user account or IP address that performed the
edit, logged action, and so on. Instead of having user ID and name fields
in each revision (rev_user+rev_user_text), log entry
(log_user+log_user_text), and so on, we're storing the ID and name in a
central "actor" table and referring to them by the actor ID from other
tables (log_actor and so on).

We've been writing to the new fields and tables since mid-December 2018,
and have back-populated them for old revisions, log entries, and so on.
We're about to start changing Wikimedia's production wikis to start reading
the new fields instead of the old.

For the most part wiki users shouldn't notice any changes, however if you
notice something being newly slow or incorrectly displaying the user,
please let me know.

For users of the Data Services replicas, such as Toolforge, the views do
still include the old columns and they will be simulated even after
MediaWiki stops writing them. But, for the non-compat views, this *will*
change in the future as it recently did for the comment columns, so you may
want to begin your migration process soon rather than waiting.

MediaWiki developers should make sure code accessing user fields makes use
of the ActorMigration class that was introduced in MediaWiki 1.31.

You can watch https://phabricator.wikimedia.org/T188327 (and any subtasks)
for more information on the deployment process.

Note that accesses to the actor table may be slow, as are accesses to the
comment table. Improving that situation is being tracked at
https://phabricator.wikimedia.org/T215445.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Unbreak now! problem in this week train Watchlist

2019-03-14 Thread Brad Jorsch (Anomie)
‪On Thu, Mar 14, 2019 at 1:29 PM ‫יגאל חיטרון‬‎ 
wrote:‬

> From time to time the API post query "mark this revision as read" does not
> work.


Nothing in the reproduction steps you list does such a post query.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] ManualLogEntries::publish do not store tags when published to `udp`

2019-03-12 Thread Brad Jorsch (Anomie)
On Mon, Mar 11, 2019 at 9:48 AM Piotr Miazga  wrote:

> I noticed that ManualLogEntry items could have tags only when those
> entries are published to `rc` or `rcandudp`.


Hmm. Yes, it looks like the tags aren't being added in the `udp` case.
Looks like it was broken in
https://gerrit.wikimedia.org/r/c/mediawiki/core/+/312743. Filed T218110
<https://phabricator.wikimedia.org/T218110> for that.


> Then the extensions can attach tags via RecentChange_save hook and
> everything works perfectly.


I note this is not really related to the fact that tags set on the
ManualLogEntry aren't stored when entries are published to `udp`, as that
hook doesn't directly deal with log entries at all.

To support tagging published log entries directly rather than via the
associated RecentChange entry, you'd probably want to add a
"ManualLogEntryPublish" hook or something like that.


> Additionally, I'd like to introduce a Taggable interface[6], that provides
> a one way to tag objects (right now RecentChange exposes addTags() method
> but the ManualLogEntry exposes setTags() method).
>

"Taggable" seems like it may be too generic.
"MediaWiki\ChangeTags\Taggable" could work.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

[Wikitech-l] MediaWiki PHP coding conventions for declaring return types

2019-02-13 Thread Brad Jorsch (Anomie)
I've started a discussion at
https://www.mediawiki.org/wiki/Manual_talk:Coding_conventions/PHP#Declaring_of_return_types.
Please comment there if interested.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Wikimedia production excellence (January 2019)

2019-02-13 Thread Brad Jorsch (Anomie)
On Tue, Feb 12, 2019 at 10:54 PM Krinkle  wrote:

> Brad fixed the code a few hours later, and it was deployed by Roan later
> that same day.
> Thanks! —  https://phabricator.wikimedia.org/T213168
>

Correction: It was Gergő Tisza who submitted the patch to fix the code for
this one, not me.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] The mw.ext construct in lua modules

2019-02-05 Thread Brad Jorsch (Anomie)
On Mon, Feb 4, 2019 at 5:26 PM Eran Rosenthal  wrote:

> > What is the problem with the ".ext" part?
> 1. It adds unnecessary complexity both in the extension (need to init
> mw.ext if it doesn't exist)


It's one line in the boilerplate. That's not much complexity.


> and more important - in its usage when the Lua
> extension is invoked (longer names)
>

It's 4 characters. Also not much to be concerned about. You're also free to
do something like

local foo = mw.ext.foo;

if you want shorter access within your code.


>(there is very small risk of name collision -  mw.ModuleA and mw.ModuleB
> are unlikely to clash as different extensions, and mw.ModuleA and mw.FUNC
> are unlikely to clash because function names
> <
> https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual#Base_functions
> >
> are usually verbs and extensions
> <https://www.mediawiki.org/wiki/Category:All_extensions> are usually
> nouns)
>

Scribunto has its own built-in packages too, which are also usually nouns.
What if, for example, Extension:Math
<https://www.mediawiki.org/wiki/Extension:Math> added a Scribunto module at
"mw.math" and then we also wanted to add a Scribunto-specific version of Lua's
math library
<https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual#Math_library>?
Or Extension:CSS <https://www.mediawiki.org/wiki/Extension:CSS> and a
Scribunto counterpart to mw.html
<https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual#HTML_library>?
Or if Extension:UserFunctions
<https://www.mediawiki.org/wiki/Extension:UserFunctions> did its thing at
"mw.user" and then we got around to resolving T85419
<https://phabricator.wikimedia.org/T85419>?

Having mw.ext also makes it easier to identify extensions' additions,
avoiding confusion over whether "mw.foo" is part of Scribunto or comes from
another extension. And it means you can look in mw.ext to see which
extensions' additions are available rather than having to filter them out
of mw.

BTW, we have "mw" in the first place to similarly bundle Scribunto's
additions away from things that come with standard Lua. If someday standard
Lua includes its own "ustring" or something else Scribunto adds a module
for (and we upgrade from Lua 5.1), we won't need to worry about name
collision there either.


> 2. Practically, the convention (based on most of the Lua code - e.g.
> Wikibase) is to not use mw.ext
>

Of extensions in Gerrit (as of a few days ago when I last checked),
Wikibase and LinkedWiki seem to be the only two extensions not using
mw.ext, while Cargo, DataTable2, DisplayTitle, DynamicPageListEngine,
FlaggedRevs, JsonConfig, ParserFunctions, and TitleBlacklist all do.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Help: setting a property on page save?

2018-12-05 Thread Brad Jorsch (Anomie)
On Tue, Dec 4, 2018 at 6:52 PM FreedomFighterSparrow <
freedomfighterspar...@gmail.com> wrote:

> - Is the editor a human*?
>

That's not as straightforward a question as you might think. How will you
tell if the editor is a human rather than an unflagged bot? Or a human
blindly clicking "Save" on edits proposed by a program such as AWB?

You might also want to consider what happens if someone vandalizes a page
and then is reverted. Does either the vandalism or the revert really count
as an update?


> I really don't want to have to create my own table just for this.


The page_props table already exists for storing properties of pages. It
generally gets its values from the ParserOutput object for the page's
latest revision, although you could use the 'LinksUpdate' hook to influence
the actual database updates instead.

If you want to tag every revision, you could use a change tag. See
https://www.mediawiki.org/wiki/Manual:Tags for an overview.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Coming soon: Find your way back from multi-referenced footnotes

2018-11-27 Thread Brad Jorsch (Anomie)
I went to https://en.wikipedia.beta.wmflabs.org/wiki/Barack_Obama to try
this out, but unfortunately it does not seem to be active there for testing.

On Tue, Nov 27, 2018 at 5:13 AM Johanna Strodt 
wrote:

> // sorry for cross-posting
>
> Hello everyone! I’d like to draw your attention to a software change coming
> to wikis this week:
>
> When you click on a footnote which is referenced multiple times in an
> article, it can be hard to find your way back to your reading position in
> the text.
>
> This will soon become easier: If you have jumped to a multi-referenced
> footnote and want to go back to your previous reading position,
>
> 1) you can now click on the *jump mark* at the beginning of the footnote
> (in most wikis it's an arrow, in some, like enwiki, it's a caret ^). The
> tooltip says "Jump back up".
>
> 2) or you can click on the *superscript jump mark* in the footnote. The one
> leading you back to your original position is now highlighted *bold*.
>
> This second part of the change doesn’t work for wikis where these
> superscript jump marks are bold by default, e.g. enwiki. If those wikis
> want this highlighting as well, they would need to change their default
> style for these superscript jump marks to regular.
>
> Deployment of this change is scheduled for this week’s train [1].
> Originating from the German community’s Technical Wishlist, it was made by
> Wikimedia Germany’s Technical Wishes team [2].
>
> Feedback is always appreciated. The best place for it is the project talk
> page [3]. More information is available on the project page [4] and on
> Phabricator [5].
>
> Best,
>
> Johanna for the Technical Wishes team



-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] If users can't edit an article, don't encourage them to edit its Wikidata item

2018-10-30 Thread Brad Jorsch (Anomie)
On Mon, Oct 29, 2018 at 6:35 PM Amir E. Aharoni <
amir.ahar...@mail.huji.ac.il> wrote:

> Of course, it's more difficult to extend cascading protection to Wikidata
> because Wikidata is a different wiki (even Commons images are not included
> in cascading protection last time I checked). Nevertheless, it should be a
> goal.
>

I'm not so sure of that. Some admin on a tiny, little-watched wiki could
then cascade-protect arbitrary Commons images and Wikidata items by
overusing cascade protection, despite not having adminship on Commons or
Wikidata.


> And maybe—just maybe—getting blocked on one wiki could make one
> automatically blocked on wikis that are common repositories, such as
> Commons and Wikidata (and perhaps Meta), although this should be
> reversible.
>

Same problem.

We'd likely wind up having to have Stewards start policing the use of
cascade protection and blocking on all wikis to adjudicate whether one
wiki's use of cascade protection or blocking was really trying to disrupt
Commons/Wikidata/Meta.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Comment/summary storage changes

2018-10-16 Thread Brad Jorsch (Anomie)
This has just been deployed to group 0 wikis. Again, if you notice
something not displaying a comment or edit summary for new entries on these
wikis let me know.

The remaining wikis will likely follow later this week or next, assuming no
bugs are reported.

Thanks.

On Mon, Sep 17, 2018 at 11:07 AM Brad Jorsch (Anomie) 
wrote:

> As you may have been aware, we've been working on changing how MediaWiki
> stores comments: instead of having them as fields in each revision
> (rev_comment), log entry (log_comment), and so on, we're storing the text
> in a central "comment" table and referring to them by ID from other tables
> (log_comment_id and so on).
>
> We've been writing to the new fields and tables since the end of February
> 2018, and have back-populated them for old revisions, log entries, and so
> on. Now we're starting to look at stopping the writes to the old fields,
> starting soon with testing wikis such as test.wikipedia.org and also with
> mediawiki.org. Other wikis will follow, likely in October after the DC
> switch-back.
>
> For the most part wiki users shouldn't notice any changes, however if you
> notice something not displaying a comment or edit summary for new entries,
> on these wikis let me know.
>
> For users of the Data Services replicas, such as Toolforge, the views
> currently simulate the old columns for you. However this may change in the
> future. See https://phabricator.wikimedia.org/T181650#4581384 for details.
>
> MediaWiki developers should make sure code accessing comment fields makes
> use of the CommentStore class that was introduced in MediaWiki 1.30.
>
> You can watch https://phabricator.wikimedia.org/T166733 (and any
> subtasks) for more information on the deployment process.
>

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] [Wikitech-ambassadors] Translations on hold until further notice

2018-09-27 Thread Brad Jorsch (Anomie)
On Thu, Sep 27, 2018 at 11:39 AM C. Scott Ananian 
wrote:

> Wouldn't "delete all local overrides" (or "all local overrides added since
> 2018-09-27") be a reasonable step to include in migration after the "new
> translatewiki" is turned back on?
>

"Delete all local overrides" would be horrible, you'd lose all local wiki
customizations. See
https://en.wikipedia.org/w/index.php?title=Special%3AAllMessages&prefix=&filter=modified&lang=en&limit=5000
for enwiki's customizations for example (although English might not be
included in the purge), or
https://de.wikipedia.org/w/index.php?title=Spezial:MediaWiki-Systemnachrichten&prefix=&filter=modified&lang=de&limit=5000
for dewiki's.

"Added since 2018-09-27" would only be marginally better. Old overrides
wouldn't be lost, but you'd still wipe out new overrides that are intended
to actually be overrides rather than workarounds for this issue.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] [Wikitech-ambassadors] Translations on hold until further notice

2018-09-27 Thread Brad Jorsch (Anomie)
On Thu, Sep 27, 2018 at 9:37 AM C. Scott Ananian 
wrote:

> Couldn't you also make sure to edit things in both places?  That is, make
> the edit on translatewiki, then manually copy it over to your local wiki,
> and expect that it will be overwritten when updates from translatewiki are
> turned back on?
>

The "expect that it will be overwritten" part isn't how it works. The local
copy will remain in effect until someone deletes it.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Comment/summary storage changes

2018-09-17 Thread Brad Jorsch (Anomie)
On Mon, Sep 17, 2018 at 11:07 AM, Brad Jorsch (Anomie) <
bjor...@wikimedia.org> wrote:

> We've been writing to the new fields and tables since the end of February
> 2018, and have back-populated them for old revisions, log entries, and so
> on.
>

Correction, the back-population is yet to be done. First we have to stop
writing the old fields.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

[Wikitech-l] Comment/summary storage changes

2018-09-17 Thread Brad Jorsch (Anomie)
As you may have been aware, we've been working on changing how MediaWiki
stores comments: instead of having them as fields in each revision
(rev_comment), log entry (log_comment), and so on, we're storing the text
in a central "comment" table and referring to them by ID from other tables
(log_comment_id and so on).

We've been writing to the new fields and tables since the end of February
2018, and have back-populated them for old revisions, log entries, and so
on. Now we're starting to look at stopping the writes to the old fields,
starting soon with testing wikis such as test.wikipedia.org and also with
mediawiki.org. Other wikis will follow, likely in October after the DC
switch-back.

For the most part wiki users shouldn't notice any changes, however if you
notice something not displaying a comment or edit summary for new entries,
on these wikis let me know.

For users of the Data Services replicas, such as Toolforge, the views
currently simulate the old columns for you. However this may change in the
future. See https://phabricator.wikimedia.org/T181650#4581384 for details.

MediaWiki developers should make sure code accessing comment fields makes
use of the CommentStore class that was introduced in MediaWiki 1.30.

You can watch https://phabricator.wikimedia.org/T166733 (and any subtasks)
for more information on the deployment process.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] [Engineering] Additional European-appropriate MediaWiki train window starting week of July 9th (also SWAT change)

2018-07-11 Thread Brad Jorsch (Anomie)
On Wed, Jul 11, 2018 at 12:19 PM, Kunal Mehta 
wrote:

> (un)relatedly:
> * EU survey to remove summertime/DST:
> <https://ec.europa.eu/eusurvey/runner/2018-summertime-arrangements?surveylanguage=EN>
> * California Proposition 7 (2018) to institute a permanent DST:
> <https://ballotpedia.org/California_Proposition_7,_Permanent_Daylight_Saving_Time_Measure_(2018)>
>
> Hopefully we can get rid of this problem at the root cause as well :)
>

You forgot about most of the rest of the US ;)


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Help needed on fixing discrepancies in db schemas

2018-07-09 Thread Brad Jorsch (Anomie)
On Mon, Jul 9, 2018 at 1:10 PM, Amir Ladsgroup  wrote:

> Some are changes that were partially implemented, like the
> page_no_title_convert field on the page table, which only appears in 32
> hosts (around two thirds of the hosts). There is a mention of adding it in
> HISTORY, but I can't find any mention of removing it, nor of its
> existence, so I have no idea how to proceed here.
>

Looks like it was added in r16524
<http://mediawiki.org/wiki/Special:Code/MediaWiki/16524> and reverted
(except for the release note entry) in r16526
<http://mediawiki.org/wiki/Special:Code/MediaWiki/16526>.


> Or the text table on 21 hosts (around half of them) has an extra field
> called inverse_timestamp, which I can't find any mention of in the code,
> but there used to be a field with this name in the revision table that got
> dropped in 2005, and I have no idea how to proceed here.
>

Revisions used to be stored much like how images still are: there was the
"cur" table that had data about the current revision, including its actual
content, and the "old" table for previous revisions.

When all this was redone to have "page" and "revision" and "text" (r6710
<http://mediawiki.org/wiki/Special:Code/MediaWiki/6710>), the "old" table
was just renamed to "text" since most of the text was already in that
table. Which is why all the fields in text use "old_" as a prefix. It seems
that the extraneous columns weren't dropped on all wikis.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] RFC discussion: the future of rev_parent_id

2018-06-04 Thread Brad Jorsch (Anomie)
On Mon, Jun 4, 2018 at 3:22 PM, Brian Wolff  wrote:

> I dont view the parent as what revision came first but what revision was
> edited to make this revision (i.e. where the current revision was "forked"
> off from)
>

Although that implies that an automatically-resolved edit conflict should
have a "parent" as the revision that was edited rather than the current
revision, again different from the current behavior (and both of the
options Daniel discusses in the task).

It similarly implies that an edit starting with an old revision should
record that old revision as the "parent" instead of the current revision,
again different from the current behavior. And maybe the same for rollbacks
and at least some undos.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Introducing Quibble, a test runner for MediaWiki

2018-04-30 Thread Brad Jorsch (Anomie)
On Mon, Apr 30, 2018 at 10:57 AM, Jaime Crespo <jcre...@wikimedia.org>
wrote:

> > MediaWiki currently doesn't even try to support UTF-8
>
> I thought the installer gave the option to choose between binary and utf8
> (3-bytes)?


Hmm. Yes, it looks like it does. But if all fields are varbinary, does it
matter? Maybe it should be removed from the installer.

There's also a $wgDBmysql5 configuration setting, which controls whether
MediaWiki does "SET NAMES 'utf8'" or "SET NAMES 'binary'". I don't know
what difference this makes, maybe none since all the columns are varbinary.


> innodb_large_prefix cannot be enabled anymore because it is enabled
> (hardcoded) automatically on MySQL 8.0.
>

That's good, once we raise the supported version that far. Currently it
looks like we still support 5.5.8, which at least has the setting to enable.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Introducing Quibble, a test runner for MediaWiki

2018-04-30 Thread Brad Jorsch (Anomie)
On Mon, Apr 30, 2018 at 9:05 AM, Jaime Crespo <jcre...@wikimedia.org> wrote:

> * Support "real" (4-byte) UTF-8: utf8mb4 in MySQL/MariaDB (default in the
> latest versions) and start deprecating "fake"  (3-byte) UTF-8: utf8
>

MediaWiki currently doesn't even try to support UTF-8 in MySQL. The core
MySQL schema specifically uses "varbinary" and "blob" types for almost
everything.

Ideally we'd change that, but see below.


> * Check code works as intended in "strict" mode (default in the latest
> versions), at least regarding testing
>

While it's not actually part of "strict mode" (I think), I note that
MariaDB 10.1.32 (tested on db1114) with ONLY_FULL_GROUP_BY still seems to
have the issues described in
https://phabricator.wikimedia.org/T108255#2415773.


> Anomie- I think you were thinking on (maybe?) abstracting schema for
> mediawiki- fixing the duality of binary (defining sizes in bytes) vs. UTF-8
> (defining sizes in characters) would be an interesting problem to solve-
> the duality is ok, what I mean is being able to store radically different
> size of contents based on that setting.
>

That would be an interesting problem to solve, but doing so may be
difficult. We have a number of fields that are currently defined as
varbinary(255) and are fully indexed (i.e. not using a prefix).

   - Just changing them to varchar(255) using utf8mb4 makes the index
   exceed MySQL's column length limit.
   - Changing them to varchar(191) to keep within the length limit breaks
   content in primarily-ASCII languages that is taking advantage of the
   existing 255-byte limit to store more than 191 codepoints.
   - Using a prefixed index makes ORDER BY on the column filesort.
   - Or the column length limit can be raised if your installation jumps
   through some hoops, which seem to be the default in 5.7.7 but not before:
   innodb_large_prefix
   <https://dev.mysql.com/doc/refman/5.7/en/innodb-parameters.html#sysvar_innodb_large_prefix>
   set to ON, innodb_file_format
   <https://dev.mysql.com/doc/refman/5.7/en/innodb-parameters.html#sysvar_innodb_file_format>
   set to "Barracuda", innodb_file_per_table
   <https://dev.mysql.com/doc/refman/5.7/en/innodb-parameters.html#sysvar_innodb_file_per_table>
   set to ON, and tables created with ROW_FORMAT=DYNAMIC or COMPRESSED. I
   don't know what MariaDB might have as defaults or requirements in which
   versions.

The ideal, I suppose, would be to require those hoops be jumped through in
order for utf8mb4 mode to be enabled. Then a lot of code in MediaWiki would
have to vary based on that mode flag to enforce limits on bytes versus
codepoints.

BTW, for anyone reading this who's interested, the task for that schema
abstraction idea is https://phabricator.wikimedia.org/T191231.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Introducing Quibble, a test runner for MediaWiki

2018-04-28 Thread Brad Jorsch (Anomie)
On Fri, Apr 27, 2018 at 5:58 PM, Antoine Musso <hashar+...@free.fr> wrote:

> [T193222] MariaDB on Stretch uses the utf8mb4 character set. Attempting
> to create a key on VARCHAR(192) or larger would cause:
>  Error: 1071 Specified key was too long; max key length is 767 bytes
>
> Reducing the key length is the obvious solution and some fields could
> use to be converted to ENUM.
>

Personally, I'd rather we didn't use more enums. They work inconsistently
for comparisons and ordering, and they require a schema change any time a
new value is needed. It'd probably be better to use NameTableStore instead.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] [Ops] Changes to SWAT deployment policies, effective Monday April 30th

2018-04-27 Thread Brad Jorsch (Anomie)
On Thu, Apr 26, 2018 at 6:14 PM, Greg Grossmeier <g...@wikimedia.org> wrote:

> First, we now disallow multi-sync patch deployments. See T187761[0].
> This means that the sync order of files is determined by git commit
> parent relationships (or Gerrit's "depends-on"). This is to prevent SWAT
> deployers from accidentally syncing two patches in the wrong order.
>

Is full scap now fast enough that sync-file is no longer necessary?
Discussion on that task seems to say "no".

I'd hate to see people mangling[1] the git log by submitting and merging
patch chains for updating individual files of a single logical change just
to satisfy this SWAT requirement. I'd hate even more if people do that in
mediawiki/core master (versus splitting an existing patch while
backporting), to the point where I'd recommend -2ing such patch-chains
there.

[1]: "mangling" both in that it would introduce unnecessary changes and in
that looking at a single change doesn't let you see the changes to other
files that should logically be grouped with it.


P.S. So if someone has a change that needs to touch 2 files in both
branches, they'd use up the whole smaller SWAT window for that one change
because they'd need 2 patches (1 per file) that both count double (because
two branches)?


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] "PHP test coverage decreased" - really?

2018-04-26 Thread Brad Jorsch (Anomie)
On Thu, Apr 26, 2018 at 3:55 PM, Kunal Mehta <lego...@member.fsf.org> wrote:

> | includes/Storage/RevisionSlots.php|  58.33 |  49.35 |
>

Most of the issue there is that hasSameContent() is not being recorded as
"covered" at all because you forgot the @covers annotation on
testHasSameContent().

I also see an added uncovered line in setSlotsInternal(). Probably that too
is actually being covered but nothing has a @covers annotation for the
method.


> | includes/Storage/SlotRecord.php   |  94.64 |  94.51 |
>

Coverage of all the existing code is unchanged. The patch adds 13 lines in
hasSameContentAs(), but only covers 12 of them with tests. Line 594 was
missed, which explains the slight decrease.

In this case no method-level @covers is needed because SlotRecordTest has
"@covers \MediaWiki\Storage\SlotRecord" at the class level.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Mobile MonoBook

2018-04-04 Thread Brad Jorsch (Anomie)
On Wed, Apr 4, 2018 at 3:18 PM, Strainu <strain...@gmail.com> wrote:

> Skipping navboxes is a decision that was taken by the WMF team
> responsible for the mobile site (whatever it was called at the time)
> and can be solved cleanly only at the skin level, but I don't expect
> this to happen as long as it will break the mobile site.
>
> Your proposal would be the ideal argument for reversing the current
> "solution", yes, but realistically, it's not going to happen
> throughout the hundreds of wikis that implemented navboxes.
>

Here we're talking about Isarra's very interesting project to make the
Monobook skin responsive, not whatever decisions WMF's mobile teams may
have made in the past for their mobile-only skin. She is not obligated to
follow their lead just because they led, and I'd recommend she doesn't in
this case.

Note that's my personal recommendation and not any sort of official WMF
position. I likely won't even be involved in the code review for her patch.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mobile MonoBook

2018-04-04 Thread Brad Jorsch (Anomie)
On Tue, Apr 3, 2018 at 6:52 PM, Strainu <strain...@gmail.com> wrote:

> but what I would particularly like to see is how it
> handles navboxes. Traditionally, they have been hidden on the
> Wikipedia mobile site, prompting people to do all kinds of sick
> workarounds that kind of work, but not really. If anyone can come up
> with a decent solution to that it's probably you :)
>

The solution is probably for the on-wiki editors to make navboxes
responsive (e.g. using TemplateStyles[1]), rather than expecting the skin
to deal with it.

[1]: https://www.mediawiki.org/wiki/Extension:TemplateStyles


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

[Wikitech-l] Unblocking 79 IPs blocked as open proxies 12 years ago (T189840)

2018-03-21 Thread Brad Jorsch (Anomie)
In 2005–2006 a sysadmin blocked 79 IP addresses on all wikis as being
automatically-detected open proxies, without recording them in the block
log or attributing the block to any user account. These incomplete records
are now causing errors when MediaWiki tries to access them in various
places, see https://phabricator.wikimedia.org/T189840.

Since these are all over 12 years old, it seems reasonably likely that many
of these are no longer open proxies. Rather than trying to fix the
incomplete records, I'm just going to remove them.

Any existing blocks of these IPs that are not causing errors will not be
removed. At first glance this seems relevant mainly to enwiki, where only 5
of the IPs have incomplete records. 21 are currently blocked there with
complete records (19 since 2005 or earlier), and the other 53 are not
currently blocked there.

The list of IPs is at https://phabricator.wikimedia.org/P6876 in case
anyone wants to review them for potential reblocking.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] PHP7 expectations (zero-cost assertions)

2018-03-15 Thread Brad Jorsch (Anomie)
On Thu, Mar 15, 2018 at 10:39 AM, David Causse <dcau...@wikimedia.org>
wrote:

> The biggest take-away (for me) of the discussion is:
> Pros:
> - perf: zero-cost assertions
> Cons:
> - the benefits of zero-cost assertion is not worth the risk in a moving
> code-base like MW.
>

The biggest take-away for me from the discussions is that PHP5 assert()'s
reliance on eval()-like operation is bad in a number of ways, and the
behavior of issuing a warning and then exiting is weird.

PHP7's expectations seem like they started fixing those issues, although
eval()-like use is still an option and exception-throwing seems to not be
the default.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] PHP7 expectations (zero-cost assertions)

2018-03-15 Thread Brad Jorsch (Anomie)
On Thu, Mar 15, 2018 at 9:42 AM, David Causse <dcau...@wikimedia.org> wrote:

> Looking at the MW codebase we don't seem to use assert frequently (only 26
> files [2] ).
>

We generally use the wikimedia/assert library[3] instead. That's used a lot
more often.[4] The README.md for that library includes some reasoning for
using it over PHP's assert(), with links to past discussion.

[3]: https://packagist.org/packages/wikimedia/assert
[4]:
https://codesearch.wmflabs.org/search/?q=%5CtAssert%3A%3A&i=nope&files=php%24&repos=

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Recursive Common Table expressions @ Wikimedia [was Fwd: [Wikimedia-l] What's making you happy this week? (Week of 18 February 2018)]

2018-02-28 Thread Brad Jorsch (Anomie)
On Wed, Feb 28, 2018 at 8:47 AM, Jaime Crespo <jcre...@wikimedia.org> wrote:

> Very recently I have been experimenting with recursive Common Table
> Expressions [2], which are or will be available on the latest versions of
> MySQL and MariaDB.
>

Do the other databases MediaWiki tries to support have that feature?


> With a single query on can obtain all titles directly or indirectly in a
> category:
>
> WITH RECURSIVE cte (cl_from, cl_type) AS
> (
>   SELECT cl_from, cl_type FROM categorylinks
>   WHERE cl_to = 'Database_management_systems' -- starting category
>   UNION
>   SELECT categorylinks.cl_from, categorylinks.cl_type
>   FROM cte
>   JOIN page ON cl_from = page_id
>   JOIN categorylinks ON page_title = cl_to
>   WHERE cte.cl_type = 'subcat' -- subcat addition on each iteration
> )
> SELECT page_title FROM cte JOIN page ON cl_from = page_id
> WHERE page_namespace = 0
> ORDER BY page_title; -- printing only articles in the end, ordered by title
>

Does that work efficiently on huge categories, or does it wind up fetching
millions of rows and filesorting?

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

[Wikitech-l] Help test longer edit summaries (and other comments) on testwiki (for real this time)

2018-02-07 Thread Brad Jorsch (Anomie)
Yesterday I re-enabled the feature flag on the testing wikis[1][2][3] that
allows MediaWiki to store comments longer than 255 bytes.

The web UI has not been updated to allow longer comments in places where it
enforces a limit, such as the edit summary box. But if you use the API to
edit, or perform page moves or do other things where long comments could be
entered and were truncated, you should now find that they're truncated at
1000 Unicode characters (codepoints) rather than 255 bytes.
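
The distinction matters for multi-byte text; a quick Python illustration:

```
summary = "ü" * 300
print(len(summary))                  # 300 codepoints: within the new limit
print(len(summary.encode("utf-8")))  # 600 bytes: over the old 255-byte limit
```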

Please test it out! If you find errors, or places in core features (not
comments in extensions such as SecurePoll, AbuseFilter, CheckUser, or Flow)
where *new* comments are still being truncated to 255 bytes, or places
where comments aren't showing up at all, please let me know. You can reply
to this message or post a task in Phabricator and add me as a subscriber.

If things go well, we'll look at rolling this out to more wikis soon. See
https://phabricator.wikimedia.org/T174569 to follow progress there.

If anyone is interested in submitting patches for the web UI to reflect the
changed length limits, see https://phabricator.wikimedia.org/T185948.

[1]: https://test.wikipedia.org/wiki/Main_Page
[2]: https://test2.wikipedia.org/wiki/Main_Page
[3]: https://test.wikidata.org/wiki/Wikidata:Main_Page

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] (Large) Discrepancies between API-Statistics and Stepwise API-based extraction [Update]

2018-02-02 Thread Brad Jorsch (Anomie)
On Fri, Feb 2, 2018 at 4:21 AM, Rüdiger Gleim <ruediger.gl...@gmx.de> wrote:

> => Is my assumption correct that the sum of all Revisions queried via
> api.php?action=query&prop=revisions should match the statistics
> number of revisions?
>

No, the "edits" number also includes deleted revisions.

Another possibility is revisions that aren't attached to a valid page, if
some bug allowed that to happen.


> => What is the difference between continue=|| and rvcontinue=123456? Until
> now I have only been using rvcontinue. Including continue did not make a
> difference, but I could not find out what the meaning of continue=|| is.
>

It won't make a difference in modern MediaWiki. On outdated wikis running
MediaWiki 1.21 to 1.25, it will change the format in which the continuation
data is returned. In any case, when you're manually adding it you should
use an empty value rather than "||".

Note the values of all continuation parameters, whether old or new format,
should be considered as opaque tokens by clients (even though they usually
have obvious structure). The API may change the format of these
continuation tokens at any time without warning, and this will not be
considered a breaking change.

The old format for continuation returns data in a query-continue node to be
combined with the previous requests' parameters. When using generators or
multiple query submodules, the client has to do some non-obvious processing
of that continuation data to avoid missing data or looping. See
https://www.mediawiki.org/wiki/API:Raw_query_continue for details.

In 1.21, a new, easier to use format was introduced, enabled by passing an
empty "continue" parameter when making the initial query. In this mode the
API handles the tricky parts of generators and multiple query submodules
for you, all you have to do is combine everything under the returned
"continue" node with the original request's parameters. This was made the
default in MediaWiki 1.26. See
https://www.mediawiki.org/wiki/API:Query#Continuing_queries for details.
New clients should use this new format since it's much harder to handle it
incorrectly.

The "continue=||" is part of the new format's handling of the tricky bits
of generators and multiple query modules.
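
As a rough sketch of the new format (apiGet() here is a hypothetical helper
that performs the HTTP GET and decodes the JSON response):

```
$params = [
    'action' => 'query',
    'format' => 'json',
    'prop' => 'revisions',
    'titles' => 'Main Page',
    'rvlimit' => 'max',
    'continue' => '', // opts in to the new format on MediaWiki 1.21-1.25
];
do {
    $result = apiGet( $params );
    // ... process $result['query'] here ...
    if ( isset( $result['continue'] ) ) {
        // Merge the opaque continuation tokens back in, verbatim.
        $params = array_merge( $params, $result['continue'] );
    }
} while ( isset( $result['continue'] ) );
```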

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] (Large) Discrepancies between API-Statistics and Stepwise API-based extraction

2018-01-30 Thread Brad Jorsch (Anomie)
On Tue, Jan 30, 2018 at 5:07 AM, Rüdiger Gleim <ruediger.gl...@gmx.de>
wrote:

> => I would appreciate any information and ideas that can explain the
> differences.
>

If a bug at some point caused the site_stats table to not be updated for
some situation, that would result in such discrepancies.

Try running maintenance/initSiteStats.php with the --update option on your
wiki.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal regarding the handling of imported usernames

2017-12-06 Thread Brad Jorsch (Anomie)
On Thu, Nov 30, 2017 at 12:31 PM, Brad Jorsch (Anomie) <
bjor...@wikimedia.org> wrote:

> The proposal was approved by TechCom, the code has been merged, and it's
> live now on the Beta Cluster. I'm running the maintenance script now.
> Please test things there and report any bugs you encounter, either by
> replying to this message or by filing it in Phabricator and adding me as a
> subscriber. Assuming no major errors turn up that can't be quickly fixed,
> I'll probably start running the maintenance script on the production wikis
> the week of December 11 (and perhaps on mediawiki.org and testwiki the
> week before).
>

I've now run the script on mediawiki.org, testwiki, test2wiki, and
testwikidatawiki. Please let me know about any related errors.

Assuming no error reports, I'll run the script on the rest of the wikis
next week.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Help test longer edit summaries (and other comments) on testwiki

2017-12-04 Thread Brad Jorsch (Anomie)
Unfortunately, I had to revert it. We'll try again later when the schema
change is complete.

On Mon, Dec 4, 2017 at 2:38 PM, Brad Jorsch (Anomie) <bjor...@wikimedia.org>
wrote:

> I've just now enabled the feature flag on the testing wikis[1][2][3] that
> allows MediaWiki to store comments longer than 255 bytes.
>
> The web UI has not been updated to allow longer comments in places where
> it enforces a limit, such as the edit summary box. But if you use the API
> to edit, or perform page moves or do other things where long comments could
> be entered and were truncated, you should now find that they're truncated
> at 1000 Unicode characters rather than 255 bytes.
>
> Please test it out! If you find errors, or places in core features (not
> comments in extensions such as SecurePoll, AbuseFilter, CheckUser, or Flow)
> where *new* comments are still being truncated to 255 bytes, or places
> where comments aren't showing up at all, please let me know. You can reply
> to this message or post a task in Phabricator and add me as a subscriber.
>
> If things go well, we'll look at rolling this out to production wikis once
> the schema changes to the production databases are complete. See
> https://phabricator.wikimedia.org/T174569 to follow progress there.
>
> If anyone is interested in submitting patches for the web UI to reflect
> the changed length limits, please do. I'll try to review them if you add me
> as a reviewer.
>
> [1]: https://test.wikipedia.org/wiki/Main_Page
> [2]: https://test2.wikipedia.org/wiki/Main_Page
> [3]: https://test.wikidata.org/wiki/Wikidata:Main_Page
>
> --
> Brad Jorsch (Anomie)
> Senior Software Engineer
> Wikimedia Foundation
>



-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Help test longer edit summaries (and other comments) on testwiki

2017-12-04 Thread Brad Jorsch (Anomie)
I've just now enabled the feature flag on the testing wikis[1][2][3] that
allows MediaWiki to store comments longer than 255 bytes.

The web UI has not been updated to allow longer comments in places where it
enforces a limit, such as the edit summary box. But if you use the API to
edit, or perform page moves or do other things where long comments could be
entered and were truncated, you should now find that they're truncated at
1000 Unicode characters rather than 255 bytes.

Please test it out! If you find errors, or places in core features (not
comments in extensions such as SecurePoll, AbuseFilter, CheckUser, or Flow)
where *new* comments are still being truncated to 255 bytes, or places
where comments aren't showing up at all, please let me know. You can reply
to this message or post a task in Phabricator and add me as a subscriber.

If things go well, we'll look at rolling this out to production wikis once
the schema changes to the production databases are complete. See
https://phabricator.wikimedia.org/T174569 to follow progress there.

If anyone is interested in submitting patches for the web UI to reflect the
changed length limits, please do. I'll try to review them if you add me as
a reviewer.

[1]: https://test.wikipedia.org/wiki/Main_Page
[2]: https://test2.wikipedia.org/wiki/Main_Page
[3]: https://test.wikidata.org/wiki/Wikidata:Main_Page

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal regarding the handling of imported usernames

2017-11-30 Thread Brad Jorsch (Anomie)
The proposal was approved by TechCom, the code has been merged, and it's
live now on the Beta Cluster. I'm running the maintenance script now.
Please test things there and report any bugs you encounter, either by
replying to this message or by filing it in Phabricator and adding me as a
subscriber. Assuming no major errors turn up that can't be quickly fixed,
I'll probably start running the maintenance script on the production wikis
the week of December 11 (and perhaps on mediawiki.org and testwiki the week
before).

If you're curious as to what the history of an existing imported page might
look like after the maintenance script is run, see
https://commons.wikimedia.beta.wmflabs.org/wiki/Template:Documentation?action=history
for an example.

On Tue, Oct 31, 2017 at 10:52 AM, Brad Jorsch (Anomie) <
bjor...@wikimedia.org> wrote:

> Handling of usernames in imported edits in MediaWiki has long been weird
> (T9240[1] was filed in 2006!).
>
> If the local user doesn't exist, we get a strange row in the revision
> table where rev_user_text refers to a valid name while rev_user is 0 which
> typically indicates an IP edit. Someone can later create the name, but
> rev_user remains 0, so depending on which field a tool looks at the
> revision may or may not be considered to actually belong to the
> newly-created user.
>
> If the local user does exist when the import is done, the edit is
> attributed to that user regardless of whether it's actually the same user.
> See T179246[2] for an example where imported edits got attributed to the
> wrong account in pre-SUL times.
>
> In Gerrit change 386625[3] I propose to change that.
>
>- If revisions are imported using the "Upload XML data" method, it
>will be required to fill in a new field to indicate the source of the
>edits, which is intended to be interpreted as an interwiki prefix.
>- If revisions are imported using the "Import from another wiki"
>method, the specified source wiki will be used as the source.
>- During the import, any usernames that don't exist locally (and can't
>be auto-created via CentralAuth[4]) will be imported as an
>otherwise-invalid name, e.g. an edit by User:Example from source 'en' would
>be imported as "en>Example".[5]
>- There will be a checkbox on Special:Import to specify whether the
>same should be done for usernames that do exist locally (or can be created)
>or whether those edits should be attributed to the existing/autocreated
>local user.
>- On history pages, log pages, and the like, these usernames will be
>displayed as interwiki links, much as might be generated by wikitext like "
>[[:en:User:Example|en>Example]]". No parenthesized 'tool' links (talk,
>block, and so on) will be generated for these rows.
>- On WMF wikis, we'll run a maintenance script to clean up the
>existing rows with valid usernames and rev_user = 0. The current plan there
>is to attribute these edits to existing SUL users where possible and to
>prefix them with a generic prefix otherwise, but we could as easily prefix
>them all.
>   - Unfortunately it's impossible to retroactively determine the actual
>   source of old imports automatically or to automatically do anything
>   about imports that were misattributed to a different local user in
>   pre-SUL times (e.g. T179246[2]).
>   - The same will be done for CentralAuth's global suppression blocks.
>   In this case, on WMF wikis we can safely point them all at Meta.
>
> If you have comments on this proposal, please reply here or on
> https://gerrit.wikimedia.org/r/#/c/386625/.
>
>
> Background: The upcoming actor table changes[6] require some change to the
> handling of these imported names because we can't have separate attribution
> to "Example as a non-registered user" and "Example as a registered user"
> with the new schema. The options we've identified are:
>
>1. This proposal, or something much like it.
>2. All the existing rows with rev_user = 0 would have to be attributed
>to the existing local user (if any), and in the future when a new user is
>created any existing edits attributed to that name will be automatically
>attributed to that new account.
>3. All the existing rows with rev_user = 0 and an existing local user
>would have to be re-attributed to different *valid* usernames,
>probably randomly-generated in some manner, and in the future when a new
>user is created any existing edits for that name would have to be similarly
>re-attributed.
>4. Like #2, except the creation (including SUL auto-creation) of the
>same-named account would not be allowed. Thus, an import before the local
>name exists would forever block that name from being used for an actual
>local account.
>5. Some less consistent combination of the "all the existing rows" and
>"when a new user is created" options from #2–4.
>
>Of these options, this proposal seems like the best one.

Re: [Wikitech-l] Proposal for a developer support channel

2017-11-19 Thread Brad Jorsch (Anomie)
On Sat, Nov 18, 2017 at 4:57 PM, Niharika Kohli <nko...@wikimedia.org>
wrote:

> I'd like to add that having Discourse will provide the one thing IRC
> channels and mailing lists fail to - search capabilities. If you hangout on
> the #mediawiki IRC channel, you have probably noticed that we get a lot of
> repeat questions all the time. This would save everyone time and effort.
>

No discussion system I've ever seen has managed to solve the problem of
people asking the same question instead of searching for past replies. I'm
skeptical that this new one will be any different.

Yes, the existing mailing lists have issues with searchability, although to
a large extent that's due to a misguided robots.txt policy preventing the
archives from being indexed in the first place.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Help test longer edit summaries (and other comments) on Beta Cluster

2017-11-17 Thread Brad Jorsch (Anomie)
I've just now enabled the feature flag on the Beta Cluster[1] that allows
MediaWiki to store comments longer than 255 bytes.

The web UI has not been updated to allow longer comments in places where it
enforces a limit, such as the edit summary box. But if you use the API to
edit, or perform page moves or do other things where long comments could be
entered and were truncated, you should now find that they're truncated at
1000 Unicode characters rather than 255 bytes.

Please test it out! If you find errors, or places in core features (not
comments in extensions such as SecurePoll, AbuseFilter, CheckUser, or Flow)
where *new* comments are still being truncated to 255 bytes, or places
where comments aren't showing up at all, please let me know. You can reply
to this message or post a task in Phabricator and add me as a subscriber.

If things go well, we'll look at rolling this out to production wikis once
the schema changes to the production databases are complete. See
https://phabricator.wikimedia.org/T174569 to follow progress there.

If anyone is interested in submitting patches for the web UI to reflect the
changed length limits, please do. I'll try to review them if you add me as
a reviewer.


 [1]: https://deployment.wikimedia.beta.wmflabs.org/wiki/Main_Page

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal regarding the handling of imported usernames

2017-11-02 Thread Brad Jorsch (Anomie)
On Thu, Nov 2, 2017 at 4:46 PM, Strainu <strain...@gmail.com> wrote:

> 2017-10-31 16:52 GMT+02:00 Brad Jorsch (Anomie) <bjor...@wikimedia.org>:
> >- If revisions are imported using the "Upload XML data" method, it
> will
> >be required to fill in a new field to indicate the source of the
> edits,
> >which is intended to be interpreted as an interwiki prefix.
>
> What if that is not possible? How are imports between non-related
> websites handled?


It's always possible to enter something, whether an actual interwiki
link is defined or not. But why not define one?


> I've just recently encountered a situation when a
> MediaWiki upgrade was considered easier to be done by exporting the
> old wiki and importing it in the new one.
>

That seems like a strange situation. But in a case like that, recreate the
user table first and no edits should need prefixing.

>
> >- If revisions are imported using the."Import from another wiki"
> method,
> >the specified source wiki will be used as the source.
> >- During the import, any usernames that don't exist locally (and can't
> >be auto-created via CentralAuth[4]) will be imported as an
> >otherwise-invalid name, e.g. an edit by User:Example from source 'en'
> would
> >be imported as "en>Example".[5]
>
> Why not use "~" like when merging accounts? Sounds like yet another
> "code" is growing for no obvious reason. If you are worried about
> conflicts, there shouldn't be any, as the interwiki prefix is
> different from the shortcut used on SUL.
>

You mean like the appended "~enwiki" used during SUL finalization? Because
legitimate usernames, including those from SUL finalization, can contain
'~', thus recognition is much more difficult and we'd have to do a lot more
work to handle conflicts when they arise.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Proposal regarding the handling of imported usernames

2017-10-31 Thread Brad Jorsch (Anomie)
Handling of usernames in imported edits in MediaWiki has long been weird
(T9240[1] was filed in 2006!).

If the local user doesn't exist, we get a strange row in the revision table
where rev_user_text refers to a valid name while rev_user is 0 which
typically indicates an IP edit. Someone can later create the name, but
rev_user remains 0, so depending on which field a tool looks at the
revision may or may not be considered to actually belong to the
newly-created user.

If the local user does exist when the import is done, the edit is
attributed to that user regardless of whether it's actually the same user.
See T179246[2] for an example where imported edits got attributed to the
wrong account in pre-SUL times.

In Gerrit change 386625[3] I propose to change that.

   - If revisions are imported using the "Upload XML data" method, it will
   be required to fill in a new field to indicate the source of the edits,
   which is intended to be interpreted as an interwiki prefix.
   - If revisions are imported using the "Import from another wiki" method,
   the specified source wiki will be used as the source.
   - During the import, any usernames that don't exist locally (and can't
   be auto-created via CentralAuth[4]) will be imported as an
   otherwise-invalid name, e.g. an edit by User:Example from source 'en' would
   be imported as "en>Example".[5]
   - There will be a checkbox on Special:Import to specify whether the same
   should be done for usernames that do exist locally (or can be created) or
   whether those edits should be attributed to the existing/autocreated local
   user.
   - On history pages, log pages, and the like, these usernames will be
   displayed as interwiki links, much as might be generated by wikitext like "
   [[:en:User:Example|en>Example]]". No parenthesized 'tool' links (talk,
   block, and so on) will be generated for these rows.
   - On WMF wikis, we'll run a maintenance script to clean up the existing
   rows with valid usernames and rev_user = 0. The current plan there is to
   attribute these edits to existing SUL users where possible and to prefix
   them with a generic prefix otherwise, but we could as easily prefix them
   all.
   - Unfortunately it's impossible to retroactively determine the actual
   source of old imports automatically or to automatically do anything about
   imports that were misattributed to a different local user in pre-SUL
   times (e.g. T179246[2]).
   - The same will be done for CentralAuth's global suppression blocks.
   In this case, on WMF wikis we can safely point them all at Meta.

If you have comments on this proposal, please reply here or on
https://gerrit.wikimedia.org/r/#/c/386625/.


Background: The upcoming actor table changes[6] require some change to the
handling of these imported names because we can't have separate attribution
to "Example as a non-registered user" and "Example as a registered user"
with the new schema. The options we've identified are:

   1. This proposal, or something much like it.
   2. All the existing rows with rev_user = 0 would have to be attributed
   to the existing local user (if any), and in the future when a new user is
   created any existing edits attributed to that name will be automatically
   attributed to that new account.
   3. All the existing rows with rev_user = 0 and an existing local user
   would have to be re-attributed to different *valid* usernames, probably
   randomly-generated in some manner, and in the future when a new user is
   created any existing edits for that name would have to be similarly
   re-attributed.
   4. Like #2, except the creation (including SUL auto-creation) of the
   same-named account would not be allowed. Thus, an import before the local
   name exists would forever block that name from being used for an actual
   local account.
   5. Some less consistent combination of the "all the existing rows" and
   "when a new user is created" options from #2–4.

Of these options, this proposal seems like the best one.

[1]: https://phabricator.wikimedia.org/T9240
[2]: https://phabricator.wikimedia.org/T179246
[3]: https://gerrit.wikimedia.org/r/#/c/386625/
[4]: https://phabricator.wikimedia.org/T111605
[5]: ">" was chosen rather than the more typical ":" because the former is
already invalid in all usernames (and page titles). While a colon is *now*
disallowed in new usernames, existing names created before that restriction
was added can continue to be used (and there are over 12000 such usernames
in WMF's SUL) and we decided it'd be better not to suddenly break them.
[6]: https://phabricator.wikimedia.org/T167246

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Editing module in the wiki editor lead to overloading resouces of processor

2017-10-30 Thread Brad Jorsch (Anomie)
On Mon, Oct 30, 2017 at 8:18 AM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:

> As a side note, I would be interested in any advice to ease my module
> editing process, as direct in-wiki editing and testing quickly become
> frustrating. When I edit basic wiki text, I often use "It's All Text",
> which enables editing in my usual text editor. But even this option won't
> work with the code-editing widget


I like It's All Text too.

There's an icon on the toolbar that looks like "<>". Clicking that will
switch between the syntax-highlighting code editor and a plain textarea
that is compatible with It's All Text.


> (the same problem occurs with "syntax highlight" extension).
>

Assuming you're talking about
https://meta.wikimedia.org/wiki/Community_Tech/Wikitext_editor_syntax_highlighting,
it looks like you can use the toolbar icon to toggle it off so that It's
All Text will work.

On the other hand, the new VE-based wikitext editor mode does not seem to
have any obvious off-switch or other method to allow It's All Text to
function.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] I'm going to remove ContribsPager::getUserCond() without deprecation

2017-10-27 Thread Brad Jorsch (Anomie)
Per the deprecation policy,[1] this needs to be announced to wikitech-l.

In Gerrit change 383918,[2] it was realized that
ContribsPager::getUserCond() had grown weirdly large and provided a poor
interface for what it was doing, and changes in that patch were going to
exacerbate the situation. Since there are no users of that method in core
or extensions in Gerrit, it was decided that the best course of action
would be to merge it into its one caller rather than piling more tech debt
on top of the existing tech debt there.


[1]: https://www.mediawiki.org/wiki/Deprecation
[2]: https://gerrit.wikimedia.org/r/#/c/383918/

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Last call: Bump PHP requirement to 5.6 in MW 1.31

2017-10-19 Thread Brad Jorsch (Anomie)
I hate that I can never make these RFC meetings. Yesterday's meeting seems
to have had no one to point out that "if we drop php 5.6, we also drop
hhvm" is not necessarily true, even though I pointed that out on T172165
<https://phabricator.wikimedia.org/T172165#3658886> well before the
meeting. HHVM even in PHP5 mode (supposedly[1]) supports all the new PHP7
features that don't break PHP5 behavior. There are 7 php7 things it
specifically doesn't support (listed in T173786#3651007
<https://phabricator.wikimedia.org/T173786#3651007>).

So supporting the intersection of 7.0 and HHVM 3.18-in-php5-mode in 1.31
may not be as impossible as it was assumed to be.

For that matter, HHVM 3.18's php7 mode was tested and it was found that the
hhvm.php7.scalar_types bit was buggy (until 3.20), but no tests were made
for the other six parts of the php7 mode.

[1]: "Supposedly" because there may well be incompatibilities/bugs in
HHVM's implementation.


On Wed, Oct 18, 2017 at 11:54 PM, Tim Starling <tstarl...@wikimedia.org>
wrote:

> Today's RFC discussion was T172165, a proposal for MediaWiki 1.31 to
> require PHP 7.0. There was no consensus on that proposal, due to the
> opinion from Ops that it is not feasible to migrate all application
> servers to Debian Stretch and PHP 7.0 by the expected release date of
> June 2018.
>
> However, there was consensus on the lesser goal of requiring PHP 5.6.
> So, we have created a new RFC for PHP 5.6 (T178538) and are hereby
> placing it into Last Call.
>
> The proposal is: MediaWiki should bump its PHP requirement to 5.6 as
> soon as possible, and at the latest in time for the 1.31 branch point
> (i.e. April 2018).
>
> "As soon as possible" means as soon as the few remaining uses of PHP
> 5.5 in the WMF cluster have been migrated to PHP 5.6 or later, or to
> HHVM. We'd like to see this migration work be given a high priority.
>
> If you have any objection to this proposal, please raise it on
> Phabricator before the end of the Last Call period, which will be
> October 31.
>
> https://phabricator.wikimedia.org/T178538
>
> -- Tim Starling
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] What does "p." stands for in the Scribunto reference manual and many modules

2017-09-29 Thread Brad Jorsch (Anomie)
On Fri, Sep 29, 2017 at 7:05 AM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:

> Well, it's nothing functionally important, but I was wondering what the
> "p" initial was for, as it's used everywhere as the returned value by
> modules.


"package", I believe. As noted, nothing actually requires that the variable
be named "p", that's just the convention. For that matter, nothing requires
the variable even exist. You could have a module return the method table
directly, like

return {
    hello = function () return "Hello, world!" end
}

and that would be valid. But for a long module that might get hard to read,
and wouldn't allow one method to call another if necessary.


On Fri, Sep 29, 2017 at 7:40 AM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:

> Where? In the reference manual, it's `p` which is used. Should it be
> updated with `_module`?
>
> But I'm affraid that could lead to confusion for beginners when the
> console require to use `p`.
> What about alliasing `p` with `_module`, and even `m` in the scribunto
> console?


Let's not overcomplicate everything. If someone would rather use "_module"
or "m" in the console, they can always enter "_module = p" or the like
before starting their testing.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Is it possible to change the locale of a scribunto module and have identifiers with locale characters

2017-09-29 Thread Brad Jorsch (Anomie)
On Fri, Sep 29, 2017 at 6:48 AM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:

> By the way is there an official policy or whatever document regarding
> Scribunto evolutions?
>

Not that I know of. The biggest technical blocker to having Scribunto use a
newer version of Lua is that 5.2 heavily changed how function environments
work, so we'd have to redo the sandboxing and put it through a fresh
security review.


> Ok, thank you. I guessed that each Scribunto process was hugely sandboxed,
> especially as everything seems to be done to prevent passing information
> between successive invocations of the same module. I hadn't thought of
> possible side effect on PHP execution as explained in the ticket.
>

The problem with os.setlocale is that it's global to the whole process, not
inside the sandbox. When using luastandalone that's less of an issue since
the Lua code runs in a separate process (but we still don't start a new
process for each #invoke on the page), but when running with the luasandbox
PHP extension it shares the process.


> Do we have some nice (or even ugly) schema of PHP/Scribunto execution
> process so I could have a clearer representation of what's happening when I
> grab a webpage of a mediawiki article with some Scribunto invocation?
>

Not really. When the parser processes the {{#invoke:}}, it calls
ScribuntoHooks::invokeHook() which loads the module invoked, initializes
it, then calls the method invoked.


> But that's not the concern I was writing for. That is, I can't use unicode
> identifiers as in `local plâtrière = préamorçage()`. When I see UTF-8
> somewhere, I would expect no problem to use any glyph. So are my
> expectations misguided, or is there something wrong with the way C.UTF-8 is
> handled somewhere in the software stack?
>

Lua's processing operates on C chars (i.e. bytes), and uses C's isalpha()
and isalnum() to recognize which characters are "letters" for the purpose
of identifiers. For single-byte encodings this allows non-ASCII characters
such as 'â', 'è', 'é', and 'ç' to be recognized as "letters", hence the
documentation in Lua 5.1 about that, but in UTF-8 these are all represented
with multiple bytes so that doesn't work.
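
As a small sketch of the practical consequence (and a workaround): such
names can't be bare identifiers, but since Lua strings are plain byte
sequences, any UTF-8 name still works as an explicit table key.

```
-- Syntax error under Lua 5.1 with UTF-8 source: the bytes of 'â' fail
-- C's isalpha(), so they can't appear in a bare identifier.
-- local plâtrière = 1

-- Works: quoted keys are arbitrary byte strings.
local vars = {}
vars['plâtrière'] = 1
mw.log( vars['plâtrière'] )
```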

Changing that would require rewriting all the Lua input processing to use
functions that can handle "wide" characters, which is well beyond what
we're at all likely to do. It'd have to happen upstream, and then we'd have
to spend the time to actually upgrade to Lua 5.4 or whatever version
implemented it. But since Lua 5.2 actually changed things the other way
("Lua identifiers cannot use locale-dependent letters",
https://www.lua.org/manual/5.2/manual.html#8.1) that too seems unlikely.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using a transcluded string valued with something like "param1|param2|…" as parameter list for an other template [Was: Re: Passing to {{ping}} a list of user stored in a template on Me

2017-09-28 Thread Brad Jorsch (Anomie)
On Thu, Sep 28, 2017 at 7:53 AM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:

>
>
> Le 24/09/2017 à 19:31, bawolff a écrit :
>
>> Why not just make a template containing {{ping|first
>> user|second user|third user|...}}
>>
>> Your issue is almost certainly that the pipes aren't being tokenized
>> as argument separators when they come from a transcluded template.
>> (Its the same reason that {{!}} works in tables, except in reverse).
>>
>> Alternatively, {{ping|{{subst::Wiktionary/Tremendous Wiktionary User
>> Group/affiliates}}}} would probably work.
>>
> I encounter a similar problem in an other template I'm trying to write.
> Indeed, `subst` works as expected, but I would like to keep the call as is.
> So that when saving, the list won't be expanded, but will stay dynamic. So
> using the same example, something like
>
> {{ping|{{param::Wiktionary/Tremendous Wiktionary User Group/affiliates}}}}
>
> Is there already a way to do that.
>

Such a thing would be incompatible with not expanding a template parameter
until it's actually needed, since it wouldn't allow MediaWiki to know what
parameters were actually specified until after expansion.

>
> The other option I see is that the called template or module called should
> directly return the whole call. Actually, with a frame:expandTemplate or
> frame:preprocess it does work for the case I was asking this question.
>

Those are all possible solutions. I'd recommend avoiding frame:preprocess
in favor of frame:expandTemplate if you go that route.
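
For illustration, a minimal Lua sketch of the frame:expandTemplate route
(the module and data page names are made up), where each list entry becomes
a real template argument rather than a literal pipe character:

```
local p = {}

function p.pingAll( frame )
    -- Hypothetical data module returning a plain array of usernames.
    local users = mw.loadData( 'Module:Affiliates/list' )
    local args = {}
    for i, name in ipairs( users ) do
        args[ i ] = name
    end
    return frame:expandTemplate{ title = 'ping', args = args }
end

return p
```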

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Is it possible to change the locale of a scribunto module and have identifiers with locale characters

2017-09-28 Thread Brad Jorsch (Anomie)
On Thu, Sep 28, 2017 at 5:19 AM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:

> According to the lua-users wiki <http://lua-users.org/wiki/Lua
> Locales%20In%20Lua%205.1>, in Lua 5.1 "identifiers [are] locale
> dependent", and from the reference manual, which states that "[the
> documentation] derived from the Lua 5.1 reference manual <
> http://www.lua.org/manual/5.1/index.html>", I guess that Scribunto is
> still derived from Lua 5.1.
>

That's correct.


> So, what I would like is being able to set the locale for a module and use
> identifiers with locale characters. But `os.setlocale` isn't accessible in
> scribunto modules.
>

Allowing os.setlocale would very likely cause problems on threaded
webservers where one thread's locale change stomps on another's. It might
even cause trouble for subsequent requests on non-threaded servers if the
locale doesn't get reset, or for other code running during the same request
(e.g. see T107128 <https://phabricator.wikimedia.org/T107128>).

For sanity's sake, on Wikimedia wikis we use C.UTF-8 as the OS-level
locale. This doesn't affect much since MediaWiki usually uses its own i18n
mechanisms instead of using the locale.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: [mozilla-wikimedia-discuss] Firefox Quantum is in Beta

2017-09-27 Thread Brad Jorsch (Anomie)
On Wed, Sep 27, 2017 at 2:54 AM, Gilles Dubuc  wrote:

> This year Mozilla has been working on our most significant release,
> possibly ever. Firefox Quantum is going into Beta today [1].
>

Are audio on Linux boxes without PulseAudio, and many useful addons, still
broken?
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Is there a way to gather contextual clues about the specific template call which led to a scribunto module execution?

2017-09-26 Thread Brad Jorsch (Anomie)
On Tue, Sep 26, 2017 at 4:01 AM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:

> Here is what I mean with a simple use case
>
> ```
>
> == Section 1 ==
>
> Some text.
>
> Hello, we are in {{section}}, leading paragraph is
> {{paragraph|1|{{section}}}}.
>
>
> == Section 2 ==
>
> Some other text.
>
> Hello, we are in {{section}}, leading paragraph is
> {{paragraph|1|{{section}}}}.
>
> ```
>
> And each call should generate respectively something like
>
>Hello, we are in Section 1, leading paragraph is "Some text.".
>
>Hello, we are in Section 2, leading paragraph is "Some other text.".
>
> So, basically the idea is to let the module infers parameters from the
> calling context, rather than
> always make them explicit.
>

No, there is no simple way to do this.[2] Doing so would probably be a
violation of T67258.[1]

[1]: https://phabricator.wikimedia.org/T67258
[2]: There's a completely insane way to do something like it in some
limited cases, but per w:en:WP:BEANS
<https://en.wikipedia.org/wiki/Wikipedia:Don%27t_stuff_beans_up_your_nose>
I think I'll not go into details.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Looking for advice for structuring data module with redundant entries

2017-09-26 Thread Brad Jorsch (Anomie)
On Tue, Sep 26, 2017 at 3:25 AM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:

> So in Lua, as far as I can say, there is no way to express directly
> something like `a = {b=1, c=a.b}`.
>

Well, if your data is read-only you could do `a = { b = 1 }; a.c = a.b`.


> Moreover, `a.c = 2` should lead to a state where `a.b == a.c and a.b == 2`
>

For that you'll probably want to use a metatable[1] with __index and
__newindex methods that map them both to the same key.

I note that neither metatables nor function accessors will work for data
modules loaded with mw.loadData(), but since assignment doesn't work with
them either that doesn't really matter.
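
A minimal sketch of that approach (untested against your exact module), so
that reads and writes of either name hit the same slot:

```
local store = { b = 1 }
local alias = { c = 'b' } -- a.c is an alias for a.b
local a = setmetatable( {}, {
    __index = function ( _, k ) return store[ alias[ k ] or k ] end,
    __newindex = function ( _, k, v ) store[ alias[ k ] or k ] = v end,
} )

a.c = 2
assert( a.b == a.c and a.b == 2 )
```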

[1]:
https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual#Metatables


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Scribunto, failure in attempt to concatenate an array

2017-09-25 Thread Brad Jorsch (Anomie)
On Mon, Sep 25, 2017 at 4:04 PM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:

> Hi, I have some trouble trying to concatenate an array stored in a
> Scribunto data module.
>
> mw.logObject(p.data().voir_aussi)
> table#1 {
>   metatable = table#2
>   "NU",
>   "nú",
>   "nụ",
>   "nư",
>   "nữ",
>   "ñu",
>   "ňu",
>   ".nu",
>   "nu!",
> }
> =table.concat(p.data().voir_aussi) == '' -- true
>
>
This is mentioned in the reference manual:[1]

   - The table actually returned by mw.loadData() has metamethods that
   provide read-only access to the table returned by the module. Since it does
   not contain the data directly, pairs() and ipairs() will work but other
   methods, including #value, next(), and the functions in the Table
   library, will not work correctly.

You'll have to copy the data into a real table before concatenating, or
concatenate manually. The former is likely faster if the table will have
many elements in it.
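
For illustration, a minimal sketch of the copy-then-concat approach (the
data module name is made up):

```
local data = mw.loadData( 'Module:Name/data' )
local copy = {}
for i, v in ipairs( data.voir_aussi ) do
    copy[ i ] = v
end
local joined = table.concat( copy, ', ' )
```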

[1]:
https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual#mw.loadData


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Is it possible to edit scribunto data module content through template edit popup of visual editor?

2017-09-20 Thread Brad Jorsch (Anomie)
On Tue, Sep 19, 2017 at 8:12 PM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:

> Well, actually, depending on what you mean with "have page parses make
> edits to the wiki", I'm not sure what I'm looking for falls under this
> umbrella.
>
> What I would like is a way to do something like
>
> local data = mw.loadData( 'Module:Name/data/entry' )
> -- do some stuff with `data`
> mw.saveData( 'Module:Name/data/entry', data)
>
> That's it.
>

And that, like all Scribunto modules, would run during the parse of the
page.

> If you do think it's an horrible akward awfully disgusting idea , I would
> be interested to know more technical details on what problems it might lead
> to (not the global result of nightmarish hell on earth that I'm obviously
> targeting )
>

Confusion for users when purging a page results in edits somewhere else.
Misattribution of the resulting edits. Opportunities for vandals to misuse
it. Performance issues. And T67258.


> Your initial idea of somehow hooking into the editor (whether that's the
> wikitext editor or VE) with JavaScript to allow humans to make edits to the
> data module while editing another page was much better.
>
> I didn't even thought about JS actually. For the wikitext removal of
> updating parameter, I had in mind some inplace template substitution.
> Does JavaScript allow changing another page on the wiki, especially a
> data module, at some point when the user edits/saves an article?
>

That's what your original message sounded like you were talking about. A
JavaScript gadget of whatever sort could make edits by using the action API.



-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HHVM vs. Zend divergence

2017-09-19 Thread Brad Jorsch (Anomie)
On Tue, Sep 19, 2017 at 2:41 PM, C. Scott Ananian <canan...@wikimedia.org>
wrote:

> You say "there's not much migration cost moving to PHP7" --
> well, it would be nice to assign someone to go through the details in a
> little more detail to double-check that.


What migration costs there were for PHP7 have probably already been paid,
since as has already been noted several developers are already running on
PHP 7 (myself included). There's nothing much open on the NewPHP
workboard,[1] and what is still open seems to be false positives (T120336,
T173850, T173849, T120694), mostly done already (T153505), deprecations of
stuff we only still have for backwards compatibility (T120333, T143788),
irrelevant (T174199), or tracking backports (T174262).

[1]: https://phabricator.wikimedia.org/project/board/346/


> The HHVM announcement specifically mentioned that they will maintain
> compatibility with composer and phpunit, so that seems to be a wash.
>

They specifically mentioned they'd maintain compatibility with *current
versions* of composer and phpunit *until replacements exist*. No specific
criteria for whether a replacement is good enough have been supplied.

They also imply that they may not support full use of those tools, versus
only features of the tools required for whatever use cases they decide to
support.


> [... much discussion of garbage collection ...] It may be a good
> opportunity to take a hard look at our
> Hooks system and figure out if its design is future-proof.
>

I note our hook system has nothing to do with garbage collection or
destructors. It does rely on references, since that's how PHP handles
output parameters.[1] And in particular, explicit references are needed to
handle output parameters combined with call_user_func_array().

Garbage collection and destructors do make a major difference to the use of
RAII patterns[2] such as ScopedCallback and our database classes.

[1]: https://en.wikipedia.org/wiki/Output_parameter
[2]: https://en.wikipedia.org/wiki/RAII
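
For illustration, a minimal sketch of the RAII style in question
(acquireLock()/releaseLock() are hypothetical). The cleanup relies on the
destructor running as soon as the object goes out of scope:

```
<?php
use Wikimedia\ScopedCallback;

function doWorkUnderLock() {
    acquireLock(); // hypothetical
    $cleanup = new ScopedCallback( function () {
        releaseLock(); // hypothetical
    } );
    // ... work ...
    // With refcounting, $cleanup's destructor runs here, releasing the
    // lock even on early return or exception.
}
```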


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Is it possible to edit scribunto data module content through template edit popup of visual editor?

2017-09-19 Thread Brad Jorsch (Anomie)
On Tue, Sep 19, 2017 at 2:48 AM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:

> But having the ability to write a limited amount of bytes in a single data
> module per script call, and possibly other safeguard limits, wouldn't be
> that risky, would it?
>

It would break T67258 <https://phabricator.wikimedia.org/T67258>. I also
think it's probably a very bad idea to be trying to have page parses make
edits to the wiki.


> If it's not, please provide me some feed back on the proposal to add such
> a function, and if I should document such a proposal elsewhere, please let
> me know.
>

You're free to file a task in Phabricator, but it will be closed as
Declined. There are too many potential issues there for far too little
benefit.

Your initial idea of somehow hooking into the editor (whether that's the
wikitext editor or VE) with JavaScript to allow humans to make edits to the
data module while editing another page was much better.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HHVM vs. Zend divergence

2017-09-18 Thread Brad Jorsch (Anomie)
On Mon, Sep 18, 2017 at 4:58 PM, Max Semenik <maxsem.w...@gmail.com> wrote:

> 3) Revert WMF to Zend and forget about HHVM. This will result in
> performance degradation, however it will not be that dramatic: when we
> upgraded, we switched to HHVM from PHP 5.3 which was really outdated,
> while 5.6 and 7 provided nice performance improvements.
>

In particular, I've heard good things about PHP 7 performance. Someone less
lazy than I am at 5pm might want to do some research on that though.

I can say that PHP 7 locally runs unit tests significantly faster than PHP
5.6, although that's not really a representative workload for running a
website.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How to add custom styles to a page?

2017-09-18 Thread Brad Jorsch (Anomie)
On Sat, Sep 16, 2017 at 5:24 PM, Gergo Tisza <gti...@wikimedia.org> wrote:

> On Sat, Sep 16, 2017 at 7:46 AM, Brian Wolff <bawo...@gmail.com> wrote:
>
> > What you are looking for is TemplateStyles. This is not enabled on
> english
> > wikipedia yet, but should be coming there soon (I think). It would allow
> > you to use @media styles for your pages. See
> > https://www.mediawiki.org/wiki/Help:TemplateStyles for more details.
> >
> > Templatestyles does not let you use css variables (the properties
> starting
> > with -- and the var() css function), but the example css you used could
> > easily be rewritten to not use that feature.
> >
>
> You could also just put it into User:Kaartic/common.css and wrap it in a
> .page-User_Kaartic selector; that is less restrictive.
>

Although that will only apply the CSS for you while you're logged in, no
one else will see it.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Logs are down

2017-09-12 Thread Brad Jorsch (Anomie)
On Tue, Sep 12, 2017 at 5:51 AM, יגאל חיטרון <khit...@gmail.com> wrote:

> Hi. I think you missed that. I started from "The sql log tables are dead".
> It shows exactly where is the problem.
>

The criticism that you are receiving is that you didn't specify *which* log
tables in your email. It turns out you meant the replicas in Toolforge, but
your original message was so vague that it could easily have been
indicating a problem in the Beta Cluster or on the production wikis.

In general, it's best to say explicitly what you're talking about instead
of assuming people will somehow know it.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Last Call: PostgreSQL schema change for consistency with MySQL

2017-08-17 Thread Brad Jorsch (Anomie)
On Thu, Aug 17, 2017 at 10:06 AM, Strainu <strain...@gmail.com> wrote:

> Will the upgrade be seamless regardless of how they upgrade their
> version?


In the best-case scenario, they'll have to run update.php as with most
other MediaWiki updates, and it'll update everything for them. I don't know
of any reason that shouldn't be the case.

We may also increase the version requirement, so if they're still on PG 8.3
they might have to upgrade that too.


> Will they be able to recover their website after disaster
> from a backup with the old layout?


They'd likely be able to repeat whatever had to be done to upgrade the
first time around.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Upgrade to jQuery 3 is coming

2017-06-30 Thread Brad Jorsch (Anomie)
On Fri, Jun 30, 2017 at 8:25 AM, Joaquin Oltra Hernandez <
jhernan...@wikimedia.org> wrote:

> > Breaking change and Feature: jQuery.Deferred is now Promises/A+
> compatible
>
> Yay! There are subtle but important differences, I recommend reading the
> deferred section of the upgrade-guide
> <https://jquery.com/upgrade-guide/3.0/#deferred>, which does a good job
> explaining the changes.
>

One change to Promise behavior that I don't see mentioned there (although I
may have missed it), and which bit oojs-ui, is that the behavior greatly
changed when passing a promise (or any "thenable") as the first argument to
.resolve(). Instead of resolving the promise with the given promise passed
to the original promise's handlers as you might expect, Promises/A+ decided
that you want to effectively retroactively reattach all the handlers to the
new promise instead.
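
A small sketch of the difference (behavior as I understand it; worth
verifying against your jQuery version):

```
var inner = $.Deferred();
var outer = $.Deferred();
outer.done( function ( value ) {
    // jQuery 1.x/2.x: runs immediately, value is the inner promise object.
    // jQuery 3 (Promises/A+): runs only after inner resolves, value is 42.
    console.log( value );
} );
outer.resolve( inner.promise() );
inner.resolve( 42 );
```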

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] "Unloading"/reloading ResourceLoader modules the proper way?

2017-06-19 Thread Brad Jorsch (Anomie)
On Mon, Jun 19, 2017 at 2:14 AM, Legoktm <legoktm.wikipe...@gmail.com>
wrote:

> Hi,
>
> On 06/18/2017 01:00 PM, Jack Phoenix wrote:
> > What would be the proper way to signal to ResourceLoader, "actually I
> > need you to load this module again"?
>
> I don't think ResourceLoader was really designed for the concept of
> "unloading" modules. Instead I'd suggest scoping the CSS to some class
> like "theme-name" and then using JS to set and remove that class from
>  (or whatever other element) as needed.
>

I note VE has a similar bug in https://phabricator.wikimedia.org/T156414.
But I suspect that the part of that task relating to steps 8 and 9 is/will
be low priority.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Global language preference

2017-05-11 Thread Brad Jorsch (Anomie)
On Wed, May 10, 2017 at 7:41 PM, MZMcBride <z...@mzmcbride.com> wrote:

> How is duplicating a default
> value to hundreds of wikis, in a separate code base with its own user
> interface, a sane or desirable architecture?
>

Sane or desirable? Probably not. But it seems a lot easier for someone to
get that working than to design, implement, and deploy global preferences
in MediaWiki, either in core or as an extension.

Of course, if someone does want to work on implementing a global
preferences extension and doing the work to get it deployed, then more
power to them. A better solution to be sure, but it'll likely take more
time and effort.

Or, I suppose, someone can try to convince a WMF team's PM that it should
be prioritized above the team's existing work. But that seems like the
least timely way to get it done, and I can give no estimate of the
probability of success.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Global language preference

2017-05-10 Thread Brad Jorsch (Anomie)
On Wed, May 10, 2017 at 3:48 AM, Yongmin H. <li...@revi.pe.kr> wrote:

> Language is hackable via JS but settings like 'set timezone to blah blah,
> disable VE if it's enabled, disable compact language if enabled, set email
> to plaintext only, disable xwiki notification, ...' can't be done via JS
> hack, which is unfortunate.
>

It probably can; almost[1] anything you can change in Special:Preferences
could be changed via something in your global user JS calling the action
API's action=options.

The main drawback is that it wouldn't take effect on a wiki until after the
first time you visit a page there that loads your global user JS.
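
For example, a minimal sketch for your global.js (the preference name and
value are just an illustration, and the module name assumes the
mediawiki.api.options plugin):

```
mw.loader.using( 'mediawiki.api.options' ).then( function () {
    // Sets the preference on whichever wiki this script runs on.
    new mw.Api().saveOption( 'echo-email-format', 'plain-text' );
} );
```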


[1]: There are a few things like 'realname' and 'emailaddress' that can't
be set that way.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki-Codesniffer 0.8.0 released

2017-05-05 Thread Brad Jorsch (Anomie)
On Thu, May 4, 2017 at 5:14 PM, Legoktm <legoktm.wikipe...@gmail.com> wrote:

> If you encounter any bugs or have suggestions on new rules, please reply
> to this thread or file a bug in the #MediaWiki-Codesniffer Phabricator
> project.
>

I ran this over a few repositories and ran into some issues.

   - "@param string $name" seems fine on a method like "getItemByName()",
   there's little point in making that "@param string $name The name"
   - It whines if PHP builtins aren't documented. Do we really need to
   document __toString() over and over again?
   - Doxygen is happy to inherit documentation from the parent class for an
   overridden method. Your phpcs check doesn't even honor @inheritdoc; it wants
   everything copied.

At that point I gave up looking for wheat in the chaff.
-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Proposal to deprecate parsing and diffs in the action API action=query&prop=revisions

2017-04-28 Thread Brad Jorsch (Anomie)
TL;DR: I'm proposing deprecating a bunch of parameters. See
https://phabricator.wikimedia.org/T164106.

In the action API, there are two ways to parse a page/revision: using
action=parse, or using the rvparse parameter to action=query&prop=revisions.
Similarly, there are two ways to get a diff: using action=compare, or using
parameters such as rvdiffto to action=query=revisions. And then
there's action=expandtemplates versus the rvexpandtemplates parameter to
prop=revisions. This is a somewhat annoying bit of code duplication.

Further, the prop=revisions versions of these features have somewhat
strange behavior. rvparse forces rvlimit=1. rvdiffto and related parameters
will sometimes output "notcached" with no way to directly handle the
situation.

So, I propose deprecating all of these parameters. The parameters that
would be deprecated are the 'rvdifftotext', 'rvdifftotextpst', 'rvdiffto',
'rvexpandtemplates', 'rvgeneratexml', 'rvparse', and 'rvprop=parsetree'
parameters to prop=revisions, and the similarly named parameters to
prop=deletedrevisions, list=allrevisions, and list=alldeletedrevisions.

Following the normal action API deprecation policy, they'd output warnings
but would continue to function until usage drops sufficiently or until it
becomes too much trouble to fix them, and they wouldn't receive new feature
development.

If anyone objects to this plan, please reply at
https://phabricator.wikimedia.org/T164106, or here if you really hate
Phabricator. If there aren't major objections, I'll probably do the
deprecation in the next week or two. Thanks.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fair use image indicated as free use?

2017-04-21 Thread Brad Jorsch (Anomie)
On Fri, Apr 21, 2017 at 2:20 PM, Fako Berkers  wrote:

> I'm running the tool algo-news and I discovered that this image:
> https://en.wikipedia.org/wiki/File:Odin_lloyd.jpg
> Is indicated as a free image in the API:
> https://en.wikipedia.org/w/api.php?format=json=
> pageprops=query=1=39787564
>
> Should I report this as a bug?


That feature uses the presence of a particular hidden <span> in the HTML of
the page to determine non-freeness. On that particular image, the license
template was missing that span,[1] and it also didn't use the standard
non-free use rationale template[2] which also would have added the needed
span.

Then null edits to the image and the page fixed things up.

 [1]: Fixed in
https://en.wikipedia.org/w/index.php?title=Template:Non-free_fair_use&diff=776553380&oldid=749836414
 [2]: https://en.wikipedia.org/wiki/Template:Non-free_use_rationale
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] T37247: Request for a <div> holding only the Parser output

2017-04-19 Thread Brad Jorsch (Anomie)
TemplateStyles could really use this, since people object to e.g. the
TemplateStyles CSS being able to mess with the diff tables. I posted an
analysis and some options at
https://phabricator.wikimedia.org/T37247#3181097. Feedback would be
appreciated, particularly from someone familiar with how exactly content
gets into VE and Flow as to what if anything else might be needed to get
the new div to be output in those extensions.

Thanks.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Editing JSON in MediaWiki

2017-04-10 Thread Brad Jorsch (Anomie)
On Sun, Apr 9, 2017 at 11:38 AM, Daniel Kinzler <daniel.kinz...@wikimedia.de
> wrote:

> Generating wikitext from some other thing is what Scribunto does.


Not really. What Scribunto does is let you run a program to generate
wikitext.

If you wanted to write code that took some JSON and turned it into wikitext
without going through all the trouble of writing an extension and getting
it deployed, you might write that code in Lua as a Scribunto module.


On Mon, Apr 10, 2017 at 12:17 AM, Denny Vrandečić <vrande...@gmail.com>
wrote:

> On Sat, Apr 8, 2017 at 11:30 PM James Hare <jamesmh...@gmail.com> wrote:
>
> > Why, exactly, do you want a wikitext intermediary between your JSON and
> > your HTML? The value of wikitext is that it’s a syntax that is easier to
> > edit than HTML. But if it’s not the native format of your data, nor can
> > browsers render it directly, what’s the point of having it?
>
> Ah, good question indeed. The reason is that users would be actually
> putting fragments of wikitext into the JSON structure, and then the JSON
> structure gets assembled into wikitext. Not only would I prefer to have the
> users work with fragments of wikitext than fragments of HTML, but some
> things are almost impossible with HTML - e.g. making internal links red or
> blue depending on the existence of the article, etc.
>

What you probably want to do then is to extend JsonContent and
JsonContentHandler. In the fillParserOutput() method, you'd convert the
JSON to wikitext and then pass that wikitext to the Parser; for the latter
step you could look at how WikitextContent does it.

You might also look at implementing Content::getWikitextForTransclusion()
to let people transclude the resulting wikitext.
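
A rough sketch of that shape (hand-written for illustration: the 'myjson'
model ID is hypothetical, the JSON-to-wikitext conversion is a stub, and
error handling is minimal):

```
<?php
// A content class that stores JSON but renders via a wikitext intermediary.
class MyJsonContent extends JsonContent {
	public function __construct( $text ) {
		parent::__construct( $text, 'myjson' ); // assumes a registered 'myjson' model
	}

	/** Convert the decoded JSON into wikitext (placeholder implementation). */
	private function toWikitext() {
		$status = $this->getData();
		if ( !$status->isOK() ) {
			return "''Invalid JSON''";
		}
		// e.g. turn a JSON array of strings into a wikitext bullet list
		return '* ' . implode( "\n* ", (array)$status->getValue() );
	}

	protected function fillParserOutput( Title $title, $revId,
		ParserOptions $options, $generateHtml, ParserOutput &$output
	) {
		// Hand the generated wikitext to the wikitext parser, much as
		// WikitextContent::fillParserOutput() does for its own text.
		global $wgParser;
		$output = $wgParser->parse( $this->toWikitext(), $title, $options, true, true, $revId );
	}

	/** Let transclusion see the generated wikitext too. */
	public function getWikitextForTransclusion() {
		return $this->toWikitext();
	}
}
```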


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Introduction

2017-03-21 Thread Brad Jorsch (Anomie)
On Mon, Mar 20, 2017 at 7:03 PM, rupert THURNER <rupert.thur...@gmail.com>
wrote:

> Really? What keywords are you using?
>

"wikimedia projects programming beginner",[1] just as you said.

[1]:
https://www.google.com/search?q=wikimedia+projects+programming+beginner&ie=utf-8&oe=utf-8


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Introduction

2017-03-19 Thread Brad Jorsch (Anomie)
On Sun, Mar 19, 2017 at 7:23 AM, rupert THURNER <rupert.thur...@gmail.com>
wrote:

> where is this beginners page hidden,


Is https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker what
you're looking for?


> and why are google, bing, and duckduckgo not able to find it?
>

I don't know. When I try that search on Google the above page shows up as
the 9th result, even in a private browsing session with a different browser
from a different IP.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] ArchCom Monutes, News, and Outlook

2017-03-16 Thread Brad Jorsch (Anomie)
On Thu, Mar 16, 2017 at 6:40 AM, Dan Garry <dga...@wikimedia.org> wrote:

> Google Docs is easier to spin up in the moment and edit collaboratively
> than MediaWiki. Using proprietary software tools if they're a better fit
> for the intended purpose is entirely consistent with the Foundation's
> guiding principles
>

Doesn't etherpad (https://etherpad.wikimedia.org/) fit that need without
being proprietary?

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] ResistanceManual.org looking for co-maintainers

2017-01-30 Thread Brad Jorsch (Anomie)
On Sun, Jan 29, 2017 at 6:38 PM, Ori Livneh <ori.liv...@gmail.com> wrote:

> Resistance Manual <https://www.resistancemanual.org/Resistance_Manual_Home
> >
> is a guide for organizing resistance to the policies of the Trump
> administration in the United States. The site is running MediaWiki 1.28,
> and its admins are looking for help maintaining the site. The main page
> says to reach out to i...@staywoke.org if interested.
>

Is "some random wiki needs sysadmins" really on-topic for this list?


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Deprecation logging in production

2017-01-25 Thread Brad Jorsch (Anomie)
On Wed, Jan 25, 2017 at 7:43 PM, Max Semenik <maxsem.w...@gmail.com> wrote:

> 189 Use of SyntaxHighlight_GeSHi::prepare is deprecated. [Called from
> ScribuntoContent::fillParserOutput in
> /srv/mediawiki/php-1.29.0-wmf.8/extensions/Scribunto/common/ScribuntoContent.php
> at line 128]
> 189 Use of SyntaxHighlight_GeSHi::buildHeadItem is deprecated. [Called
> from ScribuntoContent::fillParserOutput in
> /srv/mediawiki/php-1.29.0-wmf.8/extensions/Scribunto/common/ScribuntoContent.php
> at line 141]
> 132 Use of SyntaxHighlight_GeSHi::prepare is deprecated. [Called from
> ScribuntoContent::fillParserOutput in
> /srv/mediawiki/php-1.29.0-wmf.9/extensions/Scribunto/common/ScribuntoContent.php
> at line 128]
> 132 Use of SyntaxHighlight_GeSHi::buildHeadItem is deprecated. [Called
> from ScribuntoContent::fillParserOutput in
> /srv/mediawiki/php-1.29.0-wmf.9/extensions/Scribunto/common/ScribuntoContent.php
> at line 141]
>

https://gerrit.wikimedia.org/r/#/c/245581/ is blocked on
https://gerrit.wikimedia.org/r/#/c/245580/, which seems to be waiting for
an update to do it in a different way.

>   2705 Use of wfSetupSession was deprecated in MediaWiki 1.27. [Called
> from CollectionSession::startSession in
> /srv/mediawiki/php-1.29.0-wmf.8/extensions/Collection/Collection.session.php
> at line 38]
>  30 Use of wfSetupSession was deprecated in MediaWiki 1.27. [Called
> from CollectionSession::startSession in
> /srv/mediawiki/php-1.29.0-wmf.9/extensions/Collection/Collection.session.php
> at line 38]
>  20 Use of wfSetupSession was deprecated in MediaWiki 1.27. [Called
> from AbuseFilter::executeFilterActions in
> /srv/mediawiki/php-1.29.0-wmf.8/extensions/AbuseFilter/includes/AbuseFilter.class.php
> at line 796]
>   3 Use of wfSetupSession was deprecated in MediaWiki 1.27. [Called
> from wfAjaxPostCollection in
> /srv/mediawiki/php-1.29.0-wmf.8/extensions/Collection/Collection.php at
> line 268]
>   2 Use of wfSetupSession was deprecated in MediaWiki 1.27. [Called
> from AbuseFilter::executeFilterActions in
> /srv/mediawiki/php-1.29.0-wmf.9/extensions/AbuseFilter/includes/AbuseFilter.class.php
> at line 796]
>

https://phabricator.wikimedia.org/T124371 has details about how someone
could go about cleaning these up.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] ArchCom Status & Meeting Minutes, WD3

2017-01-19 Thread Brad Jorsch (Anomie)
On Wed, Jan 18, 2017 at 10:44 PM, Daniel Kinzler <
daniel.kinz...@wikimedia.de> wrote:

> ** Related discussion about whether new features can require services
> serparate
> from MediaWiki core.
>

That seems like it would be a decent RFC at some point.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I8n

2017-01-07 Thread Brad Jorsch (Anomie)
On Sat, Jan 7, 2017 at 3:28 PM, Moriel Schottlender <mor...@gmail.com>
wrote:

> - but our translations are not being submitted directly through the
> repositories anyways, they are being fixed through translatewiki.
>
> If you want to provide translation fixes to various extensions, you should
> go to https://translatewiki.net/ look up the extension and translation
> key,
> and do it from there.
>

Note that this applies to every language except English. If you're fixing errors
in the English messages, you have to do those by submitting a patch in
Gerrit.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Now live: Shared structured data

2016-12-22 Thread Brad Jorsch (Anomie)
On Thu, Dec 22, 2016 at 2:30 PM, Yuri Astrakhan <yastrak...@wikimedia.org>
wrote:

> Gift season! We have launched structured data on Commons, available from
> all wikis.
>

I was momentarily excited, then I read a little farther and discovered this
isn't about https://commons.wikimedia.org/wiki/Commons:Structured_data.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] SOLVED Re: VisualEditor in 1.28 - fails after upgrade from 1.27

2016-12-20 Thread Brad Jorsch (Anomie)
On Tue, Dec 20, 2016 at 2:00 AM, Bartosz Dziewoński <matma@gmail.com>
wrote:

> On 2016-12-20 06:22, Daniel Barrett wrote:
>
>> The real issue is that a custom callback for the hook
>> "SpecialPages_initList" is invoking RequestContext::getMain()->get
>> User()->isLoggedIn().
>>
>> Apparently that doesn't work.
>>
>> I'll take a guess that SpecialPages_initList runs too early for this
>> check to succeed?
>>
>>
>> My goal is to remove some special pages for anonymous users but permit
>> logged-in users to see them.
>> Is there a better way to check for a logged-in user at this hook point?
>> Or a better way to remove
>> special pages for anonymous users?
>>
>
> Yes, the list of special pages can't depend on anything related to the
> current user.
>
> Instead, you should check whether the user is logged in when displaying
> the special page. You can just call `$this->requireLogin();` at the
> beginning of the special page's execute() function – this will check
> whether the user is logged in, and if not, display an error message and
> abort execution. You can optionally pass a custom error message. See e.g.
> /includes/specials/SpecialWatchlist.php in MediaWiki for an example.
>

And to hide a special page on Special:SpecialPages from users who can't use
it, have the page's userCanExecute() return false when appropriate and have
isRestricted() return true. If the check is based on having one user right,
this can be easily done by passing the user right as the $restriction
parameter to SpecialPage::__construct().
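
Putting both pieces together, a minimal sketch (the 'viewmyreports' right and
the page name are invented for illustration):

```
<?php
// A special page gated on a hypothetical 'viewmyreports' right. Passing the
// right as the $restriction constructor parameter makes userCanExecute()
// check it and isRestricted() return true.
class SpecialMyReports extends SpecialPage {
	public function __construct() {
		parent::__construct( 'MyReports', 'viewmyreports' );
	}

	public function execute( $subPage ) {
		$this->setHeaders();
		$this->checkPermissions(); // throws PermissionsError if the right is missing
		$this->getOutput()->addWikiText( 'Reports go here.' ); // placeholder body
	}
}
```

Special:SpecialPages then omits the page for users lacking the right, and a
direct visit gets a permissions error instead of the page content.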


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] VisualEditor in 1.28 - fails after upgrade from 1.27

2016-12-20 Thread Brad Jorsch (Anomie)
On Mon, Dec 19, 2016 at 8:21 PM, Daniel Barrett <d...@cimpress.com> wrote:

> > Ok, that's a start. Can you get the rest of the stack trace for the
> > exception so we can figure out how that's getting called?
>
>
> Sure. How do I get a full stack dump when the error appears only in the
> Chrome developer console?
>
>    BadMethodCallException from line 845 of
>    <...>/w/includes/session/SessionManager.php:
>    Sessions are disabled for this entry point
>
> I've already enabled $wgShowExceptionDetails (which caused slightly more
> detail to appear, but no stack dump) and set every other debugging flag in
> sight on https://www.mediawiki.org/wiki/Manual:How_to_debug#PHP_errors in
> LocalSettings.php and in index.php.
> The only output I get is the above line in the Chrome developer console.
> The error does not appear in the associated Apache error log file.
>

If you haven't already done so, set up logging as described at
https://www.mediawiki.org/wiki/Manual:How_to_debug#Logging.
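
For instance, a minimal LocalSettings.php sketch (the log path is only an
example; it just needs to be writable by the web server):

```
<?php
// Debug settings for a development wiki; don't leave these on in production.
$wgShowExceptionDetails = true;                   // include backtraces in error output
$wgDebugLogFile = '/var/log/mediawiki/debug.log'; // example path, must be writable
```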

Then you can look up that code (the "[43d736c07dd76d73cf26db20]") in the
error log file.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] VisualEditor in 1.28 - fails after upgrade from 1.27

2016-12-19 Thread Brad Jorsch (Anomie)
On Mon, Dec 19, 2016 at 8:20 AM, Daniel Barrett <d...@cimpress.com> wrote:

> Here you go:
>
>
> BadMethodCallException from line 845 of 
> <...>/w/includes/session/SessionManager.php:
> Sessions are disabled for this entry point
>
>
> load.php?debug=false&lang=en&modules=startup&only=scripts&skin=vector:4
> [43d736c07dd76d73cf26db20]
> /w/load.php?debug=false&lang=en&modules=startup&only=scripts&skin=vector
> BadMethodCallException from line 845 of
> <...>/w/includes/session/SessionManager.php: Sessions are disabled for
> this entry point
>

Ok, that's a start. Can you get the rest of the stack trace for the
exception so we can figure out how that's getting called?

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Discussion Platform

2016-12-06 Thread Brad Jorsch (Anomie)
On Tue, Dec 6, 2016 at 7:50 AM, MAYANK JINDAL <mayank.jind...@gmail.com>
wrote:

> We are using IRC for discussion purpose. How will it be if we change our
> discussion platform?
> Many organizations have switched to gitter that have very user-friendly UI
> and very easy to use.
> Please give a view on my proposal.
>

It seems very unlikely that we would gain much by moving from an
established open standard to a proprietary walled garden service.

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Update on WMF account compromises

2016-11-16 Thread Brad Jorsch (Anomie)
On Wed, Nov 16, 2016 at 3:19 PM, Thomas Morton <morton.tho...@googlemail.com
> wrote:

> >
> > Another idea might be to for the software to offer to create a random
> > password for users at account creation time, and also to make the same
> > offer at password change time.
> >
> > For example, even using automatically generated simple-looking and
> > reasonably simple passwords like "little-center-ground-finger"
> > consisting of 4 words between 5 and 8 characters long, will give an
> > effective per-password entropy of 62 bits, significantly better than
> > most user-generated passwords.
>
> If we did this it's worth pro-actively making the wordlist "hard". For
> example, the words chosen above appear in the top-1000 most common English
> words, and so therefore are trivially vulnerable to dictionary attacks
> (hackers read XKCD too :)).
>

If you use the top-1000 most common English words (and the attacker knows
you picked 4 random words from that list), 4 randomly-chosen words would
have about 39.86 bits of entropy. That's a bit weak, but probably not
entirely trivial (at 1000 guesses/second it'd take 31 years to try all the
possibilities). Using a list of 1000 *un*common English words has the same
level of entropy, since we assume the attacker can get the word list
somehow (if nothing else, by using the service themselves a few thousand
times and collecting all the words seen).

If you want to increase the entropy, use a larger word list rather than a
"harder" one. The XKCD comic seems to have used a 2048-word list for its
44-bit estimate. Using a list with 8836 words gets the same entropy (about
52.44 bits) as a completely-random 8-character password using any of the 94
characters I can easily type on my keyboard (e.g. "'>hZ|=S\*").
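
For reference, the arithmetic behind those figures, with H the entropy in
bits of k words drawn uniformly and independently from an N-word list:

```
H = k \log_2 N

4 \log_2 1000 \approx 39.86, \qquad
1000^4 = 10^{12} \text{ guesses} \;\Rightarrow\; 10^{12}/10^3 \text{ s} \approx 31.7 \text{ years}

4 \log_2 2048 = 44, \qquad
4 \log_2 8836 \approx 52.44 \approx 8 \log_2 94
```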


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Internationalization of action API errors and warnings

2016-11-16 Thread Brad Jorsch (Anomie)
In Gerrit change 321406[1] and related patches,[2] I've proposed code to
allow the MediaWiki action API to return errors and warnings in languages
other than English. Feedback and code review would be appreciated.

A detailed description of the proposed changes is posted at
https://www.mediawiki.org/wiki/API/Architecture_work/i18n#Warnings_and_errors.
Summary:

   - For clients of the API, some error codes will change, particularly
   from query submodules. If you're trying to parse the human-readable error
   and warning text, these also are likely to change. A few modules that
   returned error or warning text in a non-standard manner have been changed.
   For the most part, though, client code should not need updating since the
   default is backwards-compatible.
   - For extension authors, several ApiBase methods are deprecated and
   should be replaced. The existing patches[2] may serve as examples.


 [1]: https://gerrit.wikimedia.org/r/#/c/321406/
 [2]: https://gerrit.wikimedia.org/r/#/q/topic:api-error-i18n/T47843

-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [RFC] Giving actual CSRF tokens to not logged in users (T40417)

2016-09-30 Thread Brad Jorsch (Anomie)
On Thu, Sep 29, 2016 at 5:10 PM, Max Semenik <maxsem.w...@gmail.com> wrote:

> On Thu, Sep 29, 2016 at 1:37 PM, Brad Jorsch (Anomie) <
> bjor...@wikimedia.org
> > wrote:
> > Note it will affect scripts and API clients that expect to see "+\" as
> the
> > token as a sign that they're logged out, or worse assume that's the token
> > and don't bother to fetch it.
>
> We had breaking API/frontend infrastructure changes before, this one seems
> less invasive and will break only badly written clients. In any case, most
> clients are intended for logged in users.
>

It still should be known that these will break and should be announced in
the proper place (mediawiki-api-announce) and time.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [RFC] Giving actual CSRF tokens to not logged in users (T40417)

2016-09-29 Thread Brad Jorsch (Anomie)
On Thu, Sep 29, 2016 at 4:00 PM, Brian Wolff <bawo...@gmail.com> wrote:

> This way it will work for users without cookies (Maybe none exist, but I
> like the idea you can edit wikipedia without cookies)


There have been people who disabled cookies and still wanted to be able to
use the sites.


> and for users who have rapidly changing IPs.


Well, only after they manage to get a session cookie set. I see the patch
there attempts to account for that by creating a session on token failure
via HTMLForm, which is good, although there are other code paths that would
need the same sort of thing (e.g. API token checks).


> It will also have minimal breakage, as you won't have to adjust any
> existing usages of tokens (For example, on special pages).
>

Note it will affect scripts and API clients that expect to see "+\" as the
token as a sign that they're logged out, or worse assume that's the token
and don't bother to fetch it.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Public Event Streams (AKA RCStream replacement) question

2016-09-26 Thread Brad Jorsch (Anomie)
On Sun, Sep 25, 2016 at 10:02 AM, Merlijn van Deen (valhallasw) <
valhall...@arctus.nl> wrote:

> You could consider not implementing streaming /at all/, and just ask
> clients to poll an http endpoint, which is much easier to implement
> client-side than anything streaming (especially when it comes to handling
> disconnects).
>

On the other hand, polling means repeated TCP handshakes, HTTP headers sent
and received on every request, all of that work happening even when there
aren't any new events, delayed reception of events (you only get events when
you poll), and having to decide what an acceptable minimum polling interval
is.

And chances are that clients that want to do polling are already doing it
with the action API. ;) Although I don't know what events are planned to be
made available from this new service to be able to say whether they're all
already available via the action API.
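
For reference, a minimal sketch of that kind of action API polling (against
list=recentchanges; error handling, maxlag, and duplicate suppression are
all omitted):

```
<?php
// Repeatedly ask list=recentchanges for anything newer than the last change seen.
$api = 'https://en.wikipedia.org/w/api.php';
$since = gmdate( 'Y-m-d\TH:i:s\Z' ); // only ask for changes from "now" on
while ( true ) {
	$url = $api . '?' . http_build_query( [
		'action' => 'query',
		'list' => 'recentchanges',
		'rcprop' => 'title|ids|timestamp',
		'rcdir' => 'newer',  // oldest-first, so $since advances correctly
		'rcstart' => $since,
		'rclimit' => 500,
		'format' => 'json',
	] );
	$data = json_decode( file_get_contents( $url ), true );
	foreach ( $data['query']['recentchanges'] as $rc ) {
		echo "{$rc['timestamp']} {$rc['title']}\n";
		$since = $rc['timestamp']; // remember the newest change seen
	}
	sleep( 10 ); // the "acceptable polling interval" question from above
}
```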


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Public Event Streams (AKA RCStream replacement) question

2016-09-24 Thread Brad Jorsch (Anomie)
On Sat, Sep 24, 2016 at 11:41 AM, Andrew Otto <o...@wikimedia.org> wrote:

> ​So, since most of the dev work for a socket.io implementation is already
> done, you can see what the protocol would look like here:
> https://github.com/wikimedia/kasocki#socketio-client-set-up
>
> Kasocki is just a library, the actual WMF deployment and documentation
> would be more specific about MediaWiki type events, but the interface would
> be the same.  (Likely there would be client libraries to abstract the
> actual socket.io interaction.)
>

See, that's the sort of thing I was complaining about. If I'm not using
whatever language happens to have a library already written, there's no
spec so I have to reverse-engineer it from an implementation. And in this
case that seems like socket.io on top of engine.io on top of who knows what
else.


-- 
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
