will be rendered as wikitext. See this page for some common
> formatting errors.
>
> https://www.mediawiki.org/w/index.php?diff=3240805&oldid=3240802&title=Scrum_of_scrums/2019-05-15&type=revision&diffmode=source
> ** If your team is blocking/blocked, please make sure you copy/paste the
>
g development. And document it in the lifecycle document.
>
I have found that setting the priority to "lowest" is the closest thing we
have to "this is a valid task but we are not going to invest paid time into
it."
Is this script going to (eventually) undo all of the vandalism? Do users need
to do anything else?
> On Jul 1, 2018, at 2:31 AM, Max Semenik wrote:
>
> We've got ourselves da MVP!
>
> On Sun, Jul 1, 2018 at 12:51 AM, Leon Ziemba
> wrote:
>
>> I wrote a rollback script, currently running as
] and let us know what you think.
Feedback is welcome in any language. If you would like to get in touch
privately, you are also welcome to email me at jh...@wikimedia.org.
Best regards,
James Hare
[0] https://tools.wmflabs.org/hay/directory/
[1] https://meta.wikimedia.org/wiki/Toolhub
[2] https
On Mon, Jun 4, 2018 at 11:54 AM, Leon Ziemba
wrote:
> > ... the size differences shown on the history and contributions pages,
> which is the only thing that rev_parent_id is used for
>
> This may be true in MediaWiki but not so much for external tools. I just
> wanted to preemptively say this.
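For external tools, the usual pattern looks something like this rough Python
sketch (illustrative only; the dict shape stands in for rows fetched from the
revision table): the byte-size delta shown next to an edit is the revision's
rev_len minus the rev_len of the revision that rev_parent_id points to.

# Sketch: computing the per-edit size deltas that history/contributions
# pages display, from rev_id / rev_parent_id / rev_len rows.
def size_deltas(revisions):
    length_by_id = {r["rev_id"]: r["rev_len"] for r in revisions}
    deltas = {}
    for r in revisions:
        parent = r["rev_parent_id"]
        # rev_parent_id == 0 marks the first revision of a page, so the
        # whole revision length counts as the delta.
        parent_len = length_by_id.get(parent, 0) if parent else 0
        deltas[r["rev_id"]] = r["rev_len"] - parent_len
    return deltas

rows = [
    {"rev_id": 101, "rev_parent_id": 0, "rev_len": 1200},
    {"rev_id": 102, "rev_parent_id": 101, "rev_len": 1450},
]
print(size_deltas(rows))  # {101: 1200, 102: 250}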
e is that this field was
never updated and there was never a reason to if the address still
technically works.
James Hare
Associate Product Manager
Wikimedia Foundation
https://wikimediafoundation.org
It’s worth noting that (to the best of my knowledge) having a release for both
iOS and Android is not trivial, seeing as they’re entirely separate platforms
with separate programming languages, SDKs, approval workflows, and
publication/release systems.
> On May 1, 2018, at 4:19 AM, zppix e
19 Differential users seems believable to me, since as far as I know we
don't use Differential that much. (Differential is the code review tool
within Phabricator.)
James Hare
Associate Product Manager
Wikimedia Foundation
https://wikimediafoundation.org
On Sat, Mar 31, 2018 at 5:10 PM
I think deleting those block records is acceptable, and if they need to be
blocked again, we can just block them again. Instituting blocks like this
seems like an anti-pattern we should avoid.
On March 21, 2018 at 11:50:50 AM, Brad Jorsch (Anomie) (
bjor...@wikimedia.org) wrote:
In 2005–2006 a
I’m not entirely familiar with how we use Phan, but if it’s like the other
tests we run with Gerrit, what we could do is have a test for deprecated APIs
that passes or fails as you might expect, but have it be a non-voting test.
This way a failure is a signal to check something out, but it wouldn't block
the change from merging.
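As a rough illustration of the non-voting idea (a sketch only, not Wikimedia's
actual CI configuration; the script and the way Phan is invoked here are
assumptions), the check can report its findings without ever failing the build:

#!/usr/bin/env python3
# Non-voting check sketch: surface deprecated-API findings, never block.
import subprocess
import sys

def main():
    # Running Phan against a hypothetical deprecation-focused config file.
    result = subprocess.run(
        ["phan", "--config-file", ".phan/deprecated.php"],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        print("Deprecated API usage detected (non-voting check):")
        print(result.stdout)
    # Always exit 0 so the result is informational rather than gating.
    sys.exit(0)

if __name__ == "__main__":
    main()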
recent iterations.
Also this delightful song: https://www.youtube.com/watch?v=dIiZ3vvZ78s
Cheers,
James Hare
> On Sep 19, 2017, at 8:21 AM, C. Scott Ananian wrote:
>
>> On Mon, Sep 18, 2017 at 9:51 PM, Chad wrote:
>>
>> I see zero reason for us to go through all the formalities, unless we want
>> to really. I have yet to see anyone (on list, or on
On Sep 18, 2017, at 1:58 PM, Max Semenik wrote:
>
> Today, the HHVM developers made an announcement[1] that they have plans of
> ceasing to maintain 100% PHP7 compatibility and concentrating on Hack
> instead.
>
> While this does not mean that we need to take an action
What I wonder is – does this *need* to be a part of the database table, or
can it be a dataset generated from each revision and then published
separately? This way each user wouldn’t have to individually compute the
hashes while we also get the (ostensible) benefit of getting them out of
the database.
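As a sketch of the "published separately" option (illustrative only; the wiki
URL, the example revision IDs, the output format, and the choice of SHA-256
are assumptions rather than a committed design), a batch job could fetch
revision text through the action API and emit a standalone hash dataset:

# Sketch: build a (rev_id -> hash) dataset outside the revision table.
import hashlib
import json

import requests

API = "https://en.wikipedia.org/w/api.php"

def revision_hashes(rev_ids):
    params = {
        "action": "query",
        "prop": "revisions",
        "revids": "|".join(str(r) for r in rev_ids),
        "rvprop": "ids|content",
        "rvslots": "main",
        "format": "json",
        "formatversion": "2",
    }
    data = requests.get(API, params=params).json()
    for page in data["query"]["pages"]:
        for rev in page.get("revisions", []):
            text = rev["slots"]["main"]["content"]
            yield rev["revid"], hashlib.sha256(text.encode("utf-8")).hexdigest()

# Publish as JSON Lines, one record per revision.
with open("revision_hashes.jsonl", "w") as f:
    for revid, digest in revision_hashes([123456, 123457]):
        f.write(json.dumps({"rev_id": revid, "sha256": digest}) + "\n")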
Sniffing goats sounds unpleasant.
> On Aug 13, 2017, at 2:32 PM, Legoktm wrote:
>
> Hello!
>
> Straight from Wikimania, MediaWiki-GoatSniffer 0.11.1 is now available
> for use in your MediaWiki extensions and other projects. This release
> fixes bugs from 0.10.0 as
For information on the budget and goals of the New Readers program, see this
part of the annual plan:
https://meta.m.wikimedia.org/wiki/Special:MyLanguage/Wikimedia_Foundation_Annual_Plan/2017-2018/Final/New_Readers
> On Aug 7, 2017, at 7:20 PM, Pine W wrote:
>
> Grace,
This looks really cool! I think it makes sense that if you search for the
name of a really old book you also get the Wikisource entry, or a
Wiktionary result if you look up a word.
On June 15, 2017 at 5:07:11 PM, Deborah Tankersley (
dtankers...@wikimedia.org) wrote:
Hello,
We're happy to
Multimedia was recently moved to the Reading (now Readers) vertical. To the
best of my knowledge that isn’t changing.
On June 12, 2017 at 12:52:44 PM, Pine W (wiki.p...@gmail.com) wrote:
James: thanks for asking; I'm copying that question to the Wikitech list.
While we're on that topic, what's
I thought MediaWiki, by default, stored data as binary blobs, rather than
text in a particular encoding?
On May 2, 2017 at 10:11:38 AM, Mark Clements (HappyDog) (
gm...@kennel17.co.uk) wrote:
Hi all,
I seem to recall that a long, long time ago MediaWiki was using UTF-8
internally but
For clarification, are you referring to the Content Translation tool?
On April 26, 2017 at 10:56:28 PM, Strainu (strain...@gmail.com) wrote:
Following the recent outage, we've had a new series of complaints
about the lack of improvements in CX, especially related to
server-side activities like
wikimania for "How
to write extensions for Parsoid and Visual Editor". But the area's only
half-baked, so it might be more appropriate to do it as a hackathon session
instead. Thoughts?
--scott
On Mon, Apr 10, 2017 at 3:20 PM, James Hare <jamesmh...@gmail.com> wrote:
> On Apr
changing values in a JSON blob.
Since Parsoid supports JSON as a content type, would it thus be trivial to
create an extension for VisualEditor to support editing this content type
and others like it?
Thanks,
James Hare
Why, exactly, do you want a wikitext intermediary between your JSON and
your HTML? The value of wikitext is that it’s a syntax that is easier to
edit than HTML. But if it’s not the native format of your data, nor can
browsers render it directly, what’s the point of having it?
The CollaborationKit
For large conference calls, I highly recommend Zoom (https://zoom.us). It
actually works for large conference calls. Wikimedia DC uses it all the
time and it is very effective. It is proprietary, but so is Google Hangouts.
(I do not have a good solution to *that* problem.)
On March 13, 2017 at
I just had an idea. My bot Reports Bot is always in need of updates and minor
maintenance, and I think GSoC could be good for people interested in working in
Python. But I don't want to propose something if it would be too much work for
the person doing it. Is there guidance on how
Should people refrain from uploading videos at this time?
On February 9, 2017 at 5:01:51 PM, Brion Vibber (bvib...@wikimedia.org)
wrote:
Quick update -- I tested some batch-reencoding of missing low-res files and
have flooded the high-priority queue. :) Thanks for your patience as we get
the
I would find such a tool to be extremely useful as well. I am working on an
extension that uses a considerable amount of JavaScript and while we’re
trying to find as many breakdowns as we can, it’s entirely possible we’ll
miss something and the person who discovers the bug won’t be familiar with
the roles which were previously assigned to him.
Regards,
James Hare
> Aran
>
Is there any reason this needs to be handled by the extension? Can you edit the
corresponding MediaWiki-namespace page to change the label?
—
James Hare
http://harej.co
Phabricator: https://phabricator.wikimedia.org/T147137
MediaWiki.org:
https://www.mediawiki.org/wiki/Requests_for_comment/JSON_validation
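For readers who want a concrete sense of what schema-based validation of a
JSON page could look like, here is a minimal Python sketch using the
jsonschema library (the schema and page content are made-up examples, not
anything taken from the RFC):

# Minimal illustration of validating a JSON document against a schema.
from jsonschema import ValidationError, validate

schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "members": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["title"],
}

page_content = {"title": "Example WikiProject", "members": ["Alice", "Bob"]}

try:
    validate(instance=page_content, schema=schema)
    print("Page content is valid against the schema.")
except ValidationError as err:
    print("Validation failed:", err.message)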
I look forward to seeing your comments.
—
James Hare
http://harej.co