[Wikitech-l] Codex code splitting: ResourceLoader now supports loading individual Codex components

2024-01-16 Thread Roan Kattouw
Hello everyone,

The Design Systems Team is pleased to announce that ResourceLoader now
supports basic code splitting for Codex
<https://www.mediawiki.org/wiki/Codex> components. The way this works is
similar to how Codex icons already work: you list the components you need
in your ResourceLoader module definition, and only those components will
be bundled. This should help performance, since only the code your feature
actually needs is loaded, not the entire library. Most new and existing
uses of Codex should be able to use this feature. If you maintain code
that uses Codex, we recommend trying this out!
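
As a rough sketch (the module name and file names here are made up, and
the exact keys should be verified against the on-wiki documentation), a
module definition in extension.json using this feature might look like:

```json
{
	"ResourceModules": {
		"ext.myFeature": {
			"packageFiles": [ "init.js" ],
			"codexComponents": [ "CdxButton", "CdxTextInput" ],
			"dependencies": [ "vue" ]
		}
	}
}
```

Inside the module's JavaScript, the listed components can then be pulled
in with something like `const { CdxButton } = require( '@wikimedia/codex' );`
(again, see the documentation for the authoritative syntax).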

There's documentation for how to use this feature on the Codex page on
mediawiki.org
<https://www.mediawiki.org/wiki/Codex#Requiring_Codex_components_in_MediaWiki_and_extensions>,
and you can also look at the CodexExample extension
<https://gitlab.wikimedia.org/repos/design-systems/CodexExample> for a more
complete example. If you have any questions or run into any issues while
trying it out, we're here to help: feel free to reply to this email or
post on the Codex talk page on mediawiki.org
<https://www.mediawiki.org/wiki/Talk:Codex>.

So far we have only implemented the most basic iteration of code splitting.
We plan on building some more advanced features to help with deduplication
of components as well. For more details about our plans, see the epic task
<https://phabricator.wikimedia.org/T349423> on Phabricator.

Roan Kattouw
Software engineer on the Design Systems Team at the Wikimedia Foundation
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Announcing Codex 1.0

2023-10-25 Thread Roan Kattouw
Today the Design Systems Team is announcing the release of Codex 1.0!
What is Codex?

Codex is the new design system
for Wikimedia. Over the past 2 years, the Design Systems Team and
contributors from the Wikimedia Foundation, Wikimedia Deutschland, and the
volunteer communities have collaborated to create a centralized design
system to serve Wikimedia projects. Codex provides more equitable
experiences for all Wikimedia movement participants, and makes it easier
and faster to design and build consistent user interfaces. With Codex, we
aim to enable more people to contribute to the mission.

Codex provides a library of design tokens, user interface components, and
a catalog of icons to use with these components. Through the Codex Figma
libraries, designers can reuse these shared components, tokens, and assets
in their designs. For developers, Codex provides components built with
Vue.js, as well as some CSS-only components that do not require JavaScript
to use.

Codex is already being used for Wikifunctions, Vector 2022, the Growth
Mentor Dashboard and Impact Module, the New Pages Feed, MediaSearch,
NearbyPages, QuickSurveys, and ReadingLists. Projects currently under
development using Codex include Accessibility for reading and the Incident
Reporting System.

Codex provides a set of core components that cover a wide range of
Wikimedia user interface needs, but it does not necessarily provide
equivalents of all components in OOUI. If you find that a component you
were expecting to use is missing, please talk to the Design Systems Team
and we'd be happy to help you. We strongly encourage contribution to
Codex, in line with our vision of being a collaborative project guided by
stewardship.
Why 1.0 now?

The Design Systems Team has been working towards this milestone for a
number of months. Based on early feedback, we’ve already made improvements
to the developer experience of using Codex, like providing ready-to-go
example repos showing how to use Codex in a MediaWiki extension, and
changing the code snippets on the doc site for easier copy-paste into
non-TypeScript and MediaWiki-specific projects.

We also dedicated significant time and effort to ensuring that Codex
components and assets conform to WCAG 2.1 AA and other accessibility
standards, and we plan to engage with groups like the American Foundation
for the Blind on further improvements.

We’ve been consolidating various design resources (like the design style
guide) into Codex so there is a
single source of truth for Wikimedia front-end development and design
standards. This is to clarify Codex’s role not just as a UI library, but as
the design system for Wikimedia.
Who should use Codex?

Everyone! Some foundational elements like design tokens can and should be
used in all Wikimedia software going forward. Most of the MediaWiki code
that uses the older pre-token variables from mediawiki.ui has already been
migrated to use the Codex tokens instead. The Codex wiki documentation
has more information about using design tokens (and other elements) in
MediaWiki.
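
As an illustrative sketch (the selector is hypothetical, and the token
names should be verified against the Codex tokens documentation), using
the tokens from an extension or skin stylesheet looks roughly like this:

```less
// Hypothetical stylesheet for a MediaWiki extension or skin.
// mediawiki.skin.variables.less exposes the Codex design tokens to Less.
@import 'mediawiki.skin.variables.less';

.my-feature-notice {
	color: @color-progressive;
	background-color: @background-color-base;
	border: @border-width-base @border-style-base @border-color-base;
}
```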

At this time, Codex components are most suitable for client-side features
that don't have complex requirements for non-JavaScript support, or for
server-rendered interfaces that don't need much interactivity. Features
requiring both 

[Wikitech-l] Soliciting input on server rendering of user interfaces discussion (includes server-side rendering)

2022-11-01 Thread Roan Kattouw
Hello everyone,

In September, I submitted a proposal to the Technical Decision Making
Process called "Modern interfaces for all users" [1]. This proposal was
somewhat abstract, but it was about different ways we could render user
interfaces on the server, given that our server-side code is written in
PHP, but most UI code is written in JavaScript (including Codex[2], the new
library of shared UI components being developed by the Design Systems Team).

One solution that many people have been requesting for some time is Vue's
server-side rendering (SSR) feature[3]. Supporting Vue SSR in MediaWiki
will require in-depth architecture discussions, and I think that will
eventually be part of this TDMP too. However, there are also other
approaches for rendering UI components on the server which may be more
appropriate, depending on the situation. So before we delve into
architecture and implementation for Vue SSR, we should first discuss which
methods of rendering UIs on the server we want to use in which situations,
both to confirm that we really do need Vue SSR, and to inform the
architecture discussion.

To that end, I've started a discussion on a Phabricator task where I list
different server rendering methods and make a simple (probably too simple)
proposal for when to use which. If you're interested in this topic, please
go to https://phabricator.wikimedia.org/T322163 and participate in the
discussion there.
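
To give a feel for what "rendering on the server" means in general (this
is a deliberately toy, hypothetical sketch, not Vue's actual SSR API and
not any real MediaWiki interface): the server produces a component's HTML
as a string from its state, so the browser receives markup that works
before any JavaScript runs and can later be enhanced client-side.

```javascript
// Minimal HTML-escaping helper for the sketch below.
function escapeHtml( s ) {
	return String( s ).replace( /[&<>"]/g, ( c ) => ( {
		'&': '&amp;', '<': '&lt;', '>': '&gt;', '"': '&quot;'
	}[ c ] ) );
}

// "Server render": emit the same markup a client framework would
// generate, as a plain string that can be sent in the HTTP response.
function renderButton( props ) {
	return '<button class="cdx-button">' + escapeHtml( props.label ) + '</button>';
}

console.log( renderButton( { label: 'Save & continue' } ) );
// The client then attaches event handlers to this markup ("hydration").
```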

Roan Kattouw
Lead engineer on the Design Systems Team at the Wikimedia Foundation

[1] https://phabricator.wikimedia.org/T317270
[2] https://www.mediawiki.org/wiki/Codex
[3] https://vuejs.org/guide/scaling-up/ssr.html

[Wikitech-l] MediaWiki now supports ES6

2021-03-16 Thread Roan Kattouw
The Vue migration team is pleased to announce that MediaWiki finally has
built-in support for writing modern JavaScript with ES6.

Up until now, all JavaScript code in MediaWiki has been written in ES5 (a
version of JavaScript standardized in 2009), because MediaWiki maintains
support for some browsers that don't support ES6. Over the years this has
become increasingly frustrating for developers, because ES6 (standardized
in 2015) adds new language features that make writing code easier and more
pleasant, but these features couldn't be used in MediaWiki. A lot of modern
code and tools are now written in ES6, and browsers that don't support ES6
have become obsolete. At this point, the only significant non-ES6 browser
that MediaWiki continues to support is Internet Explorer 11, whose usage is
relatively low, but not yet quite low enough for us to drop support for it.
In keeping with the IE11 announcement[1] from earlier this month, we will
continue to support IE11, but some new features will not support it.

What this means for developers is that you can now use ES6 code in
MediaWiki core, extensions, and skins, as long as it's in a feature that
doesn't need to support IE11. ResourceLoader modules that use ES6 code have
to be flagged as such, and you will need to put ES6 code in a separate
directory so that different eslint rules can be applied. For detailed
instructions on how to start using ES6 in your code, see
https://www.mediawiki.org/wiki/ResourceLoader/ES6 .
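
For instance, code in an ES6-flagged module can now use `const`/`let`,
arrow functions, template literals, and classes (an illustrative snippet,
not tied to any real module):

```javascript
// Arrow function combined with a template literal.
const greet = ( name ) => `Hello, ${ name }!`;

// Class syntax, previously unavailable in ES5-only MediaWiki code.
class Counter {
	constructor() {
		this.count = 0;
	}
	increment() {
		this.count++;
		return this.count;
	}
}

const counter = new Counter();
// Array.prototype.map with an arrow callback.
const results = [ 1, 2, 3 ].map( ( n ) => n * counter.increment() );

console.log( greet( 'Wikitech' ) ); // → Hello, Wikitech!
console.log( results ); // → [ 1, 4, 9 ]
```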

Unfortunately, ES6 code is not yet supported in on-wiki JavaScript
(Gadgets, user scripts, and site scripts), because we need to check that
these scripts are syntactically valid, and our current validator only
understands ES5 [2].

To make this possible, we added functionality[3] to ResourceLoader to allow
modules to mark themselves as ES6-only, and prevent those modules from
being loaded in browsers that don't support ES6. We also made significant
changes[4] to our JavaScript minifier[5] to support the minification of ES6
code, which previously generated invalid output[6] when confronted with ES6
syntax. And we updated our eslint ruleset[7] to add a linter configuration
for ES6 code[8]. Thank you to Timo Tijhof (Krinkle) and DannyS712 for code
reviewing the ResourceLoader and minifier changes, to Lucas Werkmeister for
finding a critical bug[9] in the minifier change, and to James Forrester
and Ed Sanders for code reviewing the linter changes and releasing new
versions of all these packages.

Roan Kattouw
On behalf of the Vue migration team (Anne Tomasevich, Eric Gardner, Volker
Eckl, and myself)

[1] https://www.mediawiki.org/wiki/Compatibility/IE11
[2] https://phabricator.wikimedia.org/T75714
[3] https://gerrit.wikimedia.org/r/c/mediawiki/core/+/657953
[4] https://gerrit.wikimedia.org/r/c/mediawiki/libs/Minify/+/664700
[5] https://github.com/wikimedia/Minify/
[6] https://phabricator.wikimedia.org/T26
[7] https://github.com/wikimedia/eslint-config-wikimedia
[8] https://github.com/wikimedia/eslint-config-wikimedia/pull/358
[9] https://phabricator.wikimedia.org/T277161


Re: [Wikitech-l] How to invalidate a ResourceLoader URL that depends on user-generated content?

2020-02-12 Thread Roan Kattouw
Yes, it takes 5 minutes for just about everything to update in
ResourceLoader. This is also true for ResourceLoaderSiteStylesModule: when
you save an edit to e.g. MediaWiki:Common.css, you don't immediately see
the styles change. However, previewing edits to MediaWiki:Common.css does
work, because of some special handling.

When previewing, EditPage sets a content override (using
OutputPage::addContentOverride()) for the page being previewed, with the
text that's in the edit box. This content override is then propagated to
the ResourceLoaderContext object by OutputPage::getRlClientContext().
ResourceLoaderWikiModule checks these content overrides before accessing
wiki pages (which is how it gets the content from the edit box instead of
the content that's on the wiki page), and it also causes
shouldEmbedModule() to return true if there's a content override that
applies to it (which is how it avoids requesting the module from load.php
and instead embeds it as an inline 

Re: [Wikitech-l] Unbreak now! problem in this week train Watchlist

2019-03-14 Thread Roan Kattouw
This sounds like it could have been caused by
https://gerrit.wikimedia.org/r/c/mediawiki/core/+/416198

‪On Thu, Mar 14, 2019 at 10:29 AM ‫יגאל חיטרון‬‎ 
wrote:‬

> Hello. There is a regression problem, that started on this week deployment.
> I can see it in group 1 from yesterday evening. I do not file a phabricator
> ticket, because there is no algorithm to reproduce the problem.
>
> From time to time the API post query "mark this revision as read" does not
> work. In these times, there is a reproducing algorithm:
> 1) Open Special:Watchlist.
> 2) Pick an unread revision.
> 3) Open the diff to last version, or the view mode of the page.
> 4) Expected: the revision, and the whole page, should be automatically
> marked as read.
> 5) Refresh the Special:Watchlist.
> 6) Got: the revision remains bold.
> 7) Open API query for unread revisions, the relevant one is still there.
> 8) Try paragraphs 5-7 every 5 seconds.
> 9) In about twenty minutes the data is updated correctly.
> I saw this yesterday at about 19:45 UTC, and it happens again just now.
> Thank you.
> Igal (User:IKhitron)
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Ops] Changes to SWAT deployment policies, effective Monday April 30th

2018-04-27 Thread Roan Kattouw
Will multi-file, single-directory syncs still be allowed? In other words,
can I deploy a change to the Foo extension that touches many files with
scap sync-dir extension/Foo ?

On Thu, Apr 26, 2018, 15:15 Greg Grossmeier  wrote:

> Hello,
>
> I have made two changes to SWAT policies today.
>
> First, we now disallow multi-sync patch deployments. See T187761[0].
> This means that the sync order of files is determined by git commit
> parent relationships (or Gerrit's "depends-on"). This is to prevent SWAT
> deployers from accidentally syncing two patches in the wrong order.
>
> Second, we are reducing the number of allowed patches from 8 to 4. This
> is to reduce stress on the SWAT deployer as well as set expectations for
> requesters on the pace of the windows. See the approximate best case
> time spent breakdown[1] for how we came to this number.
>
> I've updated the on-wiki documentation on wikitech[2][3].
>
>
> Thank you for flying scap,
>
> Greg
>
>
> [0] https://phabricator.wikimedia.org/T187761
> [1]
> * +2/Wait for Jenkins to merge - 2 min
> * prepare git on tin - 1 min
> * Deploy to mwdebug - 1 min
> * Verify on mwdebug - 3 min
> * Deploy to production - 1 min
> * Verify & wait/watch logs - 2 min
> [2]
> https://wikitech.wikimedia.org/w/index.php?title=SWAT_deploys=prev=1789212
> [3]
> https://wikitech.wikimedia.org/w/index.php?title=SWAT_deploys=next=1789212
>
> --
> | Greg GrossmeierGPG: B2FA 27B1 F7EB D327 6B8E |
> | Release Team ManagerA18D 1138 8E47 FAC8 1C7D |
>
> ___
> Ops mailing list
> o...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/ops
>

Re: [Wikitech-l] First step for MCR merged: Deprecate and gut the Revision class

2017-12-22 Thread Roan Kattouw
Thanks!

Context for those who don't want to read a couple hundred lines of IRC
logs: I was cooped up in the house all day and noticed it was about to get
dark, so I really did take a walk (relatively abruptly) after dealing with
the worst of the issue. During this walk I thought things over and realized
what the explanation for the early symptoms of this bug was, and I
explained it to Adam on IRC when I got back.

On Fri, Dec 22, 2017 at 6:00 PM, C. Scott Ananian 
wrote:

> Having just read through T183252, I feel Roan deserves a big hand for his
> "I take a walk and become Sherlock Holmes" detective work and "I'm just
> like Indiana Jones, except with code not tombs and bugs not snakes" code
> archaeology.
>
> I love working with smart folks.
>   --scott
>
> On Fri, Dec 22, 2017 at 11:37 AM, Chad  wrote:
>
> > Considering the code just landed last night and a good number of us are
> > going to be gone for vacation--is rolling this out with the Jan 2nd
> deploy
> > a little aggressive? I'd love to see this sit on beta (with eyes on it)
> for
> > a little longer. Or a way to deploy to testwiki etc independent of major
> > sites?
> >
> > The first deploy after a holiday break is already pretty exciting, and
> > rolling this out feels like something that could use a dedicated window.
> >
> > (Otherwise, kudos to the MCR team for hitting this milestone)
> >
> > -Chad
> >
> > On Fri, Dec 22, 2017 at 2:27 AM Daniel Kinzler <
> > daniel.kinz...@wikimedia.de>
> > wrote:
> >
> > > Hello all!
> > >
> > > Addshore last night merged the patch[1] that is the first major step
> > > towards
> > > Multi-Content-Revisions[2]: it completely guts the Revision class and
> > > turns it
> > > into a thin proxy for the new RevisionStore service. The new code is
> now
> > > live
> > > on beta.
> > >
> > > This is our second attempt: The first one, on December 18th, thoroughly
> > > corrupted the beta database. It took us some time and a lot of help
> from
> > > Aaron
> > > and especially Roan to figure out what was happening. A detailed
> > > post-mortem by
> > > Roan can be found at [3].
> > >
> > > Anyway - this stage of MCR development introduces the new
> multi-revision
> > > capable
> > > interface for revision storage (and blob storage) [4]. It does not yet
> > > introduce
> > > the new database schema, that will be the next step [5][6]. While doing
> > the
> > > refactoring, I tried to keep the structure of the existing code mostly
> > > intact,
> > > just moving functionality out of Revision into the new classes, most
> > > importantly
> > > RevisionRecord, RevisionStore, and BlobStore.
> > >
> > > Beware that with the next deployment (due January 2nd) the live sites
> > will
> > > start
> > > using the new code. Please keep an eye out for any strangeness
> regarding
> > > revision handling. Adam greatly improved test coverage of the relevant
> > code
> > > (thanks Adam!), but it's always possible that we missed some edge case,
> > > maybe
> > > something about archived revisions that were partially migrated from on
> > old
> > > schema or something similarly fun.
> > >
> > > Exciting times!
> > >
> > > Cheers
> > > Daniel
> > >
> > >
> > > [1] https://gerrit.wikimedia.org/r/#/c/399174/
> > > [2]
> > > https://www.mediawiki.org/wiki/Requests_for_comment/
> > Multi-Content_Revisions
> > > [3] https://phabricator.wikimedia.org/T183252#3853749
> > > [4] https://phabricator.wikimedia.org/T174025
> > > [5] https://phabricator.wikimedia.org/T174024
> > > [6] https://phabricator.wikimedia.org/T174030
> > >
> > >
> > > --
> > > Daniel Kinzler
> > > Principal Platform Engineer
> > >
> > > Wikimedia Deutschland
> > > Gesellschaft zur Förderung Freien Wissens e.V.
> > >
> > >
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
>
>
>
> --
> (http://cscott.net)
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>

Re: [Wikitech-l] Wikis paused at 1.28.0-wmf.18 (was: Re: Upgrade of 1.28.0-wmf.19 to group 1 is on hold)

2016-09-20 Thread Roan Kattouw
What are your plans for when you'll deploy wmf.19, branch wmf.20 and deploy
wmf.20?

I ask because I have a patch lined up that I wanted to merge today right
after the wmf.20 branch cut, because I want it to spend the full week on
beta labs before it goes to production. I can merge that patch today and
have it go to beta, and have it go into wmf.20 once it gets cut, but I'd
only be willing to do that if I had a guarantee that wmf.20 would not be
cut before early next week. Because it's currently unknown when wmf.20 will
be cut, I don't know what's safe to do.

Unrelatedly, I also have a patch merged in master that fixes a regression
in wmf.19. Since wmf.19 is not deployed right now, the regression doesn't
affect production, but if wmf.19 is ever going to be deployed, then the fix
needs to be cherry-picked into it. Do you know if wmf.19 is going to be
deployed, or is it going to be skipped? Since wmf.19 is not deployed right
now, should I submit a cherry-pick to it for SWAT today so that it gets
unbroken in the event it ever gets deployed?

I understand that you are investigating and will reevaluate after, but it
would be nice to have these questions answered quickly so we have some
sense of what is and isn't going to happen, and aren't blocked by this
uncertainty for much longer.

On Tue, Sep 20, 2016 at 12:27 PM, Tyler Cipriani 
wrote:

> tl;dr: All wikis are staying at 1.28.0-wmf.18 for now
>
> Last week all wikis were rolled back to MediaWiki version 1.28.0-wmf.18
> due to several problems that were spotted on Friday (2016-09-16)[0][1].
>
> The problems with wmf.19 seemed resolved by Monday (2016-09-19). The
> plan was to roll wmf.19 out to all wikis yesterday afternoon and continue
> with the wmf.20 branch-cut today as scheduled.
>
> Late yesterday a large performance regression was discovered in
> wmf.18[2].
>
> We've paused the rollout of wmf.19 and the branching of wmf.20 to allow
> time to investigate the nature of this performance regression.
>
> After there is a better understanding of the performance regression, we
> will reevaluate our plans for branching and rollout of wmf.19 and wmf.20.
>
> -- Tyler
>
> [0]. https://phabricator.wikimedia.org/T111441
> [1]. https://phabricator.wikimedia.org/T145819
> [2]. https://phabricator.wikimedia.org/T146099
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Ops] Canary Deploys for MediaWiki

2016-07-25 Thread Roan Kattouw
On Jul 25, 2016 16:12, "Bryan Davis"  wrote:
>
> I think Alex is "more right" here. If you are introducing a new $wmgX
> var you really should always sync-file the changed InitialiseSettings
> file first and then the CommonSettings that uses it. There's no really
> good reason to spew a bunch of "undefined X" warnings and there is no
> guarantee with sync-dir that the files will be sent in the proper
> order.
>

That's true, but sometimes (either because multiple changes are made, or
things are removed, or whatever) there is no sync order that does not
produce errors. This isn't common but I have had it happen. In those cases,
you'll now be forced to use sync-dir.

Re: [Wikitech-l] [Ops] Canary Deploys for MediaWiki

2016-07-25 Thread Roan Kattouw
Note to deployers: when syncing certain config changes (e.g. adding a new
variable) that touch both InitialiseSettings and CommonSettings, you will
now need to use sync-dir wmf-config, because individual sync-files will
likely fail if the intermediate state throws notices/errors.

(It was a good idea to do this before, but it'll be more strongly enforced
now.)

On Jul 25, 2016 12:35, "Tyler Cipriani"  wrote:

> tl;dr: Scap will deploy to canary servers and check for error-log spikes
> in the next version (to be released Soon™).
>
> In light of recent incidents[0] which have created outages accompanied by
> large, easily detectable, error-rate spikes, a patch has recently landed in
> Scap[1] that will:
>
>1. Push changes to a set of canary servers[2] before syncing to proxy
> servers
>2. Wait a configurable length of time (currently 20 seconds[3]) for any
> errors to have time to make themselves known
>3. Query Logstash (using a script written by Gabriel Wicke[4]) to
> determine if the error rate has increased over a configurable threshold
> (currently 10-fold[5])
>
> Big thanks to the folks that helped in this effort: Gabriel Wicke, Filippo
> Giunchedi and Giuseppe Lavagetto, Bryan Davis and Erik Bernhardson (for
> their mad Logstash skillz)!
>
> It is noteworthy, that in instances where expedience is required—we're in
> the middle of an outage and who cares what Logstash has to say—the
> `--force` flag can be added to skip canary checks all together (i.e. `scap
> sync-file --force wmf-config/InitialiseSettings 'Panic!!'`).
>
> The RelEng team's eventual goal is still to move MediaWiki deployments to
> the more robust and resillient Scap3 deployment framework. There is some
> high-priority work that has to happen before the Scap3 move. In the
> interim, we are taking steps (like this one) to respond to incidents and
> keep deployments safe.
>
> Hopefully, this work and the error-rate alert work from Ori last week[6]
> will allow everyone to be more conscientious and more keenly aware of
> deployments that cause large aberrations in the rate of errors.
>
> <3,
> Your Friendly Neighborhood Release Engineering Team
>
> [0].
> https://wikitech.wikimedia.org/wiki/Incident_documentation/20160601-MediaWiki
> is the recent example I could find, but there have been others.
> [1]. https://phabricator.wikimedia.org/D248
> [2]. https://gerrit.wikimedia.org/r/#/c/294742/
> [3]. https://github.com/wikimedia/scap/blob/master/scap/config.py#L19
> [4]. https://gerrit.wikimedia.org/r/#/c/292505/
> [5]. https://github.com/wikimedia/scap/blob/master/scap/config.py#L18
> [6]. https://gerrit.wikimedia.org/r/#/c/300327/
>
> ___
> Ops mailing list
> o...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/ops
>

[Wikitech-l] Today's RFC meeting on IRC: Standardise on how to access/register JavaScript interfaces

2016-02-24 Thread Roan Kattouw
My apologies for the short notice. Normally we announce these more than one
hour in advance, but I forgot.

In today's RFC meeting, we will discuss the following RFC:

* Standardise on how to access/register JavaScript interfaces

The meeting will be on the IRC channel #wikimedia-office on
chat.freenode.net at the following time:

* UTC: Wednesday 22:00
* US PST: Wednesday 14:00
* Europe CET: Wednesday 23:00
* Australia AEDT: Thursday 09:00

Roan

Re: [Wikitech-l] i would like that i need not set standart spacing in my code

2015-10-14 Thread Roan Kattouw
On Wed, Oct 14, 2015 at 10:23 AM, Jon Robson  wrote:

> All code standards are currently being enforced by jscs.
> They recently closed an issue to add auto-formatting
> https://github.com/jscs-dev/node-jscs/issues/516


There's also the --fix flag to jscs, which automatically fixes
whitespace-only failures.

Re: [Wikitech-l] New RFC: better JS minification

2015-09-03 Thread Roan Kattouw
On Thu, Sep 3, 2015 at 4:15 PM, Jon Robson  wrote:

> Thanks, I thought I was alone with being confused by this e-mail. As
> Jérémie correctly states we'll likely to get __less__ bugs with a more
> maintained library.


Less than zero? No one has managed to find a single bug in
JavaScriptMinifier.php since 2011.

I have nothing against moving to uglify (provided that source maps come
first), but don't use false arguments as a reason for migrating.

Roan

Re: [Wikitech-l] Collaboration team reprioritization

2015-09-02 Thread Roan Kattouw
On Wed, Sep 2, 2015 at 6:51 AM, Risker  wrote:

> I am certain once the team has a chance to refocus, they may choose to
> examine workflows that are common across multiple Wikimedia projects that
> would benefit from improvement.  Off the top of my head, creating a
> "deletion" wizard in collaboration with a couple of large Wikipedias might
> be a starting point.


Obviously this work is still in very early stages, but deletion-related
processes are definitely on our radar. AfD on enwiki was one of our case
studies during early work on this, as you can tell from this slide in
Danny's Wikimania presentation:
https://wikimania2015.wikimedia.org/w/index.php?title=File:User%28s%29_Talk%28ing%29_-_Wikimania_2015.pdf=8


>   I suspect that the Collaboration team and the
> Community Tech team will find many overlaps in their work as they go
> forward.
>
>
Yes, we are aware that that is likely to happen, and we'll be talking to
them about that.

Re: [Wikitech-l] Collaboration team reprioritization

2015-09-02 Thread Roan Kattouw
On Wed, Sep 2, 2015 at 6:21 AM, David Gerard  wrote:

> Did the stuff to port LQT threads/pages to Flow ever make it to
> production quality?
>
>
It was used to convert all LQT pages on mediawiki.org, including
[[mw:Support desk]], which is probably the largest LQT page that has ever
existed. So it meets the "WMF uses it" definition of production quality, at
least.

I don't think we currently have good documentation on how you can convert
your own wiki, but AFAIK "simply" running the convertAllLqtPages.php
maintenance script on a wiki that has both LQT and Flow installed should
work.

Re: [Wikitech-l] Collaboration team reprioritization

2015-09-02 Thread Roan Kattouw
On Wed, Sep 2, 2015 at 11:20 AM, Roan Kattouw <roan.katt...@gmail.com>
wrote:

> I don't think we currently have good documentation on how you can convert
> your own wiki, but AFAIK "simply" running the convertAllLqtPages.php
> maintenance script on a wiki that has both LQT and Flow installed should
> work.
>

It turns out we do have some documentation:
https://www.mediawiki.org/wiki/Flow/Converting_LiquidThreads . It's a bit
more focused on how users will experience the conversion, but it does also
tell you which scripts to run.

Re: [Wikitech-l] [Wmfall] Marielle Volz joins Wikimedia as Software Developer

2014-10-27 Thread Roan Kattouw
On Mon, Oct 27, 2014 at 11:48 AM, Tomasz Finc tf...@wikimedia.org wrote:
 I am pleased to announce Marielle Volz joining the Wikimedia
 Foundation as a Software Engineer for the Editing team.

Welcome Marielle!

 and started developing a node.js service Citoid[5] to
 make it easy to insert citations using a URL/DOI/Title in VE/Wikitext.

To put that into perspective: she's working on a feature in VE that
will let you paste in a URL to, say, a New York Times article, and
will then automatically generate and insert a {{cite news}} template
for you with all the right information (title of the article, author,
date, etc.). Some of you may have seen the demo that she and Erik did
at a metrics meeting earlier this year. This is really exciting,
because it's a big step forward towards the goal of making even
relatively complex tasks like inserting citations more convenient in
VE than in wikitext.

Roan


Re: [Wikitech-l] RFC meeting this week

2014-10-08 Thread Roan Kattouw
On Tue, Oct 7, 2014 at 4:08 AM, Tim Starling tstarl...@wikimedia.org wrote:
 * Australia AEST: Thursday 07:00

Ironically, Tim miscalculated the time in his own timezone :D it's 08:00 AEST.

Roan


[Wikitech-l] Tell my favorite conference about your Wikimedia tech

2014-06-12 Thread Roan Kattouw
The linux.conf.au conference, which I have presented to and love going
to, just opened up its call for talks: http://linux.conf.au/cfp . They
want talks about all kinds of open source programming stuff, not just
Linux: Trevor and I presented about ResourceLoader in 2012, and James
and I presented about VisualEditor in 2014.

LCA 2015 will be 12-16 January in Auckland, New Zealand. That's summer
in the Southern Hemisphere, and the climate there is very moderate,
so no sweltering heat like in Australia :)

LCA provides travel funding to some speakers, and Wikimedia funds TPS
grants for people going to conferences to talk about
Wikimedia-related things: https://meta.wikimedia.org/wiki/Grants:TPS .
So you could even get to go for free.

LCA is basically my favorite conference - the talks are great and of
high quality, they treat speakers well, and their audience gets what
we're doing. This is a fun chance to show off your project. I
encourage you to suggest a talk, or tell someone else that they should
talk about their project.

Roan

P.S.: Thanks to Sumana for writing most of this email, and for talking
me into proposing a talk for LCA in the first place back in 2011

Re: [Wikitech-l] Tell my favorite conference about your Wikimedia tech

2014-06-12 Thread Roan Kattouw
On Jun 12, 2014 4:08 PM, Luis Villa lvi...@wikimedia.org wrote:
 I got a balloon ride the year I spoke. It was trumped the next year by a
 helicopter ride. Definitely an amazing trip.

Wow, when was that?! I got no such thing :)

In 2012 they did take all the speakers to a show (and dinner) at a staged
Gold Rush town (which is *the* tourist attraction in Ballarat); in 2014 it
was just a dinner at the local marina.

Roan
Re: [Wikitech-l] Bot flags and human-made edits

2014-05-19 Thread Roan Kattouw
On Mon, May 19, 2014 at 4:09 PM, Dan Garry dga...@wikimedia.org wrote:
1. When editing via the API, allows the user to choose whether or not to
flag an edit as a bot edit using the bot parameter.
I'm responsible for this part of the mess. I don't remember why it was
done this way though. I think the people that I took over the API edit
code from had put this feature in and I kept it.

What I think was going on is that no one could be bothered to figure
out how to represent "I am a human editing using a bot account and I'd
like this edit to not be bot-flagged" in the UI, so it wasn't done.
But in the API it was trivial (bot=1), so it was done.
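
To make the asymmetry concrete: opting into the bot flag through the API is just one extra request parameter. A minimal sketch of the request shape (action=edit and bot are the real MediaWiki API parameter names; the title, text, and token values are placeholders, not a working request):

```javascript
// Sketch of an API edit request that opts into the bot flag.
// 'bot' is only honored if the editing account actually has the bot right;
// the title, text, and token values below are placeholders.
const params = new URLSearchParams({
  action: 'edit',
  title: 'Sandbox',
  text: 'New page text',
  bot: '1',        // the trivial part: one parameter flags the edit as a bot edit
  format: 'json',
  token: '...'     // a real edit token would be required
});
console.log(params.get('bot')); // → '1'
```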

That's just my recollection of it and it's probably flawed, this was
6-7 years ago.

Roan

Re: [Wikitech-l] [Wmfall] Rob Moen takes on new role in Growth team

2014-05-08 Thread Roan Kattouw
Thanks for all your work on VE, Rob, and best of luck on the growth team!

Roan
Re: [Wikitech-l] Roadmap and deployment highlights - Week of April 14th

2014-04-11 Thread Roan Kattouw
On Fri, Apr 11, 2014 at 4:15 PM, Greg Grossmeier g...@wikimedia.org wrote:
 * VisualEditor fixes:
 ** Adding a reference adds an empty one; editing a reference inserts a
new one
 ** fix JS error on opening redirect pages

+Enable VisualEditor on French Wikinews per community request

Re: [Wikitech-l] Grrrit-wm down for a few hours

2014-03-17 Thread Roan Kattouw
On Mon, Mar 17, 2014 at 10:33 AM, Yuvi Panda yuvipa...@gmail.com wrote:
 Hello!

 I was a terrible maintainer, and forgot to migrate grrrit-wm to eqiad
 tools before the cutoff date. As a result, it will be down for a few
 hours as Coren does a batch migrate of all the things. Apologies for
 the disruption

The migration seems to have finished (for lolrrit-wm at least) 20
minutes ago. I kicked it and it now seems to be back up.

Roan

Re: [Wikitech-l] [Wmfsf] Announcement: Kunal Mehta joins Wikimedia as Features Contractor

2013-12-12 Thread Roan Kattouw
On Thu, Dec 12, 2013 at 5:08 PM, Terry Chay tc...@wikimedia.org wrote:
 Hello everyone,

 It’s with great pleasure that I’m announcing that Kunal Mehta[1] has joined
 the Wikimedia Foundation as contractor in Features Engineering.

Sweet! Welcome, Kunal!

Roan

Re: [Wikitech-l] Module storage is coming

2013-12-03 Thread Roan Kattouw
On Tue, Dec 3, 2013 at 12:30 AM, Ori Livneh o...@wikimedia.org wrote:
 We'll gradually enable module storage on all Wikimedia wikis over the
 course of the next week or two.
Ori deployed this to the live site earlier today :) . For reference,
the original post about module storage is archived at [1].

I tweeted an impressive graph from Ganglia [2] that Ori shared on IRC
a little while after the deployment, and consequently my phone is now
blowing up as lots of people are retweeting it and replying to it.
Turns out lots of people are interested in Wikipedia :)

However, while the graph was impressive, there were some caveats:
* It was of internal traffic into the Apache backends serving bits,
not the Varnish caches
* The Varnish traffic (which actually gets all the traffic) dropped
suddenly but not very sharply, and the drop was insignificant compared
to time-of-day and day-of-week variance
* The drop occurred because ResourceLoaderLanguageDataModule had a bug
in its mtime computation, causing it to recache all the time; module
storage greatly dampened the impact of that bug. The bug was
identified later that day and fixed by Krinkle [3], in the commit with
probably the highest commit-message-to-diff ratio of all time.

Although it wasn't really responsible for the huge drop we saw in
the graphs, make no mistake, this is awesome. Thanks Ori, for working
on this and putting up with my code review slowness and nitpicking :)

One interesting response I got on Twitter [4] said we should avoid
localStorage in favor of indexedDB and modern async APIs. I suppose
we could look into that :)
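
For readers unfamiliar with module storage: the core idea reduces to caching each module's code in localStorage, keyed by module name and invalidated by version, so repeat visits can skip the network. The following is an illustrative reduction with invented key names and function shape, not MediaWiki's actual implementation:

```javascript
// Illustrative reduction of module storage (key layout and function names
// are invented, not MediaWiki's): cache module source in localStorage,
// keyed by module name and invalidated when the version changes.
const store = typeof localStorage !== 'undefined'
  ? localStorage
  : (() => {
      // In-memory stand-in so the sketch also runs outside a browser.
      const m = new Map();
      return {
        getItem: (k) => (m.has(k) ? m.get(k) : null),
        setItem: (k, v) => m.set(k, v)
      };
    })();

function getModule(name, version, fetchSource) {
  const key = 'modulestore:' + name;
  const raw = store.getItem(key);
  if (raw) {
    const cached = JSON.parse(raw);
    if (cached.version === version) {
      return cached.source; // cache hit: no network request needed
    }
  }
  const source = fetchSource(name); // miss or stale: fetch, then cache
  store.setItem(key, JSON.stringify({ version: version, source: source }));
  return source;
}
```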

Roan

[1] http://lists.wikimedia.org/pipermail/wikitech-l/2013-November/072839.html
[2] https://twitter.com/catrope/status/408018210529615872
[3] https://gerrit.wikimedia.org/r/#/c/99010/
[4] https://twitter.com/andreasgal/status/408108587320623104

Re: [Wikitech-l] VE/Migration guide for gadgets developers

2013-09-25 Thread Roan Kattouw
On Tue, Sep 24, 2013 at 11:45 AM, Sumana Harihareswara
suma...@wikimedia.org wrote:
 Roan, how did that go? Any links? :-)

The code ended up being written and merged. The documentation ended up
not being written before Wikimania, and then Life happened. Some Time
Soon I hope to get around to actually writing this up properly.

In the meantime, https://gerrit.wikimedia.org/r/#/c/75270/ fixes a
ResourceLoader bug that fatally breaks VE's plugin infrastructure for
on-wiki scripts (Gadgets and user/site JS), so until that's merged and
deployed, the interface I'd be documenting doesn't actually work
anyway.

Another thing I'm gonna do soon is take the GSoC project code for Math
and SyntaxHighlight and move it out of the VisualEditor extension into
the respective extensions. That'll be a good showcase for the
extension plugin integration, which does already work. Right after I
get that done would be a good time to document this, because I'll have
just gone through the process and know the pitfalls etc. first-hand.

Roan

Re: [Wikitech-l] Killing 1.XXwmfYY branches -- another idea?

2013-09-25 Thread Roan Kattouw
On Wed, Sep 25, 2013 at 2:53 PM, Chad innocentkil...@gmail.com wrote:
 wmf-foo - 1.22wmf19
 wmf-bar - 1.22wmf20
 wmf-baz - 1.22wmf21
 wmf-foo - 1.22wmf22
 wmf-bar - 1.22wmf23

This looks like it's exactly the same concept as slot0/slot1/slot2 in
Ryan's git-deploy proposal.

The objection that I would have to this is that it makes the
cherry-picking workflow less intuitive. You have to know which of
wmf-{foo,bar,baz} to cherry-pick to, rather than being able to
cherry-pick to wmf23 or whatever. Also, because the wmfN tags can only
be created after they're no longer live (because only at that point do
we know they'll have stopped changing), you can't actually find the
*current* wmfN anywhere, because it won't have been tagged yet.

The motivation is to stop making new branches all the time, but tags
and branches are equally cheap. A tag is just a frozen branch that
can't advance. If you're trying to tag something but it can still
change, use a branch. And that's what our deployment branches are,
IMO.

Roan

Re: [Wikitech-l] Killing 1.XXwmfYY branches -- another idea?

2013-09-25 Thread Roan Kattouw
On Wed, Sep 25, 2013 at 3:46 PM, Chad innocentkil...@gmail.com wrote:
 I actually like this idea a lot and it's way less confusing than my idea.
 Unless anyone's got any objections I'm going to go ahead and do this
 for all the 1.20 and 1.21 wmf branches.

Sounds good to me.

Roan

Re: [Wikitech-l] Git migration: complete

2013-07-26 Thread Roan Kattouw
On Fri, Jul 26, 2013 at 2:57 PM, Chad innocentkil...@gmail.com wrote:
 With the moving of pywikibot to Gerrit today, the last projects actively
 using SVN have
 been moved over to Git. As such, SVN is now a read-only service and the
 migration is
 done (of course any old SVN projects can always be migrated to Git/Gerrit,
 that's easy).

\o/

 Thanks to everyone for your patience, help, bug reports, blood, sweat,
 tears, firstborn
 children...whatever you gave to this monumental effort over the past year
 and a half.

And thank you, for giving this migration more blood, sweat and tears
than anyone else!

Roan

Re: [Wikitech-l] dirty diffs and VE

2013-07-25 Thread Roan Kattouw
On Wed, Jul 24, 2013 at 2:49 PM, C. Scott Ananian
canan...@wikimedia.org wrote:
 For what it's worth, both the DOM serialization-to-a-string and DOM
 parsing-from-a-string are done with the domino package.  It has a
 substantial test suite of its own (originally from
 http://www.w3.org/html/wg/wiki/Testing I believe).  So although the above
 is probably worth doing as a low-priority task, it's really a test of the
 third-party library, not of Parsoid.  (Although, since I'm a co-maintainer
 of domino, I'd be very interested in fixing any bugs which it did turn up.)

I didn't mean it as a test of Domino, I meant it as a test of Parsoid:
does it generate things that are then foster-parented out, or other
things that a compliant DOM parser won't round-trip? It's also a more
realistic test, because the way that Parsoid is actually used by VE in
practice is that it serializes its DOM, sends it over the wire to VE,
which then does things with it and gives an HTML string back, which is
then parsed through Domino. So even in normal operation, ignoring the
fact that VE runs stuff through the browser's DOM parser, Parsoid
itself already round-trips the HTML through Domino, effectively.

 The foster parenting issues mostly arise in the wikitext-parsoid DOM
 phase.  Basically, the wikitext is tokenized into a HTML tag soup and then
 a customized version of the standard HTML parser is used to assemble the
 soup into a DOM, mimicking the process by which a browser would parse the
 tag soup emitted by the current PHP parser.  So the existing test suite
 does expose these foster-parenting issues already.
Does it really? There were a number of foster-parenting issues a few
months ago where Parsoid inserted <meta> tags in places where they
can't be put (e.g. inside <tr>s), and no one on the Parsoid team seemed to
have noticed until I tracked down a few VE bugs to that problem.

Roan

Re: [Wikitech-l] dirty diffs and VE

2013-07-24 Thread Roan Kattouw
On Wed, Jul 24, 2013 at 3:10 AM, Marc Ordinas i Llopis
marc...@wikimedia.org wrote:
 As Subbu said, I'm currently working on improving the round-trip test
 server, mostly on porting it from sqlite to MySQL but also on expanding the
 stats kept (with things like performance, etc.). If you think of some other
 data we should track, or any new report we could add, we certainly welcome
 suggestions :) Please open a new bug or add to the existing one:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=46659

Thanks for working on this! The Parsoid testing infrastructure is
pretty awesome.

There are a few things I wish it tested, but they're mostly about how
it tests things rather than what data is collected. For instance, it
would be nice if the round-trip tests could round-trip from wikitext
to HTML *string* and back, rather than to HTML *DOM* and back. This
would help catch cases where the DOM doesn't cleanly round-trip
through the HTML parser (foster-parenting for instance). It may be
that this is already implemented, or that it was considered and
rejected, I don't know.
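
The shape of such a string-level round-trip test can be sketched with stand-ins (every function name here is invented; a real test would use Parsoid's serializer and a DOM parser such as Domino):

```javascript
// Toy illustration of the proposed test shape, not Parsoid's API:
// wikitext -> HTML string -> DOM parse -> serialize -> back to wikitext.
// A DOM parser that doesn't round-trip the string faithfully (e.g. due
// to foster-parenting) will make the comparison fail.
function roundTripsCleanly(wikitext, toHtml, parseHtml, serializeDom, toWikitext) {
  const html = toHtml(wikitext);
  const reserialized = serializeDom(parseHtml(html)); // may differ from html
  return toWikitext(reserialized) === wikitext;
}
```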

Additionally, it might be helpful to have some tests looking for null
DSRs or other broken data-parsoid stuff (because this breaks selser),
and/or some sort of selser testing in general (though off the top of
my head I'm not sure what that would look like). Another fun
serialization test that could be done is stripping all data-parsoid
attributes and asserting that this doesn't result in any semantic
diffs (you'll get lots of syntactic diffs of course).
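
That last check is straightforward to sketch: drop every data-parsoid attribute from the serialized HTML and verify the semantic content survives. A minimal, regex-based illustration (a real test would compare parsed DOMs rather than strings):

```javascript
// Illustrative only: strip data-parsoid attributes from serialized HTML.
// A real implementation would operate on the DOM rather than on strings,
// and would then diff the stripped and original documents semantically.
function stripDataParsoid(html) {
  return html.replace(/\s+data-parsoid=(?:"[^"]*"|'[^']*')/g, '');
}

const input = '<p data-parsoid="{&quot;dsr&quot;:[0,5]}">Hello</p>';
console.log(stripDataParsoid(input)); // → '<p>Hello</p>'
```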

 Or just drop by #wikimedia-parsoid, I'm marcoil there.

The channel is #mediawiki-parsoid :)

Roan

Re: [Wikitech-l] Remove 'visualeditor-enable' from $wgHiddenPrefs

2013-07-23 Thread Roan Kattouw
On Tue, Jul 23, 2013 at 12:44 PM, Daniel Barrett d...@vistaprint.com wrote:
 Risker asks:
Why do you think those nowiki tags were added by the editors?

 I can't speak for the original poster, but the last time I used VE,
 it added unwanted nowiki tags by itself.
 You can see an example in my most recent bug report:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=51829

Of course those nowiki tags weren't added by the editors, VE doesn't
let you do that directly. What I think Robert was talking about
(thanks for that analysis, BTW!) is edits where the user typed
something like [[Foo]] into the editor, and Parsoid, in order to
achieve a truthful rendering of [[Foo]], wraps it in nowiki tags.
That's a case of we did what the user asked us to do, although the
user probably didn't mean that. To counter this, there's a
notification bubble that appears as soon as you type something that
looks like wikitext. However, there's a bug that's causing this
notification to appear at the top of the page and so if you're
scrolled down more than a little bit, you'll probably never see it. We
intend to fix this today or tomorrow so that everyone who types in
wikitext will see this warning. It typically displays in the close
vicinity of, or even on top of, the save button, so it should be
pretty hard to miss once the positioning bug is fixed.

The heading bug where you get ==<nowiki/>== is a separate issue. This
happens when you blank a heading (or try to remove it but don't quite
succeed) and leave an empty heading behind. VE then sends an empty
heading to Parsoid, which very diligently puts in a nowiki tag so you
get the empty heading you supposedly wanted. These kinds of
technically-correct-but-probably-not-what-you-wanted issues are a bit
tricky. They're on our list of things to deal with though.

Roan

Re: [Wikitech-l] VE and nowiki

2013-07-23 Thread Roan Kattouw
Whoops, didn't realize this thread had been forked off while I was out
to lunch, and so I responded to the other thread. Sorry about that :(

On Tue, Jul 23, 2013 at 1:11 PM, Steve Summit s...@eskimo.com wrote:
 This is a likely enough mistake, and the number of times you
 really want explicit double square brackets is small enough, that
 it's worth thinking about (if it hasn't been already) having VE
 detect and DTRT when a user types that.

VE does detect this and warn the user, but there's a bug that makes
the warning bubble appear out of view if you're not scrolled all the
way up to the top. One of our team members is working on fixing that
right now.

Roan

Re: [Wikitech-l] VE and nowiki

2013-07-23 Thread Roan Kattouw
On Tue, Jul 23, 2013 at 2:14 PM, John Vandenberg jay...@gmail.com wrote:
 There is an enhancement for that.

 https://bugzilla.wikimedia.org/show_bug.cgi?id=51897

 IMO the current behaviour isnt correct.  There are so few instances
 that nowiki is desirable, that the current VE should refuse to accept
 wikitext (at least [[ and {{, and maybe ==, # and * at beginning of
 line, etc ), until at least such time as they have sorted out all the
 other bugs causing this to happen unintentionally.  If the user needs
 to input [[ or {{ they can use the source editor.  Or the VE could
 walk the user through each nowiki and either a) ask the user to
 confirm they want the obvious fix done automatically for them, or b)
 help them fix the problem.  Before saving.

We disagree on that one then. VisualEditor is meant to hide wikitext
entirely. The primary focus is on people that don't know wikitext. I
agree that we should keep nowiki-fication to a minimum and get rid of
the other bugs that cause this to happen, but I think the action we
currently take when a user types in wikitext (which is to warn them
and say this won't work) is appropriate. VE is meant to be a visual
editor, not a 
visual-except-with-weird-shortcuts-that-only-make-sense-if-you-know-the-legacy-markup-we-used-before-your-time
editor.

That's my opinion. Actual product direction is not something I'm in
charge of, that's James F's job, but AFAIK our current product
direction is similar to what I just said.

 nowiki is also being inserted at beginning of lines.

 https://es.wikipedia.org/w/index.php?title=Yacimiento_de_Son_Forn%C3%A9s&diff=prev&oldid=68561708
 https://fr.wikipedia.org/w/index.php?title=Joel_Lindpere&curid=5580501&diff=95038161&oldid=95037498

That's because the lines start with a space, which triggers a <pre> if
not nowiki-ed. Obviously the correct thing to do there is just remove
the space, but we haven't fixed that yet.

 instead of lists

 https://es.wikipedia.org/w/index.php?title=Jorge_Eslava&diff=68542729&oldid=68451155

Once again that's people typing wikitext into VE, which is not supported.

 There are 24 open bugs for 'visualeditor nowiki'

 https://bugzilla.wikimedia.org/buglist.cgi?title=Special%3ASearch&quicksearch=visualeditor%20nowiki&list_id=219994

A lot of those are parts of symptoms of other bugs, and I'm pretty
sure a few of them are duplicates. nowiki often appears when
something else goes wrong and Parsoid attempts to fix things.

Roan

Re: [Wikitech-l] VE/Migration guide for gadgets developers

2013-07-22 Thread Roan Kattouw
On Mon, Jul 22, 2013 at 9:51 AM, Eran Rosenthal eranro...@gmail.com wrote:
 Hi,
 When the ResourceLoader was deployed (or even before it) to production,
 there were migration development guides for gadget/extension developers:

-

 http://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_for_extension_developers
-
http://www.mediawiki.org/wiki/ResourceLoader/Developing_with_ResourceLoader

 Such guides allowed easier adoption of ResourceLoader. We need something
 similar for the visual editor:

Yes we do :) . More generally, VE needs plugin infrastructure. Most of
it is already there, and I'm working on the last bits of it.

- Migration - What are the recommended steps to make gadget/extension
VE adapted? [with answers to questions such as: how to get the underlying
model - instead of $('#wpTextbox1').val() and what is this model [and what
modifications to the underlying model are supported/to be avoided by
gadgets/user scripts ] ]
- Development with the VE: guides with explanation for common editor UI
customization, and what is recommended API for it (for example: add custom
toolbar buttons).

These are excellent suggestions for documentation we should write once
the plugin infrastructure work is done. I intend to do this in the
next few weeks.

Roan

Re: [Wikitech-l] Remove 'visualeditor-enable' from $wgHiddenPrefs

2013-07-22 Thread Roan Kattouw
On Mon, Jul 22, 2013 at 11:41 AM, John phoenixoverr...@gmail.com wrote:
 Minimal java-script load my ass,
Your language and tone are inappropriate. Please keep it civil.

 I guess you must be using a fiber-optic
 connection. Most pages already have a lag due to the amount of JS needed to
 run the site. Jumping pages have been a normal thing since resourceloader
 (caused by lagging JS issues)

The initial JS loaded for VisualEditor is a whopping 2.9KB gzipped.
I'd call that minimal. There is lots of other JS out there, but this
thread is about VisualEditor.

Roan

Re: [Wikitech-l] Alternative editor?

2013-07-21 Thread Roan Kattouw
On Fri, Jul 19, 2013 at 1:06 PM, Derric Atzrott
datzr...@alizeepathology.com wrote:
 Are there any extensions that make use of this hook?  Does
 VisualEditor?
No, VisualEditor doesn't use this hook. Instead, it uses JavaScript to
insert edit links and/or modify existing ones. The editor itself is
also entirely JavaScript-based. There is very little PHP code in the
VisualEditor extension and really all it does is load JavaScript code.

Roan

Re: [Wikitech-l] Australia (was Re: conferences, Hacker School, Code for America)

2013-07-19 Thread Roan Kattouw
On Fri, Jul 19, 2013 at 5:54 AM, Quim Gil q...@wikimedia.org wrote:
 CFP Extension Announced linux.conf.au 2014 - linux.conf.au !!!
 Now it's July 20.
 http://linux.conf.au/media/news/27


 ... and this is tomorrow.

Beware that because of Australia's far-east timezone, it's already
tomorrow there :)

Roan

Re: [Wikitech-l] Git config trick.

2013-07-19 Thread Roan Kattouw
On Fri, Jul 19, 2013 at 10:40 AM, Ori Livneh o...@wikimedia.org wrote:
 In ~/.gitconfig, add:

  [url "ssh://your_usern...@gerrit.wikimedia.org:29418/mediawiki/extensions/"]
      insteadOf = ext:

 Now you can:

 git clone ext:UploadWizard

 !

! indeed. Sweet trick, dude, thanks!

Roan

Re: [Wikitech-l] [Mediawiki-api] Blacklisted?

2013-07-15 Thread Roan Kattouw
There was an API outage yesterday (Sunday July 14th) between roughly
12:00 and 14:00 UTC (5am-7am Pacific Time). As part of our
firefighting we blocked the IPs that were hitting the API the hardest
at the time. We didn't have time to do outreach because things were on
fire and falling over; I'm very sorry about that. We also should have
unblocked these IPs after the outage was over, but forgot (I was
mostly just trying to get things back up then go back to sleep).

For privacy reasons we can't disclose the IPs we blocked on Sunday. We
unblocked them just now, though, so if your bots are now working again
that would be it :)

I'm very sorry for the inconvenience and lack of follow-up. We
generally try to contact people, but when things are as bad as they
were yesterday we have to act quickly.

Roan


On Mon, Jul 15, 2013 at 10:50 AM, Alex Monk kren...@gmail.com wrote:
 Sending to wikitech-l instead, mediawiki-api is just for the MediaWiki API
 itself, not Wikimedia server stuff.

 Alex Monk

 On Mon, Jul 15, 2013 at 6:38 PM, Robert Crowe rob...@ourwebhome.com wrote:

 It looks to me like I was blacklisted for EN Wikipedia API requests.  My
 website has been using the API for awhile now, but suddenly I'm getting 403
 errors coming back.  I've tried to follow all the rules, but if I've missed
 something I'm happy to make changes.  What disturbs me most is that there
 was no attempt to contact me before being blacklisted to let me know there
 was a problem.  Where should I go to find out what the problem is?

 Thanks,

 Robert




 ___
 Mediawiki-api mailing list
 mediawiki-...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/mediawiki-api





Re: [Wikitech-l] [Wmfall] Announcement: C. Scott Ananian joins Wikimedia as Senior Features Engineer

2013-07-11 Thread Roan Kattouw
On Thu, Jul 11, 2013 at 3:24 PM, Terry Chay tc...@wikimedia.org wrote:
 Hello everyone,

 It’s with great pleasure that I’m announcing that C. Scott Ananian[1] has
 joined the Wikimedia Foundation as a Features Engineer.

Welcome! I'm excited to hear you're officially with us now.

Roan

Re: [Wikitech-l] [SOLVED] Re: ResourceLoader support question: how to construct a value in CSS from PHP

2013-07-09 Thread Roan Kattouw
On Wed, Jul 3, 2013 at 3:27 PM, Thomas Gries m...@tgries.de wrote:
 I found the solution ( $wgExtensionAssetsPath )
Why do you need this? You shouldn't need this. Paths relative to the
CSS file should be remapped automatically and should Just Work.

Roan

Re: [Wikitech-l] Australia (was Re: conferences, Hacker School, Code for America)

2013-07-07 Thread Roan Kattouw
On Sat, Jul 6, 2013 at 7:30 AM, Quim Gil q...@wikimedia.org wrote:
 I also think that linux.conf.au could be a good chance to spread our word in
 Australia. You are encouraged to apply.

I went to (and presented at) linux.conf.au in 2012, and I had an
awesome time. The quality of the talks and the number of high-quality
talks is amazing. I highly recommend it.

Roan

Re: [Wikitech-l] PHPUnit tests fail from Jenkins but work fine locally

2013-06-28 Thread Roan Kattouw
On Fri, Jun 28, 2013 at 10:50 AM, Arthur Richards
aricha...@wikimedia.org wrote:
 Mobile web is trying to merge https://gerrit.wikimedia.org/r/#/c/69585/ but
 PHPUnit tests are failing when Jenkins executes them.

 What's weird is that we've executed PHPUnit tests on our various local
 machines - with no failures. We've been scratching our heads trying to
 figure out what might be causing the inconsistency.

 Anyone have any ideas what might be causing this?

Something that's tripped me up with VisualEditor a couple of times is
that Jenkins doesn't actually run the tests on your branch, it runs
them on a hypothetical merge with your branch into master. The process
is something like this:
* Attempt to merge the branch into master
* If there is a conflict, reject the commit with a message saying
something about a merge conflict, and abort
* If there was no conflict, there is now a local (to the Jenkins
server) merge commit that merges this branch into master
* Run tests on this merge commit

This means that if there was a recent change in master that broke your
code, but doesn't conflict directly (perhaps because an interface you
were using changed), the tests succeed in the branch but fail in the
merge commit. Jenkins correctly recognizes that merging your branch
would break master, and rejects your branch, reporting the relevant
test failures, but this is not immediately obvious because the code as
committed worked fine.

The way I address this is to rebase the change onto master (and
because there is no conflict, you can even use Gerrit's rebase button
for this), at which point the tests will start failing locally as
well.

Roan

Re: [Wikitech-l] Brian Wolff's summer gig, with Wikimedia!

2013-06-03 Thread Roan Kattouw
On Mon, Jun 3, 2013 at 12:50 PM, Rob Lanphier ro...@wikimedia.org wrote:
 Hi everyone,

 Many of you already know Brian Wolff, who has been a steady
 contributor to MediaWiki in the the past several years (User:Bawolff),
 having gotten a start during Google Summer of Code 2010[1].

 Brian is back for another summer working with us, working generally to
 improve our multimedia contribution and review pipeline.  In addition
 to his normal GMail address, he's also available at
 bawo...@wikimedia.org, and is on Freenode as bawolff.

 Welcome Brian! (again! \o/)

Awesome! Welcome, Brian!

Roan

Re: [Wikitech-l] Deploying alpha of VisualEditor to non-English Wikipedias

2013-04-29 Thread Roan Kattouw
On Fri, Apr 26, 2013 at 9:42 AM, James Forrester
jforres...@wikimedia.org wrote:
 The report a problem link sends the report privately to a Parsoid
 server. From your video, it appears that the API call to do so failed
 - could you file a Bugzilla item with more details so we can
 investigate?

It's sending information to parsoid.wmflabs.org, and for now that's
intended behavior. We may change the way this information is collected
later, but for now it lives on labs, where the Parsoid developers
collect and analyze the information.

Even though Lukas's video shows a failed XHR request in the console,
the data was most likely submitted successfully. This is because we
use cross-domain AJAX to POST the data, and jQuery's attempt to read
the response (which is empty anyway) is blocked because of same-origin
restrictions, which causes the error you're seeing. At this point the
information has already been submitted, though, so the error is a red
herring. This behavior is clearly suboptimal and misleading, but it's
a minor issue at this point. It's tracked as
https://bugzilla.wikimedia.org/show_bug.cgi?id=42974 .

Roan

Re: [Wikitech-l] Deploying alpha of VisualEditor to non-English Wikipedias

2013-04-29 Thread Roan Kattouw
On Fri, Apr 26, 2013 at 11:37 AM, Claudia Müller-Birn
c...@inf.fu-berlin.de wrote:
 Hi James,

 Thanks for clarifying.

 I am just wondering why the two feedback mechanisms send to different targets 
 which I personally find a bit confusing. What has been the decision behind it?

They are different feedback mechanisms with different privacy
expectations. The "Leave feedback" flow collects general, free-form
feedback about the editor and posts it publicly. Only the text the
user types into the form is collected and published. The "Something is
wrong" flow is intended to collect data about cases where users see a
diff they don't expect. It collects a lot of information to help us
figure out what happened and why; this not only helps us find bugs,
but it also collects the information we need to track it down and fix
it. Because the information collected includes the text the user was
attempting to save but apparently chose not to save, we treat it as
private information.

So one flow collects general feedback and the text field is all we
collect, whereas the other collects data about a specific failure and
the text field just serves as a footnote on a lot of technical data we
collect along with it and is narrowly focused on the specific problem
at hand: what did the user do to trigger the bug. In the general
feedback flow, we communicate to the user that what they submit will
be public, and it's very easy to explain what will be public (the text
they put into the field). In the bad diff report flow, the data we
collect (editor contents with an unsubmitted edit, among other things)
is a bit more privacy-sensitive and it's hard to explain to the user
what we'll be collecting.

 And why is the "Something is wrong" or "Etwas ist schief gelaufen" a
 privately sent feedback? Wouldn't be great to have here an open process and
 not sending the feedback to a black hole?

This flow is mostly for reporting bugs, and the form asks them to
provide details about the specific bug they encountered rather than
general feedback. These comments wouldn't be particularly useful
without the associated debugging data. The debugging data isn't
particularly interesting to the general public, only to developers
(that's not an argument for why it shouldn't be public, just one for
why it doesn't hurt too much that it's not public). We feel like it
would be against the spirit if not the letter of the privacy policy to
publish this data without telling the user what we're publishing
(especially since unsaved editor contents are privacy-sensitive: there
is no expectation they'll be published because they're unsaved). But
it's also difficult to explain to the user what exactly it is that we
would be publishing.

Contrary to the confusion and other reports on this thread, I can
assure you that the data submitted through the "Report problem"
interface isn't sent to a black hole. It's being collected by the
Parsoid team, and they have in the recent past analyzed it and
discovered bugs in both Parsoid and VE.

Roan


Re: [Wikitech-l] Deploying alpha of VisualEditor to non-English Wikipedias

2013-04-29 Thread Roan Kattouw
On Fri, Apr 26, 2013 at 3:05 PM, Lukas Benedix
bene...@zedat.fu-berlin.de wrote:
 I'm not a non-tech-user, but I don't like using bugzilla.
 here is the new bug: https://bugzilla.wikimedia.org/47755

Per my earlier post, I've closed that bug as INVALID (as feedback
collection does actually work), with a reference to the bug for making
it less sketchy.

 There are lines marked as changed in the diff that I never touched:
 http://lbenedix.monoceres.uberspace.de/screenshots/tlv5b64zcj_(2013-04-26_23.41.46).png

 Is this a (known) bug?

Not that I'm aware of. Looks like a bug in reference serialization.
Could you tell me which page you were editing there?

Roan


Re: [Wikitech-l] Deploying alpha of VisualEditor to non-English Wikipedias

2013-04-29 Thread Roan Kattouw
On Mon, Apr 29, 2013 at 4:36 AM, David Gerard dger...@gmail.com wrote:
 Cool, thank you :-) Every time I've used it lately, it's messing with
 the ref tags. Just now I literally moved a comma and it decided
 messing with the ref tags would be just the thing to do ...

Yeah, sorry about that :( . There was a bug where Parsoid duplicated
ref tags; Lukas's screenshot showed that too. The Parsoid team
deployed a fix for this just now, so the duplication bug should be
fixed now. There are rumors of other issues with ref tags, we're
looking into those still.

Roan


[Wikitech-l] VisualEditor on wikitech.wikimedia.org

2013-04-29 Thread Roan Kattouw
Because Faidon idly suggested that we should install VisualEditor on
wikitech as a way of dogfooding, I went ahead and did it.
https://wikitech.wikimedia.org/w/index.php?title=PowerDNS&diff=68284&oldid=14633
is the first VE edit :)

Inside baseball note: this uses the Tampa Parsoid cluster, not the one
in eqiad, because the machine hosting wikitech (virt0) is in Tampa. If
and when things move to eqiad we can change that.

Roan


Re: [Wikitech-l] Temporarily disabling l10n update

2013-04-10 Thread Roan Kattouw
On Wed, Apr 10, 2013 at 1:14 PM, Max Semenik maxsem.w...@gmail.com wrote:
 Does that mean that scap is also prohibited?

Scap runs the same script to clear blobs. We can sabotage that script,
but that will cause all i18n updates to JS to propagate more slowly,
regardless of whether they came from a code deployment or l10nupdate.

Roan


[Wikitech-l] New git-review version revives year-old bug

2013-04-08 Thread Roan Kattouw
About a year ago, we were struggling with a git-review bug that caused
lots of bogus warnings to appear. When running git review, you'd get
a warning saying you're about to submit multiple commits, followed by
a list of lots of other people's commits that have already been
merged. I fixed this in https://review.openstack.org/#/c/6741/ last
year.

This bug is now back in the latest release of git-review that came out
over the weekend. I complained about this at
https://review.openstack.org/#/c/20450/ , which is the change that
reintroduced the broken behavior. We are suffering from it
disproportionately because we have defaultrebase=0 set on most of our
projects, and the bug only triggers when rebasing is disabled (using
either that setting or the -R flag).

The workaround is the same as last year: if git-review complains and
you see bogus commits in the list, respond "no" to abort, run "git
fetch gerrit", then rerun git-review. This will ensure git-review has
an up-to-date view of the remote master.

Roan


Re: [Wikitech-l] Alpha version of the VisualEditor now available on the English Wikipedia

2012-12-12 Thread Roan Kattouw
On Tue, Dec 11, 2012 at 10:55 PM, Lee Worden worden@gmail.com wrote:
 Very exciting - congratulations!

 I know these are early days for the VisualEditor, but is there a plan for
 extension developers to be able to hook in to provide editing for the things
 their extensions support?
Yes, absolutely! We've been working on cleaning up and rewriting
various internal APIs in VE such that they can reasonably be used to
write extensions. We've made progress, but we're not done yet, and
more recently it's received less attention because of yesterday's
release. We're gonna be picking that work back up in January, and once
it's done, we would be happy to work with willing guinea pigs to test
our APIs in the wild and work out the remaining kinks. As for when
that'll actually be scheduled to happen, I defer to James F.

Roan



Re: [Wikitech-l] Alpha version of the VisualEditor now available on the English Wikipedia

2012-12-12 Thread Roan Kattouw
On Wed, Dec 12, 2012 at 11:32 AM, Daniel Barrett d...@vistaprint.com wrote:
 Will the method for hooking into VE be the same as for WikiEditor?  Or will 
 extension
 developers need to support both editors in two different ways?

It won't be the same as for WikiEditor, because the two are
fundamentally different. WikiEditor gives you a toolbar that allows
you to insert and manipulate wikitext. VisualEditor gives you
something similar (a toolbar that allows you to insert and manipulate
rich content), but it also gives you inspectors with pop-up dialogs,
toolbar buttons that can change state depending on what's selected,
and much more. You can also add new *types* of content, change how
content is rendered, etc. The possibilities are endless compared to
WikiEditor.

Roan



Re: [Wikitech-l] Alpha version of the VisualEditor now available on the English Wikipedia

2012-12-12 Thread Roan Kattouw
On Wed, Dec 12, 2012 at 12:14 PM, Matthew Flaschen
mflasc...@wikimedia.org wrote:
 I would definitely be willing to serve as a guinea pig, working to
 integrate ProveIt (http://en.wikipedia.org/wiki/User:ProveIt_GT).

Awesome!

Roan



Re: [Wikitech-l] Nobody Wikidata bugs: notify when you start working on a bug

2012-12-08 Thread Roan Kattouw
On Thu, Dec 6, 2012 at 10:07 AM, Quim Gil q...@wikimedia.org wrote:
 Hi, thanks to the metrics reports now we know that the top bug fixers in
 November were Nobody (228) and Wikidata bugs (83)... followed by Michael
 Dale (28), Roan Kattouw (23), etc.

 http://www.mediawiki.org/wiki/Community_metrics/November_2012#People

Those statistics don't actually measure who fixes bugs, they measure
who the fixed bugs were assigned to. Those aren't necessarily the same
person (although I imagine this is rare), but the larger issue is
that, as you say, most bugs have no human assignee. Another statistic
that is used in the BZ reports sent to this list is who closed the bug
(i.e. changed its status to RESOLVED FIXED), but this is also
suboptimal. For instance, in the VisualEditor team, James somewhat
frequently cleans up after developers who fix a bug but forget to
close it, or even mention the bug in the commit summary, so he's
probably the top bug fixer in VE by that metric, even though most of
that is just him taking paperwork off our hands. Another problem is
that bugs can bounce between REOPENED and FIXED multiple times, and
can be set to FIXED by different people each time.

So both metrics are noisy, although I imagine the latter would not
have a 50% signal-to-noise ratio like the former. Getting more
accuracy would be complicated: you'd probably have to look for Gerrit
links on the bug and identify their authors, or something like that.

Not saying the metric you used is wrong (it has advantages and
disadvantages), but I do think it's a bit misleadingly labeled.

Roan



Re: [Wikitech-l] problems merging changes from master - remote branch

2012-12-01 Thread Roan Kattouw
On Fri, Nov 30, 2012 at 2:12 PM, Arthur Richards
aricha...@wikimedia.org wrote:
 ! [remote rejected] HEAD -> refs/publish/esisupport/bug/41286 (change
 32896 closed)
It looks like https://gerrit.wikimedia.org/r/#/c/32896/ is a commit
that you submitted to esisupport at some point, then abandoned, and
are now trying to push in again. Gerrit doesn't like that very much.
To fix it, you can try restoring the change (click Restore Change on
the Gerrit page for the change; you may need the author or a Gerrit
admin to do this for you, I don't know if we've managed to give
everyone permission to restore yet), then trying to run 'git review'
again. That should introduce new changes for the things you're
submitting, and create a new patchset for the change that was giving
you trouble.

Roan



Re: [Wikitech-l] WMF's webpage

2012-11-28 Thread Roan Kattouw
On Wed, Nov 28, 2012 at 4:21 PM, Chad innocentkil...@gmail.com wrote:

> On Wed, Nov 28, 2012 at 6:48 PM, Luke Welling lwell...@wikimedia.org
> wrote:
>> Is there a reason not to use the Yahoo championed approach of embedding a
>> version number in all static file names so you can set a very long cache
>> expires time and just add new versions to the CDN when a change is made?
>
> That's what we used to do for CSS/JS--but we don't serve individual files
> like that anymore--they're all served together via the resource loader.

ResourceLoader actually uses a somewhat more advanced version of the
version-number-in-filename approach: it dynamically computes the last
modified timestamp of the collection of resources it's requesting, and puts
that timestamp in the query string. If the timestamp is not available
dynamically, we omit the timestamp, and RL automatically serves the
resource with a short cache timeout (5 min) in that case.


> Also, it wouldn't have helped much in this case--the problem was that the
> HTML/CSS output changed in an incompatible way and we had old (or
> new, during the rollback) versions of the HTML still being served via
> Squid (they're cached for 30 days, unless something purges them like an
> edit).
>
> Don't change the HTML in incompatible ways is probably a good general
> rule to live by--but having an easy way to say start purging all pages on
> $theseWikis from Squid/Varnish would also be nice.

Yes, the HTML and CSS changed in incompatible ways. The CSS cache
invalidated quickly (5-10 mins probably), but the HTML cache didn't.
Platonides probably misspoke when he said cached (skin) CSS had h5; I'd
imagine it was the Squid-cached HTML instead.

Either way, the CSS should be backwards-compatible with the old HTML, and
in this case it wasn't. That's the core of the problem.

Roan


Re: [Wikitech-l] Off-and-on CSS and rendering issues, Bits related?

2012-11-16 Thread Roan Kattouw
On Fri, Nov 16, 2012 at 12:34 PM, Andre Klapper aklap...@wikimedia.org wrote:
 * Missing Gadgets tab (also with Vector skin):
   
 https://en.wikipedia.org/wiki/Wikipedia:Village_pump_%28technical%29#Missing_Gadgets_Tab
This was fixed last night. It has nothing to do with bits, it was
caused by memcached corruption.

 * Maybe also: Edit conflict sometimes does not show the current
   text: https://bugzilla.wikimedia.org/show_bug.cgi?id=42163

I think it's very very very unlikely that this is because of a
front-end issue, it's almost certainly a backend issue.

 Speculation:

 Looking at the monthly stats on
 http://ganglia.wikimedia.org/latest/graph_all_periods.php?c=Bits%20application%20servers%20pmtpa&m=cpu_report&r=hour&s=by%20name&hc=4&mc=2&st=1353096306&g=network_report&z=large&c=Bits%20application%20servers%20pmtpa
 (click on Inspect) there is a drop of Out on November 5/6th.

 From http://wikitech.wikimedia.org/view/Server_admin_log#November_5 :
 21:54 aaron synchronized wmf-config/mc.php 'Switched all wikis to
 memcached-multiwrite.'
 21:54 logmsgbot_: aaron synchronized wmf-config/InitialiseSettings.php
 'Disabled spammy (with pecl memcached) memcached.log, the useful one is
 memcached-serious.'

The drop is strange, but could be explained by the memcached changes
and the logging changes, if similar drops were observed in the other
Apache clusters.

Roan



Re: [Wikitech-l] jQuery 1.9 will remove $.browser (deprecated since jQuery 1.3 - January 2009)

2012-11-16 Thread Roan Kattouw
On Thu, Nov 8, 2012 at 5:17 PM, Tim Starling tstarl...@wikimedia.org wrote:
 I can understand the rationale behind removing jQuery.browser:
 apparently most developers are too stupid to be trusted with it. Maybe
 the idea is to use per-project reimplementation of jQuery.browser as
 an intelligence test. The trouble is, I think even the stupidest
 developers are able to copy and paste.

Yes, there are legitimate reasons for using browser detection, in
cases where feature detection doesn't work. The Safari segfault is a
good example. In fact, VisualEditor uses browser detection to disable
itself in browsers with broken contentEditable support, because it's
very hard or impossible to feature-detect this. I believe Timo was
working on cleaning up a different UA detection plugin somewhere on
github and using that for VE's browser detection.

tl;dr: browser detection is evil, you should think twice before using
it, but sometimes you have to

Roan



Re: [Wikitech-l] Creating custom skin based on Vector in MediaWiki 1.20

2012-11-16 Thread Roan Kattouw
On Tue, Nov 13, 2012 at 12:33 AM, Dmitriy Sintsov ques...@rambler.ru wrote:
 However, 'skin.vector' module includes both styles and scripts. And
 setupSkinUserCss() adds styles only. So 'dependencies' did not help, vector
 styles are loaded later, anyway. What can I do with that?

Unfortunately, addModuleStyles() and dependencies don't work well
together. You shouldn't use dependencies for CSS that is essential for
rendering the page.

 Also, I need to load remote google webfonts. Does ResourceLoader support
 this or I have to use old-fashionated methods of OutputPage() ?
Unfortunately RL doesn't support this directly. Although to load a
webfont, all you need is a stylesheet with an @font-face in it, right?

Roan



Re: [Wikitech-l] Content handler feature merge (Wikidata branch) scheduled early next week

2012-09-27 Thread Roan Kattouw
On Sep 25, 2012 12:15 PM, IAlex ialex.w...@gmail.com wrote:
 Would it be possible to have the whole changes as an changeset on Gerrit?
 This would make review and comments much easier than having to do this on
this list.

Yes and no. The person that merges the branch can create a merge commit and
submit that for review (git checkout -b mergewikidata master && git merge
wikidata && git review) but Gerrit will not show the diff properly: it'll
either show just the conflict resolutions, or nothing at all. You can view
the full diff by fetching the commit on your localhost and using standard
git tools (e.g. git review -d 12345 && git show), but you won't be able to
use inline comments quite as nicely.

Roan


Re: [Wikitech-l] Code review meeting notes

2012-09-21 Thread Roan Kattouw
On Thu, Sep 20, 2012 at 2:08 PM, Mark Holmquist mtrac...@member.fsf.org wrote:
 Hm. Will this be file-level whitelisting (i.e., this file changed from the
 master branch in this patchset, so we'll show the changes) or is it
 line-level? If the latter, how? Because I'm not sure it's trivial

I believe it's file-level, which eliminates most but not all noise.

Roan



Re: [Wikitech-l] Gerrit server issues

2012-09-12 Thread Roan Kattouw
On Wed, Sep 12, 2012 at 3:51 PM, Platonides platoni...@gmail.com wrote:
 Doesn't formey have core.logAllRefUpdates set to true?
 Wouldn't that have prevented git gc from removing commits referenced in
 the reflog? (at least until two weeks without the references passed, it
 should probably have been run with --no-prune)

I did indeed manage to recover some commits that were lost (apparently
gc'ed) on manganese but still present on formey, so that sounds
plausible.

Roan



Re: [Wikitech-l] Gerrit server issues

2012-09-10 Thread Roan Kattouw
On Fri, Sep 7, 2012 at 2:12 PM, Chad innocentkil...@gmail.com wrote:
 All of these have been fixed other than Nonlinear (more heavily
 broken). TranslationNotifications' master is intact, but some of
 the changes are still in a bad state and I need to finish cleaning
 it up.

TranslationNotifications has now been fixed.

 software, wikimedia-lvs-realserver, wikimedia-search-qa and
 wikistats are all back and fine. mysqlatfacebook is very broken
 like Nonlinear.

I fixed mysqlatfacebook, which had been completely garbage-collected
on manganese (the primary server). I restored the commits from formey
(the replication slave) where they'd been orphaned but not deleted.
Nonlinear is also fixed.

Roan



Re: [Wikitech-l] scaled media (thumbs) as *temporary* files, not stored forever

2012-09-05 Thread Roan Kattouw
On Sun, Sep 2, 2012 at 5:59 PM, Tim Starling tstarl...@wikimedia.org wrote:
 The other reason for the existence of the backend thumbnail store is
 to transport images from the thumbnail scalers to the 404 handler. For
 that purpose, the image only needs to exist in the backend for a few
 seconds. It could be replaced by a better 404 handler, that sends
 thumbnails directly by HTTP. Maybe the Swift one does that already.

My understanding is that thumb.php already streamed the thumbnail back
to the 404 handler via HTTP and has done so for at least the past two
years or so.

Roan



Re: [Wikitech-l] scaled media (thumbs) as *temporary* files, not stored forever

2012-09-05 Thread Roan Kattouw
On Wed, Sep 5, 2012 at 12:35 PM, Asher Feldman afeld...@wikimedia.org wrote:
 Browser scaling is also at least worth
 experimenting with.  Instances where browser scaling would be bad are
 likely instances where the image is already subpar if viewed on a high-dpi
 / retina display.
Other instances where browser scaling is bad are:
* PoS browsers that don't render SVGs (how old are these by now?)
* Even modern browsers have subpar SVG rendering at 1x, PNG looks better
* Some media types are scaled in unusual ways (SVGs, but also video
stills, PDF pages, ...)
* Some original images are really friggin' large (20-30 megapixels
sometimes), so at least some downscaling is needed there
* Mobile clients will want to minimize the amount of data transferred

Roan



Re: [Wikitech-l] Work offer inside

2012-08-31 Thread Roan Kattouw
On Fri, Aug 31, 2012 at 1:07 PM, Derric Atzrott
datzr...@alizeepathology.com wrote:
 Forgive me for not knowing, but what is OIT?  A quick Google gives me Oregon
 Institute of Technology, but I doubt that is it given the context.

Wikimedia's Office IT department.

Roan



Re: [Wikitech-l] scaled media (thumbs) as *temporary* files, not stored forever

2012-08-31 Thread Roan Kattouw
On Fri, Aug 31, 2012 at 1:22 PM, Jeremy Baron jer...@tuxmachine.com wrote:
 On Fri, Aug 31, 2012 at 1:48 PM, Derric Atzrott
 datzr...@alizeepathology.com wrote:
 Perhaps generate the standard thumbnail sizes at upload time

 I believe the status quo is no thumbs (of any size) are generated at
 upload time. They are all just done on demand.

That's correct in theory. In practice (for uploads using
Special:Upload at least) the user is redirected to the file
description page after a successful upload, which contains a thumb of
the file of a given size (800px?), which is then immediately generated
on demand :) . Special:UploadWizard has similar behavior: the success
page contains 200px(?) thumbs of all the images the user uploaded.

Roan



Re: [Wikitech-l] [testing] TitleBlacklist now tested under Jenkins

2012-08-30 Thread Roan Kattouw
On Thu, Aug 30, 2012 at 1:11 AM, Oren Bochman orenboch...@gmail.com wrote:
 I've tried to do this for translate ext last week - so here are a couple of
 questions:
 1. is successfully runnig the test a requirement to successfully score on
 gerrit? (i.e. how is gerrit integrated)
Jenkins sets a Verified +1 or -1 on the change, depending on whether
the tests succeed or not, just like for core. A V+1 is required to
merge a change. So to merge any change into an extension, the tests
(that is, the core tests and any extension tests) have to pass. But
the extension isn't required to have tests, it's just that you can't
submit changes that break the core tests or break any tests that are
present in the extension. Existing extensions without any tests or
awareness of phpunit will continue to work just fine, as long as they
don't try to do crazy stuff that breaks a test in core.

 2. does the extension need to include php unit?

No, all of the phpunit wrapper stuff is in MediaWiki core. Adding
tests to extensions is simple, see
https://www.mediawiki.org/wiki/Manual:PHP_unit_testing/Writing_unit_tests_for_extensions
.

Roan



Re: [Wikitech-l] Disabling direct pushing to master for extensions

2012-08-29 Thread Roan Kattouw
On Wed, Aug 29, 2012 at 12:58 PM, Chad innocentkil...@gmail.com wrote:
 Before I make the change though, I
 wanted to ask about it publicly to make sure there's no major blockers
 to me doing so.

As long as people that were previously able to push are still able to
+2 (which is probably true, unless the ACLs are really weird), and as
long as self-review isn't prevented by the software in the future
(there was some discussion about this at some point), I think this'll
be fine, because extension maintainers that used to use direct push
can submit their change for review, then approve it themselves.

I maintain that self-review is evil for core and deployed extensions,
but for undeployed extensions that don't have multiple active
maintainers I think it's fine.

Also, didn't Shawn say there were vague plans to make direct pushes
actually create changes and stuff? If and when that happens, I'd like
to open direct push back up, because really this self-review workflow
is a workaround for the fact that Gerrit doesn't handle direct pushes
very well.

Roan



Re: [Wikitech-l] Disabling direct pushing to master for extensions

2012-08-29 Thread Roan Kattouw
On Wed, Aug 29, 2012 at 1:55 PM, Jeroen De Dauw jeroended...@gmail.com wrote:
 Forgive my git ignorance here, but will merging this merge commit result
 into all original commits being in the version history, or will it just
 show as one huge change? In case of the later, I don't think this is an
 acceptable approach.
It will still show up in the history as individual commits on the
right-hand side of the merge commit, same as when you do 'git checkout
master; git merge mybranch; git push'.

Roan



Re: [Wikitech-l] Appreciation thread

2012-08-24 Thread Roan Kattouw
On Thu, Aug 23, 2012 at 7:50 PM, Erik Moeller e...@wikimedia.org wrote:
 Completely random appreciation for whoever implemented the undo
 feature in MediaWiki, one of its many hidden gems. :-)

Looks like the main contributors to this were Andrew Garrett in July
2006 [1] and December 2006 [2], Aaron Schulz in March 2007 [3], and
Andrew again in July 2007 [4].

So appreciation for them, the other old hands that have been around
for such an incredibly long time, and all the other committers that
created the tens of thousands of commits in MediaWiki's history [5];
it's always interesting to dig through them when looking up old stuff
like this.

Roan

[1] 
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=commitdiff;h=f062b09dc38a0612a92a5dba08dc01746b6f42f2
[2] 
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=commitdiff;h=c490d265fa711547ff110438540b4f3457079fa2
[3] 
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=commitdiff;h=0fe87673b78482cfe60b02e2937d1284d83ad4d8
[4] 
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=commitdiff;h=d53b216aa7819db154b3bde9cb797f20f42900f5
[5] For those of you wondering shouldn't that be 'over 100k?': there
were about 114k revisions in SVN at the time of the git migration, but
those weren't all MW core. At the time of this writing, there were
about 43k commits in the mediawiki/core.git history (excluding merge
commits).



Re: [Wikitech-l] Appreciation thread

2012-08-23 Thread Roan Kattouw
On Thu, Aug 23, 2012 at 2:30 AM, Niklas Laxström
niklas.laxst...@gmail.com wrote:
 * Sumana for this idea.
+1

Also:
* Inez for writing code I intended to write, exactly the way I
intended to write it, while I was busy with something else yesterday
* Timo (Krinkle) for announcing he's back from vacation via the gerrit-wm bot
* The rest of the VE team for being generally awesome
* Aaron, Ariel, Ben and Faidon (and anyone else that's working on this
that I'm forgetting) for their relentless work on the Swift migration
this week

Roan



Re: [Wikitech-l] Can we make an acceptable behavior policy? (was: Re: Mailman archives broken?)

2012-08-17 Thread Roan Kattouw
On Fri, Aug 17, 2012 at 11:20 AM, Ryan Lane rlan...@gmail.com wrote:
 1. Adjust our procedures for thread deletion so that this situation
 doesn't occur again.
 2. Fix the links, if the effort level isn't insane.

I said this on IRC in some form, but I'll repeat it here on the
(hopefully permalinked ;) ) record: I think the concern MZ was raising
is that initially (for quite a long time actually), there was total
silence on #2. Daniel and other ops people posted to the list, but
made no mention of attempts to fix the links or the feasibility
thereof, it was all focused on #1 (here's how we'll prevent this in
the future, and BTW, mailman sucks).

While #1 is important (and bashing mailman for this is deserved), #2
was being completely ignored. Instead, everyone focused on how MZ's
phrasing was hostile (which is fair enough, if people feel someone is
behaving in an offensive manner, they should call that person out on
it). Specifically, in the thread where Ryan called out MZ, the
question but what are you doing to fix this? was repeated in some
form or other in three posts (2 by MZ, one by Michel) before Daniel
said he was working on it and Ryan posted the above. On the original
thread, there were suggestions as to how the links could be fixed, but
no one in ops responded to those posts, or acknowledged them, or even
acknowledged that it's something that should be worked on until Daniel
and Ryan did that just now. MZ's post complained about a lack of
communication from ops about issue #2, and the response to it was a
more active lack of communication from ops about issue #2.

Hostile behavior has no place on this list and calling it out is a
good thing. But what happened here is an anti-pattern that I've seen
around here before: when someone says something in an inappropriate
way, people call them out on it (which is good), but subsequently
refuse to discuss the substance of what was said, to the point where
whenever it's brought up it's either ignored or the conversation is
steered back to the inappropriate behavior. In this case it was
eventually addressed after repeated questions from multiple people,
but that hasn't always happened. I've been in a situation where I
wrote overly aggressive criticism and got called out on it, which is
fair, but the substance of the criticism was never addressed and
because the subject is apparently tainted, reviving the discussion is
futile. I tried, and it went straight back to everyone piling on me
for how hostile I'd been, so I gave up.

I'm not complaining about being called out or even bashed for writing
overly hostile posts. I've done that a few times and I hate it when I
slip up and do it. If I get called out or bashed over that, I've
deserved it, and I'll take it. But what pisses me off is that it's
apparently impossible to get people to discuss real issues once
they're tainted with inappropriate communication. That's unhealthy and
needs to stop. So by all means, call people out on hostility. But
don't ignore or stifle discussion about real problems.

Roan



Re: [Wikitech-l] Status::getXML usage?

2012-08-15 Thread Roan Kattouw
On Wed, Aug 15, 2012 at 10:04 AM, Tyler Romeo tylerro...@gmail.com wrote:
 Hey,

 So a quick grep search cannot tell me where the method Status::getXML is
 used. It doesn't seem to exist anywhere in the core. Maybe some extensions
 use it? I'm asking primarily because if it's not used then it should
 probably be removed. Logic for processing Status objects into export
 formats such as XML shouldn't be handled (and apparently aren't handled) by
 the Status object itself.

I grepped all extensions in git and SVN for 'getxml'
(case-insensitive), and I only saw a few calls to something called
getXML(), all of which referred to different classes (within the
extensions themselves). So Status::getXML() isn't used in any
extensions in our version control systems.
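For anyone repeating that kind of audit, a case-insensitive recursive grep is all it takes; here is a minimal sketch run against a throwaway directory (the two files are stand-ins, not real extension checkouts):

```shell
# Sketch of the audit described above: case-insensitive, recursive
# search for "getxml" across a tree of source files.
dir=$(mktemp -d)
printf 'public function getXML() {}\n' > "$dir/Status.php"
printf '$status->getxml();\n' > "$dir/caller.php"
# -r: recurse, -i: case-insensitive, -l: list matching files only
matches=$(grep -ril 'getxml' "$dir" | wc -l | tr -d ' ')
echo "files matching: $matches"   # prints "files matching: 2"
rm -rf "$dir"
```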

Roan



Re: [Wikitech-l] Criteria for serious alternative

2012-07-27 Thread Roan Kattouw
On Jul 27, 2012 6:58 AM, Chad innocentkil...@gmail.com wrote:
 1) Personal dashboards will now be private -- the proper way to query
 someone's work is the owner: query in the search box.
As MZ and I found out last night, this is not the case. An owner: search
only finds changes that person *started*, so it doesn't find changes by
others they've contributed patchsets to. Searching for an e-mail address
does get all changes someone's been involved in (even if they only comment
on the change), I believe it's equivalent to reviewer:. But this is all far
from obvious and took me a while to figure out, people who haven't used
Gerrit before don't really stand much of a chance there.
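Concretely, the queries being compared look like this in the Gerrit search box (the address is a placeholder, and the reviewer: equivalence is, as said above, a belief rather than a certainty):

```
owner:someone@example.org      finds only changes that person started
reviewer:someone@example.org   finds changes they reviewed or commented on
someone@example.org            finds all changes they were involved in
```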

Roan


Re: [Wikitech-l] Criteria for serious alternative

2012-07-27 Thread Roan Kattouw
On Jul 27, 2012 7:17 AM, Faidon Liambotis fai...@wikimedia.org wrote:
 I think GitLab looks promising but I'm unable to judge it against all of
 our requirements just from the online demo and without spending some
 amount of time on it. I just put some pros and cons as I see them to the
 Wiki page and hope it can be considered.

I would also like Gitlabs to be considered, so I'll poke at that wiki page
later. I may or may not have time to play with it before Monday.

Roan


Re: [Wikitech-l] Criteria for serious alternative

2012-07-27 Thread Roan Kattouw
On Jul 27, 2012 1:11 AM, Antoine Musso wrote:
 We have disallowed branch creation and merging for now, that can
 definitely be opened up. Chad recently opened the sandbox reference and
 we have a few of them:

   remotes/gerrit/sandbox/apramana/gsoc
   remotes/gerrit/sandbox/demon/foo-bar
   remotes/gerrit/sandbox/devunt/oauth
   remotes/gerrit/sandbox/ori/hacks

 They are actual branches we can most definitely merge --no-ff back into master.
 http://www.mediawiki.org/wiki/Gerrit/personal_sandbox

This is a great feature, but I don't think most people realize it exists. Was it
ever publicized widely?

Roan


Re: [Wikitech-l] Serious alternatives to Gerrit

2012-07-25 Thread Roan Kattouw
On Tue, Jul 24, 2012 at 10:46 PM, Alolita Sharma
alolita.sha...@gmail.com wrote:
 Cool, Priestley is awesome. If he comes to visit we should prevent him from
 leaving :)

 +1. We should definitely think about adopting Phabricator as a project if
 we're going to invest in its core developer.

I think this should be reversed. If we're gonna use Phabricator, we
should attract its core developer. We shouldn't switch code review
systems just because we hired a famous person.

I personally think that something like gitlabs would be superior to
Phabricator, but that's another discussion.

Roan



Re: [Wikitech-l] Serious alternatives to Gerrit

2012-07-25 Thread Roan Kattouw
On Wed, Jul 25, 2012 at 12:39 AM, Ryan Lane rlan...@gmail.com wrote:
 1. It would be worrisome from a security point of view to host the
 operations repos on Github. I think it's very likely that those repos
 would stay in Gerrit if the devs decided to move to Github. This would
 mean that we'd have split workflows, forcing developers to use both.

What's more, we also wouldn't want to host MW deployment code (the
wmf/* branches) on Github for the same reason. Those aren't even
separate repos, they're branches within the mediawiki/core repo.
Having a split workflow for contributing code to the project on the
one hand and merging it into deployment on the other hand would be
even worse.

Roan



Re: [Wikitech-l] Serious alternatives to Gerrit

2012-07-25 Thread Roan Kattouw
On Wed, Jul 25, 2012 at 3:31 PM, Chad innocentkil...@gmail.com wrote:
 I haven't actually tried doing custom Javascript yet, but it should be
 completely doable via the GerritSite.html header that you can
 customize (in fact, I've got some other non-JS customizations I
 want to roll out there soon).
I've done this, stealing some code from the OpenStack-CI people. It's
in an abandoned change in the puppet repo:
https://gerrit.wikimedia.org/r/#/c/3285/2/files/gerrit/skin/GerritSiteHeader.html

Roan



Re: [Wikitech-l] Serious alternatives to Gerrit

2012-07-20 Thread Roan Kattouw
On Thu, Jul 19, 2012 at 11:52 PM, Antoine Musso hashar+...@free.fr wrote:
 How are they blind spots? You can diff each patchset with its previous one
 (though a rebase kind of screws it up).

 Whenever I send a new patchset, I try to add a cover message introducing
 what was done in the new patchset so people receive a mail notification
 explaining what has been done.

Yes, Gerrit handles this well (except for post-merge fixups, which
aren't handled well), and from what I've heard Github has improved and
now does a decent job as well. But I've seen other systems (such as
Barkeep) where the developers have massively underestimated the
complexity that fixups bring to pre-merge review.

Roan



Re: [Wikitech-l] How to tell MW that I want to use the HTTPS version?

2012-07-20 Thread Roan Kattouw
On Fri, Jul 20, 2012 at 6:10 AM, Chad innocentkil...@gmail.com wrote:
 It's because the URLs are based on who was using the wiki at the
 time and caused the e-mail to be generated. If I'm using HTTP,
 you'll get an HTTP e-mail.

This is no longer true, I fixed this about a year ago. E-mail
notifications now use the canonical URL, which is based on
$wgCanonicalServer. By default (and on the WMF cluster), this is HTTP.
We can't just use the recipient's protocol because there is no
protocol preference.
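For a third-party wiki that wants HTTPS links in its notification mail, the fix is simply to point the canonical server at an https:// URL in LocalSettings.php (the hostname below is a placeholder):

```php
$wgCanonicalServer = 'https://wiki.example.org';
```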

Roan



Re: [Wikitech-l] Serious alternatives to Gerrit

2012-07-19 Thread Roan Kattouw
On Thu, Jul 19, 2012 at 4:40 PM, Platonides platoni...@gmail.com wrote:
 It should be possible to only bisect on the left-branch of merges. If
 git doesn't have such feature, it should be added. In fact, it's very
 likely that it's what it uses when start and end are in the same line,
 but as the merged commits can also have that ancestor, it might also go
 there.

 We would need to store multiple commits as non-fastforwards, even if
 descending from head, but that's trivial.
 What I have trouble with is in imagining the UI for reviewing a topic
 branch, with perhaps several commits per step.

 Always registering as a merge has advantages, such as storing the
 merger name as the committer, but I'd like keeping them as
 fast-forwards for one-commit changes.
Yes, I suppose that's reasonable, if you no-ff multi-commit merges it
should be fine.

I completely agree with you and Subbu re needing to have an interface
for reviewing a multi-commit branch as a whole. I would prefer a
multi-commit model over Gerrit's amend model, but only if the UI for
followups is reasonable. I don't know of any existing tool that has
this. In fact, Gerrit is the only tool I'm aware of where it's clear
that the UI was designed with fixup commits in mind (even though
they're amends, which I agree is suboptimal). Fixup commits tend to be
a blind spot when people think about pre-commit review.

Roan



Re: [Wikitech-l] Serious alternatives to Gerrit

2012-07-19 Thread Roan Kattouw
On Thu, Jul 19, 2012 at 11:58 AM, Erik Moeller e...@wikimedia.org wrote:
 What are people's experiences with Gitorious? Does it seem
 sufficiently hackable to perhaps meet our needs, or does it have too
 many architectural flaws to be worthwhile?

 Here are a few things I've been able to find out from poking at it:

 * Unlike GitHub, it's fully open source (AGPL).
 * It's a Ruby on Rails codebase with lots of gem dependencies and a
 reputation for being hard to install (haven't tried it).
 * Like GitHub/Gerrit, it's not just a code review tool, but also
 manages repositories.
 * Like GitHub, it makes it easy to clone/fork entire repos and send a
 merge request for a series of commits, encouraging true decentralized
 development.
 * It supports both individual clones and team clones.
 * Like most tools, it supports inline comments in code review:
 http://blog.gitorious.org/2009/11/06/awesome-code-review/
 * Unlike Gerrit, it conveniently shows all diffs for a commit on a
 single page, e.g.:
 https://gitorious.org/gitorious/mainline/merge_requests/208
gitlab is an open source clone of GitHub (which means you can
self-host it) which pretty much has the same characteristics as listed
above:
* MIT licensed
* Ruby on Rails (makes me worry about installability and
performance/scalability)
* Doesn't manage repositories but integrates with gitolite which does
* Has most features that GitHub has AFAIK, but I haven't looked at it in depth

http://gitlabhq.com

 * Help build a realistic alternative to GitHub, probably by building
 on the open source toolset that's the closest right now in terms of
 its overall architectural approach and existing functionality. (My
 impression: Gitorious is closer in functionality but at a lower
 overall quality, Phabricator/Barkeep are much more limited in
 functionality right now and therefore would take a long time to be
 usable with our current workflow assumptions.)

gitlab might be this, but it's written in Ruby so presumably our
developer community would be less able to contribute to it. And I'm
pretty sure ops is not just gonna say "sure, no problem" if/when we
ask them to deploy a Ruby web app :)

Roan



Re: [Wikitech-l] Serious alternatives to Gerrit

2012-07-19 Thread Roan Kattouw
On Thu, Jul 19, 2012 at 6:29 PM, Roan Kattouw roan.katt...@gmail.com wrote:
 gitlab is an open source clone of GitHub (which means you can
 self-host it)
Forgot to say: thanks to Timo for bringing gitlab to my attention.

Roan



Re: [Wikitech-l] Serious alternatives to Gerrit

2012-07-18 Thread Roan Kattouw
On Wed, Jul 18, 2012 at 9:30 PM, Subramanya Sastry
ssas...@wikimedia.org wrote:
 (b) Commit amends hide evolution of an idea and the tradeoffs and
 considerations that went into development of something -- the reviews are
 all separate from git commit history.   All that is seen is the final
 amended commit -- the discussion and the arguments even that inform the
 commit are lost.  Instead, if as in (a), a topic branch was explicitly (or
 automatically created when a reviewer rejects a commit), and fixes are
 additional commits, then, review documentation could be part of the commit
 history of git and shows the considerations that went into developing
 something and the evolution of an idea.

 There was an email recently on wikitext-l where Mark Bergsma was asked to
 squash his commits (http://osdir.com/ml/general/2012-07/msg20847.html) -- I
 personally think it is a bad idea that is a result of the gerrit review
 model.  In a different review model, a suggestion like this would not be
 made.

Although squashing and amending has downsides, there is also an
advantage: now that Jenkins is set up properly for mediawiki/core.git
, we will never put commits in master that don't pass the tests. With
the fixup commit model, intermediate commits often won't pass the
tests or even lint, which leads to false positives in git bisect and
makes things like rolling back deployments harder.
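The git-bisect hazard is easy to demonstrate in a throwaway repo. In this entirely synthetic sketch (the commits, messages, and "test suite" are made up), the real regression is the last commit, but bisect blames the intermediate fixup commit because it also fails the test:

```shell
# Build a tiny repo: a good commit, a broken intermediate fixup,
# then the commit with the real regression.
dir=$(mktemp -d) && cd "$dir"
git init -q
git config user.email demo@example.org
git config user.name demo
echo ok > app.txt; git add app.txt; git commit -qm 'good: feature works'
echo WIP > app.txt; git commit -qam 'fixup: intermediate, fails tests'
echo regression > app.txt; git commit -qam 'bad: real regression'
# Bisect between the known-good root and broken HEAD; the stand-in
# "test suite" passes only when app.txt contains "ok".
git bisect start HEAD HEAD~2 >/dev/null
git bisect run sh -c 'grep -q "^ok$" app.txt' >/dev/null
first_bad=$(git show -s --format=%s refs/bisect/bad)
git bisect reset >/dev/null
echo "bisect blames: $first_bad"   # prints "bisect blames: fixup: intermediate, fails tests"
```

The false positive: bisect reports the fixup commit, not the commit that actually introduced the regression.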

Roan



Re: [Wikitech-l] Linux.conf.au CfP deadline about to close

2012-07-06 Thread Roan Kattouw
On Thu, Jul 5, 2012 at 6:17 AM, Sumana Harihareswara
suma...@wikimedia.org wrote:
 http://linux.conf.au/cfp

 Closes in about 24 hours.
This was extended by 2 weeks
https://twitter.com/linuxconfau/status/221225678127894528

Roan



Re: [Wikitech-l] getting Jenkins to run basic automated tests on commits to extensions

2012-07-05 Thread Roan Kattouw
On Thu, Jul 5, 2012 at 1:37 PM, Jon Robson jdlrob...@gmail.com wrote:
 If we do this running jslint on JavaScript would also be great. I have a
 git hook I stole from Yuvi that I'm currently using. Would be great to have
 Jenkins do this check for me as well...
I believe Timo is already working on getting jshint working on the
command line. There's a few commits to .jshint and .jshintignore
(IIRC) that are currently pending review.
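For context, command-line jshint picks up its options from a .jshintrc file at the project root (or via --config); a minimal illustrative sketch — not the actual options from those pending commits — could be:

```
{
  "browser": true,
  "jquery": true,
  "undef": true,
  "unused": true
}
```

With that in place, running `jshint resources/` lints everything under the directory, which is the same check Jenkins would then run per pushed change.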

Roan



Re: [Wikitech-l] getting Jenkins to run basic automated tests on commits to extensions

2012-07-05 Thread Roan Kattouw
On Thu, Jul 5, 2012 at 2:11 PM, Chad innocentkil...@gmail.com wrote:
 Yes, but we can't even get people to type `php -l` before pushing,
 which is why having Jenkins do this for us is crucial.

Yes, I don't disagree that jshint should be run by Jenkins. AIUI
Timo's work to make jshint work on the command line is prep work for
exactly that.

Roan



Re: [Wikitech-l] Barkeep code review tool

2012-07-01 Thread Roan Kattouw
On Sat, Jun 30, 2012 at 11:53 PM, Antoine Musso hashar+...@free.fr wrote:
 Roan Kattouw wrote:
 Yes, ops essentially uses a post-commit workflow right now, and that
 makes sense for them.

 ops also uses pre-commit review for non-ops people :-]

Yeah, that's right. What I meant to say (and thought I had said in
some form later in that message) was that the puppet repo has
post-commit review for most changes by ops staff, and pre-commit
review for everything else (non-ops staff, volunteers, and certain
changes by ops staff in some cases).

Roan



Re: [Wikitech-l] git commit history

2012-07-01 Thread Roan Kattouw
On Sun, Jul 1, 2012 at 5:30 PM, Subramanya Sastry ssas...@wikimedia.org wrote:

 One thing I just noticed when looking at the git history via gitk (on
 Ubuntu) is that the history looks totally spaghetti and it is hard to make
 sense of the history.  This seems to have happened since the switch to git
 and post-commit review workflow.  It might be worth considering this as
 well.  git pull --rebase (which I assume is being used) usually helps
 eliminate noisy merge commits, but I suspect something else is going on --
 post-review commit might be one reason.  Is this something that is worth
 fixing and can be fixed?  Is there some gerrit config that lets gerrit
 rebase before merge to let fast-forwarding and eliminate noisy merges?

Yes, this can be configured in Gerrit on a per-repo basis. I seem to
recall we had a reason for not enabling this but I don't remember that
discussion very well or whether it took place at all.
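The knob in question is Gerrit's per-project submit type, stored in project.config on the refs/meta/config branch; a sketch (option spelling varies between Gerrit versions, so treat this as illustrative):

```
[submit]
    action = rebase if necessary
```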

Roan



Re: [Wikitech-l] Barkeep code review tool

2012-06-30 Thread Roan Kattouw
On Fri, Jun 29, 2012 at 2:20 PM, Marcin Cieslak sa...@saper.info wrote:
 As seen on IRC:

 https://github.com/ooyala/barkeep/wiki/Comparing-Barkeep-to-other-code-review-tools

The most prominent feature of Barkeep mentioned on this page is that
it was built for a post-commit review workflow. Given that the reason
we moved MediaWiki to git was basically so that we could move *away*
from post-commit review, I don't think using Barkeep as-is would work.

Then again, from watching the demo video (see getbarkeep.org) it looks
like their UI is a lot better than Gerrit's, and I like features like
saved searches and most-commented-on-commits dashboards. Integrating
Barkeep or the UI/UX ideas from it with Gerrit (or vice versa --
integrating Gerrit's pre-commit review workflow support with Barkeep
-- but I think that would be harder) would be cool but I have no
concrete ideas as to how it could be done right now.

Roan



Re: [Wikitech-l] Barkeep code review tool

2012-06-30 Thread Roan Kattouw
On Sat, Jun 30, 2012 at 4:23 PM, Faidon Liambotis fai...@wikimedia.org wrote:
 Well, in the ops puppet repo though, we very often +2 commits ourselves
 and push them, instead of waiting for someone else to review/approve
 them. You could argue that it's our workflow that it's wrong, but I just
 think the needs for infrastructure-as-code might be different from the
 needs code development has.

 It's like asking for pre-execution reviews of whatever you type in a
 shell prompt and we can all agree that's just crazy :) In a perfect
 world we'd stage every change in VMs where we'd do local puppet commits
 without reviews; then push those tested changesets into the pre-commit
 review system to get into production. But we're very far from that and
 being perfectionists just hurts us more on our daily work.

 Having a proper post-commit review workflow would be much better than
 hacking around the system and reviewing commits ourselves. It'd also
 allow us to actually have post-commit reviews, something that rarely
 happens right now. At least I'd do that, while currently it's a PITA to
 do so.

Yes, ops essentially uses a post-commit workflow right now, and that
makes sense for them. Gerrit doesn't support post-commit review very
well so Barkeep might work better there. But other projects, such as
MediaWiki core, do use real pre-commit (or pre-merge, rather) review.

 Barkeep claims to work with both post- and pre-commit workflows,
 although the details elude me.

Per Subbu's message, you'd have to add support for pre-commit. Gerrit
manages the git repositories themselves, implements fine-grained
permission settings and handles gated branches both in terms of access
control (no one can push directly to master, only certain people can
approve things, etc.) and automatic merging.

 The UI is much *much* nicer than Gerrit's. They also have a demo
 website, which is a pleasure to work with IMHO.

Agreed.

 They also claim to have useful, relevant and configurable e-mail
 notifications too, in contrast to Gerrit's which are basically useless.
 Maybe I'm too much of a relic to prefer reading commit diffs in my mail
 client, rather than fancy web interfaces :)

I don't know what these useful e-mail notifications look like but I
would love to find out. Gerrit's aren't totally useless but they're
also not all that useful.

 All in all, I like it very much but I don't have a broad knowledge of
 how people use Gerrit right now and therefore I can't form an opinion on
 whether it's suitable for us.

I think that depends on your definition of "us". For the puppet repo,
a mix of pre-commit and post-commit (post-commit for immediate changes
made by ops people, pre-commit for changes submitted by non-ops people
(staff or volunteers) and for ops changes that can afford to wait for
review) would probably be best. I think a number of other projects,
such as the analytics repos or extensions that aren't deployed on the
WMF cluster, might prefer this as well. But for MediaWiki core and
deployed extensions I'm pretty sure we'll want to stick with
pre-commit review.

This makes it sound like using both Gerrit and Barkeep together might
work, but I don't think that's necessarily a good idea. These are the
problems I imagine we'd have:
* people having to learn two different tools in general: more
learning, possible confusion over what lives where and when to use
what
* review comments on one commit are spread across two systems, and you
can't see the Gerrit comments in Barkeep or vice versa
* ops staff works mostly with Barkeep, non-ops and volunteers work
mostly with Gerrit -- ops staff more likely to neglect Gerrit review
queue

So I think it would be much better if we could have proper integration
between the two, or maybe even implement pre-commit review in Barkeep.
The issue ticket that Subbu found [1] has a comment suggesting how a
pre-commit workflow might be implemented:
* push to feature branches
* run a script that polls Barkeep for approved commits in feature
branches and merges them into master
Problems with this approach are:
* you need access controls to prevent people from pushing to master
** this can be done with something like gitolite, or even with Gerrit
* Gerrit's immediate-merge-upon-approval feature is very nice
** this is easy to implement if Barkeep offers an event stream
* fixing things to address review isn't handled nicely
** to fix something you'd either add a fixup commit to your branch or
create a new branch with an amended commit
** in both cases the commits are separate, so the review comments are
spread out and aren't linked like patchsets in Gerrit
** it's easy to accidentally approve+merge an outdated branch, or to
forget all fixup commits
** (the fixup workflow is flawed in general; with the amend/new-branch
workflow, every single commit in master is safe to deploy, with the
fixup workflow there can be commits that are broken, don't pass the
tests, or even have syntax errors)
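The poll-and-merge script suggested above could be sketched like this; Barkeep's actual API isn't used here (approved_shas is a purely hypothetical stand-in that emits sample data), and the merge commands are echoed rather than executed:

```shell
# Hypothetical sketch of the polling merge script; nothing here talks
# to a real Barkeep instance or touches a real repository.
approved_shas() {
    # stand-in for "query Barkeep for approved feature-branch commits"
    printf 'abc123\ndef456\n'
}
cmds=$(for sha in $(approved_shas); do
    # a real script would run this via git, stopping on the first conflict
    echo "git merge --no-ff $sha"
done)
printf '%s\n' "$cmds"
```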

Per the points above I 

Re: [Wikitech-l] Registering JavaScript vars for only a certain resource loader module in php

2012-06-22 Thread Roan Kattouw
On Fri, Jun 22, 2012 at 12:19 PM, Daniel Werner
daniel.wer...@wikimedia.de wrote:
 I was wondering whether it is possible somehow to register a javascript var
 for a certain resource loader module in php?
This is a feature I've been wanting to have in ResourceLoader for a
while, but I haven't gotten around to implementing it yet.

 There is the
 OuputPage::addJsConfigVars function but I didn't find anything to only load
 a var only conditionally when a module is loaded. Right now I am
 registering a variable in the  ResourceLoaderGetConfigVars  hook, but
 actually, I only want to have it included when the 'wikibase' module is
 loaded since it is rather big and shouldn't be loaded when not necessary.

A workaround you can use is to write your own custom module (a
subclass of ResourceLoaderModule) that provides this variable. I did
this in VisualEditor recently, see
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/extensions/VisualEditor.git;a=commitdiff;h=ae48f152f92753a438fb9b371e2f70cd40ff179c
.

Roan



Re: [Wikitech-l] who can merge into core/master?

2012-06-15 Thread Roan Kattouw
On Fri, Jun 15, 2012 at 8:48 AM, Sumana Harihareswara
suma...@wikimedia.org wrote:
 If i want to be able to commit to extension X. i first have to file a
 request on the developer access page, and then another request for
 project membership? on a page that is named project ownership?

Membership != ownership. If you have a Gerrit account, you can submit
commits for review to any repository in the system, including your
favorite extension. What you seem to want is not to be able to
*commit* to extension X, but to *review* and *merge* proposed commits.
Obviously that's a higher level of trust and responsibility (because
there is no second level of review like there used to be, this *is*
the is-this-safe-for-deployment review), so it's not given out to just
anyone.

The ability to approve and merge is called ownership. There is no such
thing as extension membership, only global membership (by having a
Gerrit account): you're either an owner with special access to a
certain repo, or just another contributor with the same level of
access as anyone else.

 Just have *one* page for all requests. gerrit account, group
 membership, project ownership, gerrit repo... one request queue, at
 least for all things that need feedback/voting

 Maybe it would help to have a page that briefly describes the
 different roles, and which responsibilities and abilities they entail in
 terms of git and gerrit, how to request the respective permissions, and
 which policies govern the granting of these.

Yes, documentation would be useful. It's simpler than you seem to
think, but since it's not really documented anywhere it can seem
complex.

Roan



Re: [Wikitech-l] Revoking +2 (Re: who can merge into core/master?)

2012-06-15 Thread Roan Kattouw
On Fri, Jun 15, 2012 at 1:53 PM, Rob Lanphier ro...@wikimedia.org wrote:
 It would also be nice if we didn't have to resort to the nuclear
 option to get the point across.  One low-stakes way we can use to make
 sure people are more careful is to have some sort of rotating oops
 award.  At one former job I had, we had a Ghostbusters Stay Puft doll
 named Buster that was handed out when someone broke the build that
 they had to prominently display in their office.  At another job, it
 was a pair of Shrek ears that people had to wear when they messed
 something up in production.  In both cases, it was something you had
 to wear until someone else came along.  Perhaps we should institute
something similar (maybe as simple as asking people to append "OOPS"
 to their IRC nicks when they botch something).

whobrokeitthistime.wikimedia.org? :)

Roan


