The "remove from code review queue" is not that hard really, you can
just remove yourself (and reviewers you added) from reviewers. The most
helpful reviewers also comment on why they've removed themselves. If
reviewers removed themselves from patches they have no intention
whatsoever to review
PSR-3 logging has been fully supported in MediaWiki since 1.25wmf5.
We've been making various tuning improvements since then, including the
recent deprecation of the initial MWLogger wrapper class in favor of
direct usage of Psr\Log\LoggerInterface by wfDebugLog() and other
internal wrapper methods [2].
Huge +tons to everything below (and top posting to drive Mz up the wall)
:-)
As Erik said, I know I'm probably going to be driving Rachel and Quim nuts
with my hand-wringing and second-guessing as we figure out the pros and cons
of how things went this year so that we can keep improving this. But
On 2015-01-29 1:14 PM, Jon Robson wrote:
> Thanks for kicking off the conversation Brad :-)
>
> It's rough at the moment: I hacked it together and I'm more than happy to
> iterate on this and improve the reporting.
>
> On the subject of patch abandonment: Personally I think we should be
> abandoning in
Language fragmentation is always fun, but, as with any new language, my
concerns lie in the ecosystem: are there enough tools to make the advertised
benefits worth it? Does it have a decent IDE with smart code
completion, refactoring, and a good debugger? Does it have a
packaging/dependency system
I'm personally more excited about Rust. It is a true systems language with
a modern type system, does away with the GC for more predictable
performance and generally outperforms Go on CPU-bound tasks. It could
actually become an interesting option for a highly parallel Parsoid 2.0
version once its
Great blog post and awesome achievements!
I've spotted a small typo in the text: "that offer more compelling
featuresd."
On Thu, Jan 29, 2015 at 5:01 PM, Bryan Davis wrote:
> For the last four months, my main focus has been the Librarization
> project [0]. Today a wrap up blog post was posted t
(Sorry, this was meant for wikitech-l.)
On Thu, Jan 29, 2015 at 7:20 PM, Ori Livneh wrote:
> We should do the same, IMO.
> http://bowery.io/posts/Nodejs-to-Golang-Bowery/
>
On Thu, Jan 29, 2015 at 2:47 PM, Arlo Breault
wrote:
> There's a brief discussion of the security implications of
> some proposed solutions in the review of
> https://gerrit.wikimedia.org/r/#/c/181519/
>
To clarify, the possible solutions seem to be:
1. Unstrip the marker and then encode the
For the last four months, my main focus has been the Librarization
project [0]. Today a wrap up blog post was posted to
blog.wikimedia.org [1] that I'd invite all of you to read to get an
overview of what our high level goals and motivations were and what we
accomplished. The TL;DR is that we now h
As I mentioned to Nemo on the talk page, I want an easy way to see how
my code review efficiency compares to other projects and to see which
projects are getting more love than others. A few thoughts:
1) From
http://korma.wmflabs.org/browser/repository.html?repository=gerrit.wikimedia.org_mediawi
On Thu, Jan 29, 2015 at 12:56 PM, Jon Robson wrote:
>
> Introducing:
> https://www.mediawiki.org/wiki/Extension_health
Interesting! I'm hopping between flights back to Europe, and I don't have
time to review these metrics more carefully, but please check
http://korma.wmflabs.org/browser/gerrit
Brion Vibber wrote:
>Good point Yuri -- a lot of those items on my queue are assigned to
>several reviewers so none of us feels ownership, and that's definitely
>part of the reason some of them sit around so long.
>
>A regular bot run that assigns untouched review requests to a single
>person in Ph
Good point Yuri -- a lot of those items on my queue are assigned to several
reviewers so none of us feels ownership, and that's definitely part of the
reason some of them sit around so long.
A regular bot run that assigns untouched review requests to a single person
in Phab probably does make sens
Brion, I would love to use Gerrit more fully (that is, until we finally
migrate! :)), but Gerrit, to my knowledge, does not differentiate between a
CC (review if you want to) and a TO (I want you to +2). Having multiple cooks
means some patches don't get merged at all. I feel each patch should be
assig
Currently, while {{urlencode}}-ing, content in strip markers is skipped.
I believe this violates the expectation that the entire output
will be properly escaped for placement in a sensitive context.
An example is in the infobox book caption on
https://en.wikipedia.org/wiki/%22F%22_Is_for_Fugitive
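To illustrate the concern, here is a toy Python sketch, not MediaWiki's parser code (`MARKER` and the helper are invented), showing how percent-encoding that skips a strip marker leaves unescaped bytes in output the caller assumes is fully escaped:

```python
from urllib.parse import quote

MARKER = "\x7fUNIQ-x-QINU\x7f"  # stand-in for a parser strip marker

def encode_skipping_marker(text: str) -> str:
    # Mimics the reported behaviour: content covered by a strip
    # marker bypasses percent-encoding entirely.
    head, sep, tail = text.partition(MARKER)
    if not sep:
        return quote(text, safe="")
    return quote(head, safe="") + sep + quote(tail, safe="")

raw = "a&b" + MARKER + "c d"
print(encode_skipping_marker(raw))  # marker (and whatever it hides) survives raw
print(quote(raw, safe=""))          # fully encoded, as a caller would expect
```

When the marker is later unstripped, its raw expansion lands inside a context (a URL, an attribute) that everything else was escaped for.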
I'd like us to start by using the review request system already in gerrit
more fully.
Personally, I've got a bunch of incoming reviews in my queue where I'm not
sure of their current status, or whether it's wise to land them. :)
First step is probably to go through the existing old patches in
everybo
How about a simple script to create a Phabricator task after a few days (a
week?) of patch inactivity, asking someone to review that patch? It would
allow "assign to", let managers see each dev's review queue, and prevent
patches from falling through the cracks.
Obviously this won't be needed after we move
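A minimal sketch of the proposed script's core logic, assuming invented patch and threshold shapes (a real bot would query Gerrit for open changes and file tasks via Phabricator's API):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical stale-patch finder: given open patches with a
# last-activity timestamp, pick the ones idle longer than `threshold`.
# The dict shapes here are invented for illustration only.
def stale_patches(patches, now, threshold=timedelta(days=7)):
    return [p for p in patches if now - p["last_activity"] > threshold]

now = datetime(2015, 1, 29, tzinfo=timezone.utc)
patches = [
    {"id": 181519, "owner": "alice", "last_activity": now - timedelta(days=12)},
    {"id": 181520, "owner": "bob", "last_activity": now - timedelta(days=2)},
]
for p in stale_patches(patches, now):
    print(f"Would file task: review change {p['id']} (assign to {p['owner']})")
```

Filing one task per idle patch gives each change a single accountable reviewer, which is exactly the "assign to" property Gerrit's shared-reviewer list lacks.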
This is a situation where disciplined testing can come in really handy.
If I submit a patch, and the patch passes the tests that have been
specified for the feature it implements (or the bug it fixes), and the code
coverage is sufficiently high, then a reviewer has a running start in terms
of conf
Thanks for kicking off the conversation Brad :-)
It's rough at the moment: I hacked it together and I'm more than happy to
iterate on this and improve the reporting.
On the subject of patch abandonment: Personally I think we should be
abandoning inactive patches. They cause unnecessary confusion to
s
On Thu, Jan 29, 2015 at 12:56 PM, Jon Robson wrote:
> The average time for code to go from submitted to merged appears to be
> 29 days over a dataset of 524 patches, excluding all that were written
> by the L10n bot. There is a patchset there that has been _open_ for
> 766 days - if you look at i
I was really happy to hear Damon, at the MediaWiki Developer Summit,
ask us how long we take to code review and whether we had communicated
a timeframe in which we promised to do it to our community. He quite
rightly stressed that this was vital for the survival of our
community. I spoke to one of
Brion,
Doubling the memory limit resolved my problem. Thanks for the suggestion
(and the quick response).
Keith Welter
On Thu, Jan 29, 2015 at 9:58 AM, Brion Vibber wrote:
> Memory limits are notoriously difficult to deal with on Unix-like systems.
> Virtual memory address space can easily be e
Memory limits are notoriously difficult to deal with on Unix-like systems.
Virtual memory address space can easily be exhausted by mapping in the
program itself and its libraries, and sometimes even scratch files are
memory-mapped, eating up address space further.
Usage also tends to be higher on 6
Hi!
> I believe we can make installing a fully-featured MediaWiki service system
> as simple as copy&pasting 2-3 lines to a shell, or even executing a remote
I think this is underestimating the issue, unfortunately. I.e., even in the
ideal situation, the user runs the same system, with the same services
The GraphViz extension uses wfShellExec() to invoke the "dot" command.
Sometime in the last month or so, on my Ubuntu 14.04 installation, the
command started failing with:
Warning: Could not load "/usr/lib/graphviz/libgvplugin_gd.so.6" - file
not found
The file does exist and the dot command runs
> As to JSON, IMHO YAML is better, more human-readable and less verbose
I agree. As a human, YAML is rather nicer to read (and write) than JSON.
Fortunately it's pretty easy to convert one to the other. We've even made
some tweaks to Swagger UI to support both YAML and JSON, and we're
currently
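In practice a real YAML library (e.g. PyYAML's yaml.safe_dump / yaml.safe_load) handles the conversion; purely to keep this sketch self-contained, a toy emitter for flat mappings shows the readability difference side by side:

```python
import json

# Toy YAML emitter for flat mappings, just to contrast the two
# syntaxes; real conversions should use a proper YAML library.
def to_yaml_flat(mapping: dict) -> str:
    return "\n".join(f"{k}: {v}" for k, v in mapping.items())

spec = {"swagger": "2.0", "title": "Example API", "basePath": "/api"}
print(json.dumps(spec, indent=2))  # braces, quotes, commas
print(to_yaml_flat(spec))          # bare key: value lines
```

Since YAML is (for this kind of data) a superset of the same key/value model, round-tripping between the two formats loses nothing.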
As of
https://gerrit.wikimedia.org/r/#/c/29879/2/utils/MessageTable.php,cm ,
Linker::link took 20 KiB of memory per call. Cf.
http://laxstrom.name/blag/2013/02/01/how-i-debug-performance-issues-in-mediawiki/
I don't know if such bugs/unfeatures and related best practices were
written down somew
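The linked post measures per-call memory cost in PHP; a rough Python analogue of the same technique uses tracemalloc (`make_link` is an invented stand-in, not Linker::link):

```python
import tracemalloc

def make_link(title: str) -> str:
    # Stand-in for an expensive helper like Linker::link; builds
    # strings so each retained result has measurable memory cost.
    return "<a href='/wiki/%s'>%s</a>" % (title, title.replace("_", " "))

tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()
links = [make_link("Page_%d" % i) for i in range(1000)]
after, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()
print("approx bytes per call: %.1f" % ((after - before) / len(links)))
```

Averaging over many calls smooths out allocator noise, the same idea as profiling a tight loop rather than a single invocation.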
On 28/01/15 06:43, Erik Moeller wrote:
Just a quick note that I really appreciated everyone's help making the
summit come together. As always, we'll be doing lots of second-guessing of
everything we did and didn't do, and how we want to use future time
together. Before we go into that, I'd like t
On Wed, Jan 28, 2015 at 11:29 PM, Brian Gerstle
wrote:
> JSON Schema is a recurring theme here which I'd like to encourage. I've
> thought it was a promising idea and would like to explore it further, both
> on the client and server side. If we can somehow keep data schema and API
> specificati