don't merge anything into core please. Rip the stuff out of core and make
extensions from them, and finally, please make core lightweight and faster
On Wed, Feb 6, 2013 at 8:34 AM, Matma Rex matma@gmail.com wrote:
On Wed, 06 Feb 2013 04:31:13 +0100, Tim Starling tstarl...@wikimedia.org
On Sat, Dec 1, 2012 at 7:32 AM, Quim Gil q...@wikimedia.org wrote:
Are you going to FOSDEM? If so (or if you are considering going) please add
yourself to
http://www.mediawiki.org/wiki/Events/FOSDEM
I still don't know. Depends on whether we have a MediaWiki EU critical mass.
--
Quim Gil
On Tue, Feb 5, 2013 at 10:31 PM, Tim Starling tstarl...@wikimedia.org wrote:
On 05/02/13 13:02, Sébastien Santoro wrote:
Good evening,
Hashar and I discussed the Vector extension this morning on
#wikimedia-tech, and we wondered whether the code shouldn't be merged into
core.
Rationale: Vector
There will be (actually, there is already) a web API offering the kind of
data required, and for client wikis not running on WMF infrastructure this
will eventually be the way to access the data.
For WMF clients, like the Wikipedias, our decision was not to use HTTP web
requests, but to
Heads-up to API developers and to deployment reviewers.
http://lists.wikimedia.org/pipermail/glam/2013-January/thread.html#310
is the full discussion thread.
-Sumana
Original Message
Subject: Re: [GLAM] Wiki GLAM Toolset project (GLAM Digest, Vol 18,
Issue 11)
Date: Thu, 24
On Wed, Feb 6, 2013 at 10:35 AM, Zaki Akhmad zakiakh...@gmail.com wrote:
The FOSDEM video team had released this talk. You may visit the video URL
here:
http://video.fosdem.org/2013/lightningtalks/How_to_hack_on_Wikipedia.webm
Thanks for letting us know. I have added a link to the video to the
Last week we had our first Bug Day of the year.
*How it Went*
We looked at bugs (excluding enhancements) that had not seen any changes in
over a year, a little over 250 bugs. The bugs came from a number of
different products and components. We started looking at the bugs in
General/Unknown.
On Wed, Feb 6, 2013 at 11:37 AM, Valerie Juarez
valerie.m.jua...@gmail.com wrote:
Last week we had our first Bug Day of the year.
*How it Went*
We looked at bugs (excluding enhancements) that had not seen any changes in
over a year, a little over 250 bugs. The bugs came from a number of
On Wed, Feb 6, 2013 at 12:37 PM, bawolff bawolff...@gmail.com wrote:
On Wed, Feb 6, 2013 at 11:37 AM, Valerie Juarez
valerie.m.jua...@gmail.com wrote:
Last week we had our first Bug Day of the year.
*How it Went*
We looked at bugs (excluding enhancements) that had not seen any changes
On 02/06/2013 03:53 AM, Petr Bena wrote:
don't merge anything into core please. Rip the stuff out of core and make
extensions from them, and finally, please make core lightweight and faster
The speed of MediaWiki isn't directly related to the amount of code in
core. Speed is all about keeping
On 02/06/2013 04:49 AM, Denny Vrandečić wrote:
There will be (actually, there is already) a web API offering the kind of
data required, and for client wikis not running on WMF infrastructure this
will eventually be the way to access the data.
For WMF clients, like the Wikipedias, our
On Wed, Feb 6, 2013 at 8:54 AM, Gabriel Wicke gwi...@wikimedia.org wrote:
Local HTTP requests have pretty low overhead (1-2ms), but api.php
suffers from high start-up costs (35-40ms). This is more an issue with
api.php and the PHP execution model than with HTTP though, and might be
improved in
* Chris Steipp wrote:
On Wed, Feb 6, 2013 at 8:54 AM, Gabriel Wicke gwi...@wikimedia.org wrote:
Local HTTP requests have pretty low overhead (1-2ms), but api.php
suffers from high start-up costs (35-40ms). This is more an issue with
api.php and the PHP execution model than with HTTP though, and
On 02/06/2013 09:29 AM, Chris Steipp wrote:
On Wed, Feb 6, 2013 at 8:54 AM, Gabriel Wicke gwi...@wikimedia.org wrote:
Local HTTP requests have pretty low overhead (1-2ms), but api.php
suffers from high start-up costs (35-40ms). This is more an issue with
api.php and the PHP execution model
On Wed, Feb 6, 2013 at 10:37 AM, bawolff bawolff...@gmail.com wrote:
On Wed, Feb 6, 2013 at 11:37 AM, Valerie Juarez
valerie.m.jua...@gmail.com wrote:
- Hosting the event in a different IRC channel
- We held the event in #wikimedia-dev. We were able to get help from
developers
On 6 February 2013 18:42, Mark A. Hershberger m...@everybody.org wrote:
On 02/06/2013 03:53 AM, Petr Bena wrote:
don't merge anything into core please. Rip the stuff out of core and make
extensions from them, and finally, please make core lightweight and faster
The speed of MediaWiki isn't
Forwarding to wikitech with RFC:
Currently the mobile site scrubs any elements with the noprint class.
It also scrubs elements with the nomobile class. Each scrub of an
element affects performance [citation needed], so this would be a good
thing to do.
It has been suggested that we stop scrubbing
Hey,
Just wanted to bring in few points.
If it's in core it would be easy to tackle bug [1] (note the votes it has) and
more that may come up.
Somewhat related, I think MediaWiki should automatically create
disambiguation pages [2].
[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=3483
[2]
On Wed, Feb 6, 2013 at 10:04 AM, Bjoern Hoehrmann derhoe...@gmx.net wrote:
* Chris Steipp wrote:
On Wed, Feb 6, 2013 at 8:54 AM, Gabriel Wicke gwi...@wikimedia.org wrote:
Local HTTP requests have pretty low overhead (1-2ms), but api.php
suffers from high start-up costs (35-40ms). This is more
On 02/06/2013 01:14 PM, Niklas Laxström wrote:
More refactorings and rewrites are needed (like ContentHandler) to get
MediaWiki back to enabling the development of new ideas rather than
limiting it. At the same time, constantly changing interfaces and
breaking existing extensions will annoy
On 02/06/2013 10:49 AM, Chris Steipp wrote:
In general, it seems to me like there will be more attacks opened up
by having lua open network requests to the api, than there would be by
defining an internal api.
Initially the use case will be providing access to the Wikidata API, not
the
On Wed, Feb 6, 2013 at 10:24 AM, Jon Robson jdlrob...@gmail.com wrote:
Forwarding to wikitech with RFC:
Currently the mobile site scrubs any elements with the noprint class.
It also scrubs elements with the nomobile class. Each scrub of an
element affects performance [citation needed], so this
Hey,
I think Niklas's comment is spot on:
Rather than absolute size, we should care about core that is modular
and has well-defined interfaces. Moving stuff to extensions is one way
to encourage this, as then the functionality cannot be relied on or it
can be replaced with another
Please don't forget about the hybrid approach -- the API supports
FauxRequest, so an API call can be made without doing a web call, using an
internal one instead, without any JSON or startup overhead:
http://www.mediawiki.org/wiki/API:Calling_internally
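For reference, a minimal sketch of that pattern (the specific action/meta parameters here are just illustrative; see the page above for the documented interface):

```php
<?php
// Inside a MediaWiki request context: run an API query without HTTP.
// FauxRequest wraps the parameters so ApiMain can execute them internally.
$req = new FauxRequest( array(
	'action' => 'query',
	'meta' => 'siteinfo',
) );
$api = new ApiMain( $req );
$api->execute();               // runs the API module in-process
$data = $api->getResultData(); // plain PHP array, no JSON round-trip
```

This only works from inside a wiki's own PHP process, which is exactly the local case being discussed.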
On Wed, Feb 6, 2013 at 2:08 PM, Gabriel Wicke
Hey,
Why is this thread not titled "return of the bikeshed"? I've seen this one
so many times over the past few days. Wonder how many man-days have been
wasted on this :)
It doesn't seem faster to me.
Those benchmarks are only testing some rather small arrays no?
I did some stuff a while back
On Wed, Feb 6, 2013 at 8:36 PM, Jeroen De Dauw jeroended...@gmail.com wrote:
Currently our coding conventions have major prohibitions on the use of
isset() ...
Oh, you would not think so if you did a grep on core.
There are valid uses for isset, and as far as I am aware, those are not
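For anyone who wants to reproduce the comparison locally, a standalone micro-benchmark along these lines is easy to write (timings vary with PHP version and hardware, so treat any numbers as indicative only):

```php
<?php
// Micro-benchmark: isset() vs. empty() on an existing array key.
$arr = array( 'foo' => 1 );
$n = 1000000;

$start = microtime( true );
for ( $i = 0; $i < $n; $i++ ) {
	$dummy = isset( $arr['foo'] );
}
$issetTime = microtime( true ) - $start;

$start = microtime( true );
for ( $i = 0; $i < $n; $i++ ) {
	$dummy = empty( $arr['foo'] );
}
$emptyTime = microtime( true ) - $start;

printf( "isset: %.4fs  empty: %.4fs\n", $issetTime, $emptyTime );
```

As noted above, a loop over one small array says little about real-world code; larger arrays and missing keys are worth testing too.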
On Wed, Feb 6, 2013 at 11:54 AM, Gabriel Wicke gwi...@wikimedia.org wrote:
It should be possible to hide optimization details like local DB access
vs. an actual HTTP request behind the same interface. A URL-based query
interface can support local handlers for specific URL prefixes.
Or the
On Wed, Feb 6, 2013 at 2:39 PM, Petr Bena benap...@gmail.com wrote:
Hi,
this is the second time I am opening this - last time we decided to merge
wikitech and labsconsole to make a documentation base for developers and
operation engineers (at least those working at labs - I consider operation
Just want to summarize and make sure I've got the right conclusions, as
this thread has wandered a bit.
*1. X-MF-Mode: Alpha/Beta Site Usage*
We'll roll this into the X-CS header, which will now be KV-pairs (using
normal URL encoding), and set by Varnish. This will avoid an explosion of
On Wed, Feb 6, 2013 at 11:48 AM, Chad innocentkil...@gmail.com wrote:
Well, we enabled transwiki importing on labsconsole from wikitech (see
https://labsconsole.wikimedia.org/wiki/Special:Import). I don't see any
reason we couldn't start importing things.
Being someone who often uses
On Wednesday, February 6, 2013, David Schoonover wrote:
Just want to summarize and make sure I've got the right conclusions, as
this thread has wandered a bit.
*1. X-MF-Mode: Alpha/Beta Site Usage*
We'll roll this into the X-CS header, which will now be KV-pairs (using
normal URL
That all sounds fine to me so long as we're all agreed.
--
David Schoonover
d...@wikimedia.org
On Wed, Feb 6, 2013 at 12:59 PM, Asher Feldman afeld...@wikimedia.org wrote:
On Wednesday, February 6, 2013, David Schoonover wrote:
Just want to summarize and make sure I've got the right
On 02/06/2013 11:43 AM, Brad Jorsch wrote:
On Wed, Feb 6, 2013 at 11:54 AM, Gabriel Wicke gwi...@wikimedia.org wrote:
It should be possible to hide optimization details like local DB access
vs. an actual HTTP request behind the same interface. A URL-based query
interface can support local
On Wednesday, February 6, 2013, David Schoonover wrote:
That all sounds fine to me so long as we're all agreed.
Lol. RFC closed.
--
David Schoonover
d...@wikimedia.org
On Wed, Feb 6, 2013 at 12:59 PM, Asher Feldman afeld...@wikimedia.org wrote:
On
On 02/06/2013 02:24 PM, Jeroen De Dauw wrote:
MediaWiki is currently closer to being a ball of mud than a set of core
components nicely separated and working together via clean interfaces.
I don't disagree. But we're also a strange amalgamation of volunteers,
administrators of wikis,
So basically here's my goal (it's the same as "if (" vs. "if("). I don't care
about reaching consensus on either side, because such an attempt is futile.
The only thing I want to at least get some support on is that a given
patchset should not be blocked from merging just because it uses empty() in
a
On Wed, Feb 6, 2013 at 4:04 PM, Gabriel Wicke gwi...@wikimedia.org wrote:
On 02/06/2013 11:43 AM, Brad Jorsch wrote:
On Wed, Feb 6, 2013 at 11:54 AM, Gabriel Wicke gwi...@wikimedia.org wrote:
It should be possible to hide optimization details like local DB access
vs. an actual HTTP request
On Wed, Feb 6, 2013 at 12:45 PM, Steven Walling steven.wall...@gmail.com wrote:
On Wed, Feb 6, 2013 at 11:48 AM, Chad innocentkil...@gmail.com wrote:
Well, we enabled transwiki importing on labsconsole from wikitech (see
https://labsconsole.wikimedia.org/wiki/Special:Import). I don't see any
On Wed, Feb 6, 2013 at 1:41 PM, Ryan Lane rlan...@gmail.com wrote:
First, we'd be merging them and renaming labsconsole to wikitech. Second,
it's absurd to have operations documentation spread across two places, and
wikitech is almost solely operations documentation. Third, it's easier to
On 02/06/2013 01:30 PM, Brad Jorsch wrote:
-- Would fetch JSON from
-- http://wikidata.org/api/query/?param1=foo&param2=bar
-- if no local handler is defined and the base URL is in a whitelist
jsonObject = JSONRequest( 'http://wikidata.org/api/query/',
    { param1 = 'foo',
Le 06/02/13 20:39, Petr Bena a écrit :
Hi,
this is the second time I am opening this - last time we decided to merge
wikitech and labsconsole to make a documentation base for developers and
operation engineers (at least those working at labs - I consider operation
of bots and such services
Hey,
For that matter, what is it?
Let's assume it means not repeating the tight-coupling and
badly-defined-interface mistakes made in the past:
The problem is: who is going to do it?
How is it going to be accomplished?
Everyone that's writing new code or refactoring old code. These
Le 06/02/13 20:24, Jeroen De Dauw a écrit :
MediaWiki is currently closer to being a ball of mud than a set of core
components nicely separated and working together via clean interfaces. We
have so many components that could just as well be independent from the
rest of the codebase, but they
On Wed, Feb 6, 2013 at 4:58 PM, Jeroen De Dauw jeroended...@gmail.com wrote:
No magic involved. If we all pay attention to these points, the codebase as
a whole will gradually increase in quality. There is no need for someone to
go rewrite things, though if someone feels like doing that, it'll
Le 06/02/13 22:22, Tyler Romeo a écrit :
So basically here's my goal (it's the same as "if (" vs. "if("). I don't care
about reaching consensus on either side, because such an attempt is futile.
I prefer unless( !$foo ) but that is just me.
The only thing I want to at least get some support on
On 07/02/13 06:36, Jeroen De Dauw wrote:
Those benchmarks are only testing some rather small arrays no?
I did some stuff a while back as well:
https://github.com/JeroenDeDauw/php/blob/master/profile/array-empty.php
That is an interesting way to do benchmarks. XHProf has a bug in it
which
On 02/05/2013 04:45 AM, MZMcBride wrote:
you have a link to a demo?
https://en.wikipedia.org/wiki/Main_Page?tour=test
And gettingstarted. E.g.
https://en.wikipedia.org/wiki/Wikipedia?tour=gettingstarted .
Matt Flaschen
___
Wikitech-l mailing list
On 02/05/2013 10:31 PM, Tim Starling wrote:
On 06/02/13 00:12, Matma Rex wrote:
Vector is a weird mess of extensions to the skin, extensions to
general functionality, and unused broken scripts. Merging it
properly would require some work and some deleting.
That sounds like a good reason to
On 06/02/13 19:53, Petr Bena wrote:
don't merge anything into core please. Rip the stuff out of core and make
extensions from them, and finally, please make core lightweight and faster
There are lots of lightweight wiki engines to choose from. MediaWiki
was never going to be among them, it had
On 6 February 2013 23:58, Tim Starling tstarl...@wikimedia.org wrote:
I think we can turn MediaWiki into a fully featured wiki engine which
can compete with the likes of Confluence. I don't think it can ever
compete with TiddlyWiki or UseModWiki in their respective niches.
(veering slightly
On Wednesday, February 6, 2013 at 3:58 PM, Tim Starling wrote:
I think we can turn MediaWiki into a fully featured wiki engine which
can compete with the likes of Confluence.
What would it take?
--
Ori Livneh
I don't think people are arguing to keep it lightweight like Tiddly or
UseMod, but rather to keep it on a platform similar to what we already
have, and to keep the core in a position where it can have features that
would be widely used by most installations, and the cruft that is more
rarely used to be housed
On my local Vagrant MediaWiki install
(https://github.com/wikimedia/wmf-vagrant), I'm getting:
Running test BUG 1887, part 2: A math with a thumbnail- math
enabled... No lock manager defined with the name `nullLockManager`.
when running the parser tests:
php tests/parserTests.php
nullLockManager is defined in Setup.php. The code:
LockManagerGroup::singleton()->get( 'nullLockManager' );
... works fine in eval.php and is used in production.
Ori Livneh wrote:
On Wednesday, February 6, 2013 at 3:58 PM, Tim Starling wrote:
I think we can turn MediaWiki into a fully featured wiki engine which
can compete with the likes of Confluence.
What would it take?
Off-hand, it looks like:
* better enterprise support,
* granularized read
On Wed, Feb 6, 2013 at 4:48 PM, Gabriel Wicke gwi...@wikimedia.org wrote:
jsonData = JSONAPI.wikidata.request( { param1 = 'foo', param2 = 'bar' } )
At that point, we may as well make it mw.wikidata.request{
param1 = 'foo', param2 = 'bar' } (or mw.ext.wikidata.request; no one has
replied to that suggestion yet).
Dear Wikimedians,
Wikimedia Commons is happy to announce that the 2012 Picture of the Year
competition is now open. To see the candidate images, just go to the
POTY 2012 page on Wikimedia Commons at
https://commons.wikimedia.org/wiki/Commons:Picture_of_the_Year/2012/Introduction
to vote
On 07/02/13 11:16, K. Peachey wrote:
I don't think people are arguing to keep it lightweight like Tiddly or
UseMod, but rather to keep it on a platform similar to what we already
have, and to keep the core in a position where it can have features that
would be widely used by most installations and the
On 07/02/13 11:05, David Gerard wrote:
What in Confluence are you thinking of MediaWiki as lacking?
The main two in my experience are (1) firm access control (2) WYSIWYG
editor. (1) is just not something MW can promise to do securely unless
pretty much every developer cared and (2) is on the
On Wed, Feb 6, 2013 at 10:13 PM, Željko Filipin zfili...@wikimedia.org wrote:
I have noticed that they use a few mirrors. Could the format that they are
using be hosted at Commons?
Željko
--
[1] https://www.mediawiki.org/wiki/How_to_contribute/Presentation
Wow, I didn't know we could upload
As many people know, our current search infrastructure has caused a
few problems with the site. It's an area that was greatly improved by
the work that Robert Stojnic (a.k.a. Rainman) did in 2008, but he
hasn't had the time to keep up with it, and to date, the WMF hasn't
invested a lot in further
Some notes that will be of interest to the community now that this
quarter's LevelUp is underway:
https://www.mediawiki.org/wiki/Mentorship_programs/LevelUp/Q1_2013
* If you have AbuseFilter patches that need review, please toss them to
Matthias Mullie so he can review them, towards getting AF
On 02/07/2013 01:02 AM, Sumana Harihareswara wrote:
Who's we? WMF isn't using any for obvious reasons (arbitrary code
injection).
We is the Wikimedia technical community, I'd wager. :-)
My point was three different people made those extensions, and while we
can encourage them to work
Yes, but we don't have to keep all three extension pages. As the MediaWiki
community, don't we want to make things simpler for sysadmins? We can't
force them to work together, but I don't see how keeping all three
extensions on MW.org is useful.
--
Tyler Romeo
Stevens Institute of Technology,
On 02/07/2013 01:17 AM, Tyler Romeo wrote:
Yes, but we don't have to keep all three extension pages. As the MediaWiki
community, don't we want to make things simpler for sysadmins? We can't
force them to work together, but I don't see how keeping all three
extensions on MW.org is useful.
I
I think that this is a solution in search
of a problem.
-Chad
On Feb 7, 2013 1:18 AM, Tyler Romeo tylerro...@gmail.com wrote:
Yes, but we don't have to keep all three extension pages. As the MediaWiki
community, don't we want to make things simpler for sysadmins? We can't
force them to work
It's one thing to have redundant extensions, it's another to have three
different extensions (two of which have names differing by one letter) that
are exactly the same thing, as in line-for-line they do the same exact
thing.
--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major
On 2/7/2013 12:24 AM, Chad wrote:
I think that this is a solution in search
of a problem.
How about saying that extensions on mw.org shouldn't expose security
vulnerabilities to wikis running them. That would probably be a better
metric.
Until we forbid pasting code on-wiki,
I don't see how there's anything we
can do.
-Chad
On Feb 7, 2013 1:28 AM, Q overlo...@gmail.com wrote:
On 2/7/2013 12:24 AM, Chad wrote:
I think that this is a solution in search
of a problem.
How about saying, extensions on mw.org shouldn't expose
On Thu, Feb 7, 2013 at 1:23 AM, MZMcBride z...@mzmcbride.com wrote:
Ori Livneh wrote:
On Wednesday, February 6, 2013 at 3:58 PM, Tim Starling wrote:
I think we can turn MediaWiki into a fully featured wiki engine which
can compete with the likes of Confluence.
What would it take?
* a