[Wikitech-l] Using wiki pages as databases

2013-02-19 Thread MZMcBride
Hi.

In the context of https://bugzilla.wikimedia.org/show_bug.cgi?id=10621,
the concept of using wiki pages as databases has come up. We're already
beginning to see this:

* https://en.wiktionary.org/wiki/Module:languages (over 30,000 lines)
* https://en.wikipedia.org/wiki/Module:Convertdata (over 7,400 lines)

At large enough sizes, the in-browser syntax highlighting is currently
problematic. But it's also becoming clear that the larger underlying
problem is that using a single request wiki page as a database isn't
really scalable or sane.

(ParserFunction #switch's performance used to prohibit most ideas of using
a wiki page as a database, as I understand it.)

Has any thought been given to what to do about this? Will it require
manually paginating the data over collections of wiki pages? Will this be
something to use Wikidata for?

MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using wiki pages as databases

2013-02-19 Thread Johnuniq
On Feb 19, 2013 at 9:11 PM, MZMcBride wrote:
 https://en.wikipedia.org/wiki/Module:Convertdata

I'm guilty of that, and what's been worrying me is that there are
hundreds more units to add. Some guidance on using Lua as a database
would be very desirable.

Quick tests suggest that if {{convert}} is used 100 times on a page
(where that template invokes Module:Convert, which requires
Module:Convertdata), then Convertdata is loaded 100 times. I've
wondered whether a module like that could carry a pragma marking it
read-only (at least a promise of read-only, even if it were not
enforced), so the bytecode could be cached more aggressively and
loaded only once per page render, or even only once until the cache
memory is flushed.

Or, if performance due to such module abuse is a problem, the data
could be split into, say, ten modules, and the code accessing the data
could work out which of the smaller data modules needed to be
required. I'm not going to worry about that until I have to, but some
guidance would be good.
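
For illustration, a minimal sketch of that splitting approach might look
like the following; the submodule names (Module:Convertdata/1 through
Module:Convertdata/10) and the bucketing rule are hypothetical, not
anything that exists in Module:Convert today:

  -- Sketch only: lazy, sharded loading of a data module split into ten parts.
  local loaded = {}  -- shards loaded so far

  local function shardFor( unitName )
      -- Hypothetical bucketing rule: sum the byte values of the name, mod 10.
      local sum = 0
      for i = 1, #unitName do
          sum = sum + string.byte( unitName, i )
      end
      return ( sum % 10 ) + 1
  end

  local function getUnit( unitName )
      local shard = shardFor( unitName )
      if loaded[shard] == nil then
          -- Only the shards that are actually needed get loaded and executed.
          loaded[shard] = require( 'Module:Convertdata/' .. shard )
      end
      return loaded[shard][unitName]
  end

  return { getUnit = getUnit }

Each #invoke would then execute only the shards it actually touches,
rather than the whole 7,400-line table every time.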

I just had a quick look at one test page which invokes the module 66
times, and the NewPP limit report in the html source says Lua time
usage: 0.324s (5 ms/invoke).
http://en.wikipedia.org/wiki/Template:Convert/testcases/bytype/time

Johnuniq

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using wiki pages as databases

2013-02-19 Thread Tim Starling
On 19/02/13 21:11, MZMcBride wrote:
 Hi.
 
 In the context of https://bugzilla.wikimedia.org/show_bug.cgi?id=10621,
 the concept of using wiki pages as databases has come up. We're already
 beginning to see this:
 
 * https://en.wiktionary.org/wiki/Module:languages (over 30,000 lines)
 * https://en.wikipedia.org/wiki/Module:Convertdata (over 7,400 lines)
 
 At large enough sizes, the in-browser syntax highlighting is currently
 problematic.

We can disable syntax highlighting over some size.

 But it's also becoming clear that the larger underlying
 problem is that using a single request wiki page as a database isn't
 really scalable or sane.

The performance of #invoke should be OK for modules up to
$wgMaxArticleSize (2MB). Whether the edit interface is usable at such
a size is another question.

 (ParserFunction #switch's performance used to prohibit most ideas of using
 a wiki page as a database, as I understand it.)

Both Lua and #switch have O(N) time order in this use case, but the
constant you multiply by N is hundreds of times smaller for Lua.

 Has any thought been given to what to do about this? Will it require
 manually paginating the data over collections of wiki pages? Will this be
 something to use Wikidata for?

Ultimately, I would like it to be addressed in Wikidata. In the
meantime, multi-megabyte datasets will have to be split up, for
$wgMaxArticleSize if nothing else.

-- Tim Starling


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit reports

2013-02-19 Thread Chad
On Tue, Feb 19, 2013 at 2:36 AM, MZMcBride z...@mzmcbride.com wrote:
 Hi.

 I wrote https://www.mediawiki.org/wiki/Gerrit/Reports over the weekend.


This is really cool--glad to see the new API getting some usage.

-Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using wiki pages as databases

2013-02-19 Thread Tyler Romeo
So unfortunately I don't have a clear idea of what the problem is, primarily
because I don't know anything about the Parser and its inner workings, but
as far as having all the data in one page goes, here's something. Maybe this
is a bad idea, but how about having a PHP-array content type? In other words,
MyNamespace:MyPage would render the entire data structure, but
MyNamespace:MyPage/index/test/0 would return $arr['index']['test'][0]. In the
database, it would be stored as individual sub-pages: leaf sub-pages would
render exactly like a normal page, while non-leaf pages would build the array
from all child sub-pages and display it to the user. Would this solve the
problem? If so, I've put some thought into it and would maybe be willing to
draft an extension giving such a capability.
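
To make the mapping concrete, the lookup side of such a content type is
essentially path-splitting followed by nested indexing. A rough sketch,
written in Lua only for consistency with the rest of this thread (the
proposal itself is about a PHP content type, and the function name is
purely illustrative):

  -- Maps a subpage path such as "index/test/0" onto nested table access,
  -- i.e. data["index"]["test"][0] in the example above.
  local function lookup( data, subpagePath )
      local node = data
      for key in string.gmatch( subpagePath, '[^/]+' ) do
          -- Treat purely numeric path components as numeric indexes.
          node = node[ tonumber( key ) or key ]
          if node == nil then
              return nil  -- no such leaf sub-page
          end
      end
      return node
  end

  -- lookup( { index = { test = { [0] = 'foo' } } }, 'index/test/0' )  -->  'foo'

The write side (rebuilding the leaf sub-pages when the top-level page is
edited) is of course the harder part.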

*--*
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com


On Tue, Feb 19, 2013 at 7:27 AM, Tim Starling tstarl...@wikimedia.org wrote:

 On 19/02/13 21:11, MZMcBride wrote:
  Hi.
 
  In the context of https://bugzilla.wikimedia.org/show_bug.cgi?id=10621
 ,
  the concept of using wiki pages as databases has come up. We're already
  beginning to see this:
 
  * https://en.wiktionary.org/wiki/Module:languages (over 30,000 lines)
  * https://en.wikipedia.org/wiki/Module:Convertdata (over 7,400 lines)
 
  At large enough sizes, the in-browser syntax highlighting is currently
  problematic.

 We can disable syntax highlighting over some size.

  But it's also becoming clear that the larger underlying
  problem is that using a single request wiki page as a database isn't
  really scalable or sane.

 The performance of #invoke should be OK for modules up to
 $wgMaxArticleSize (2MB). Whether the edit interface is usable at such
 a size is another question.

  (ParserFunction #switch's performance used to prohibit most ideas of
 using
  a wiki page as a database, as I understand it.)

 Both Lua and #switch have O(N) time order in this use case, but the
 constant you multiply by N is hundreds of times smaller for Lua.

  Has any thought been given to what to do about this? Will it require
  manually paginating the data over collections of wiki pages? Will this be
  something to use Wikidata for?

 Ultimately, I would like it to be addressed in Wikidata. In the
 meantime, multi-megabyte datasets will have to be split up, for
 $wgMaxArticleSize if nothing else.

 -- Tim Starling


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using wiki pages as databases

2013-02-19 Thread Denny Vrandečić
2013/2/19 Tim Starling tstarl...@wikimedia.org

 On 19/02/13 21:11, MZMcBride wrote:
  Has any thought been given to what to do about this? Will it require
  manually paginating the data over collections of wiki pages? Will this be
  something to use Wikidata for?

 Ultimately, I would like it to be addressed in Wikidata. In the
 meantime, multi-megabyte datasets will have to be split up, for
 $wgMaxArticleSize if nothing else.



I expect that, in time, Wikidata will be able to serve some of those use
cases, e.g. the one covered by the languages module on Wiktionary. I am
quite excited about the possibilities that access to Wikidata together with
Lua will enable within a year or so... :)

Not all use cases should or will be handled by Wikidata, obviously, but some
of those huge switches definitely can be stored in Wikidata items.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using wiki pages as databases

2013-02-19 Thread Brad Jorsch
In the long term, Wikidata is probably the way to go on something like this.

In the short term, as far as dividing things up, note that you can
implement on-demand loading in Lua easily enough using the __index
metamethod.

  local obj = {}

  setmetatable( obj, {
      __index = function ( t, k )
          -- This will get called on access of obj[k] if it is not already set.
          -- Do whatever you might need, e.g. require() a submodule,
          -- assign things to t for future lookups, then return the requested k.
      end
  } )

  return obj

Also note that you can save space, at the expense of some code complexity,
by accessing obj.us_name or obj.name rather than storing the same string in
both fields; remember that in Lua only nil (unset) and boolean false are
considered false, while the number 0 and the empty string are both
considered true.
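
As a usage sketch (the Module:Convertdata/extra submodule and the
us_name/name fields below are invented names, not anything that actually
exists), the __index body might require a data submodule on first access
and cache the entry, and the "or" trick then falls back from a missing
region-specific field to the generic one:

  -- Hypothetical filling-in of the __index body above; illustrative only.
  local obj = {}

  setmetatable( obj, {
      __index = function ( t, k )
          -- First access of an unknown key: pull in the data submodule and
          -- cache the entry on t so later lookups bypass this metamethod.
          local extra = require( 'Module:Convertdata/extra' )
          t[k] = extra[k]
          return t[k]
      end
  } )

  -- The fallback described above: if no US-specific name was stored,
  -- unit.us_name is nil (falsy), so the generic name is used instead.
  local function displayName( unit )
      return unit.us_name or unit.name
  end

  return { data = obj, displayName = displayName }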
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Next Bugday: Feb. 19 17:00-23:00UTC

2013-02-19 Thread Valerie Juarez
We'll be starting the bugday in about 5 minutes!
Stop by #wikimedia-dev and help us clean up some old reports! Etherpad:
http://etherpad.wmflabs.org/pad/p/BugTriage-2013-02

See you there!

On Thu, Feb 14, 2013 at 12:48 PM, Valerie Juarez valerie.m.jua...@gmail.com
 wrote:

 Hello!

 Please join us on the next Wikimedia Bugday:

 Tuesday, February 19th, 17:00-23:00 UTC [1]
in #wikimedia-dev on Freenode IRC [2]

 Because of the recent upgrade, we will be looking at open bugs in
 Git/Gerrit [3]. Our focus will be on identifying bugs that are upstream
 issues, so we can close the bugs that have been fixed and comment on those
 that need status updates. Release notes for the issues fixed in the
 upgrade: [4] [5] [6].

 Everyone is welcome to join, and no technical knowledge needed! It's a
 nice and easy way to get involved in the community or to give something
 back.

 This information and more can be found here:
 https://www.mediawiki.org/wiki/Bug_management/Triage/20130219

 For more information on Triaging in general, check out
 https://www.mediawiki.org/wiki/Bug_management/Triage

 I look forward to seeing you there!
 Valerie


 [1] Timezone converter:
 http://www.timeanddate.com/worldclock/converter.html
 [2] See http://meta.wikimedia.org/wiki/IRC for more info on IRC chat
 [3]
 https://bugzilla.wikimedia.org/buglist.cgi?list_id=179823&resolution=---&query_format=advanced&component=Git%2FGerrit&product=Wikimedia
 (about 70 bugs)
 [4]
 http://gerrit-documentation.googlecode.com/svn/ReleaseNotes/ReleaseNotes-2.5.html
 [5]
 http://gerrit-documentation.googlecode.com/svn/ReleaseNotes/ReleaseNotes-2.5.1.html
 [6]
 http://gerrit-documentation.googlecode.com/svn/ReleaseNotes/ReleaseNotes-2.5.2.html

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] wfMsg and wfMsgForContent are deprecated. What to use?

2013-02-19 Thread Yury Katkov
Hi!

wfMsg and wfMsgForContent have been deprecated since 1.18, but the
comment doesn't say which functions are recommended instead. Does
anyone know?
-
Yury Katkov, WikiVote

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] wfMsg and wfMsgForContent are deprecated. What to use?

2013-02-19 Thread Krenair
See 
https://www.mediawiki.org/wiki/Manual:Messages_API#Deprecated_wfMsg.2A_functions


Alex Monk

On 19/02/13 17:07, Yury Katkov wrote:

Hi!

wfMsg and wfMsgForContent are deprecated since 1.18 but the comment
doesn't say what functions are recommended to use instead. Does anyone
knows?
-
Yury Katkov, WikiVote


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] wfMsg and wfMsgForContent are deprecated. What to use?

2013-02-19 Thread Jeroen De Dauw
Hey,

Do try to use the local context when available, though. wfMessage uses
global state. If you have a context object available, you can use its msg
method. It would be nice if there were a clean and properly segregated
interface for message-handling objects, though there is none AFAIK.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] wfMsg and wfMsgForContent are deprecated. What to use?

2013-02-19 Thread Yury Katkov
Thanks, that helps!

Maybe it would be good to add the replacement function to the comments
in the code? The documentation for doxygen [1] says that @deprecated
can be used for describing alternatives. I think it's important for
extension developers to be able to quickly see the non-deprecated way
to do things.
[1] http://www.stack.nl/~dimitri/doxygen/manual/commands.html#cmddeprecated
-
Yury Katkov, WikiVote



On Tue, Feb 19, 2013 at 9:10 PM, Krenair kren...@gmail.com wrote:
 See
 https://www.mediawiki.org/wiki/Manual:Messages_API#Deprecated_wfMsg.2A_functions

 Alex Monk


 On 19/02/13 17:07, Yury Katkov wrote:

 Hi!

 wfMsg and wfMsgForContent are deprecated since 1.18 but the comment
 doesn't say what functions are recommended to use instead. Does anyone
 knows?
 -
 Yury Katkov, WikiVote

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] MediaWiki BoF at the Southern California Linux Expo this weekend

2013-02-19 Thread Mark Holmquist
Hello all,

I just wanted to let you know that SCaLE[0] has announced their Birds of a
Feather schedule[1], and there's going to be a MediaWiki-related one! I
intended it mostly to be about extensions, gadgets, and modules, but if you
want to come and talk about core development you're very welcome too. The
conference is being held at the LAX Hilton in Los Angeles, CA, and attendance
is relatively affordable, even more so for students, who get half off the
base price. I hope you can make it!

The BoF session is on Saturday at 19:00, in the Century CD room.

(and if you're interested, I'm also hosting an Etherpad Lite BoF at 21:00 in 
the Los Angeles B room)

[0] https://www.socallinuxexpo.org/scale11x
[1] 
https://docs.google.com/spreadsheet/pub?key=0AkLumNSkddf_dHRMVnhjZmxJTWdFT0NPckl4RzRjNlE

-- 
Mark Holmquist
Software Engineer
Wikimedia Foundation
mtrac...@member.fsf.org
https://wikimediafoundation.org/wiki/User:MHolmquist


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Lua rollout to en.wikipedia.org and a few others

2013-02-19 Thread Gabriel Wicke
On 02/18/2013 04:29 AM, Tim Starling wrote:
 As for the Wikidata
 application -- the interface would be awkward compared to something
 made specifically for interfacing Wikidata with Lua.

I am still not convinced that the interface would be awkward. A general
method like

dataTable = mw.data.wikidata({ param1=foo, param2=bar })

looks pretty simple to me. Maybe the wikidata-specific interface will be
more convenient to use, but I doubt that the difference will be significant.

Gabriel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] wfMsg and wfMsgForContent are deprecated. What to use?

2013-02-19 Thread Antoine Musso
On 19/02/13 18:44, Yury Katkov wrote:
 
 Maybe it would be good to add the replacement function to the comments
 in the code? The documentation for doxygen [1] says that @deprecated
 can be use for describing alternatives. I think it's important for
 extension developers to quickly see what's not deprecated way to do
 things.

Even better, we could use a new deprecation function that accepts an
optional helpful message listing the replacement for the call.  Of
course, we will then have to mark wfDeprecated() as deprecated :-D


-- 
Antoine hashar Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Grants available for anti-surveillance/anti-censorship design other tech work

2013-02-19 Thread Sumana Harihareswara
http://openitp.org/?q=openitp_first_round_of_2013_project_funding_now_open_for_proposals

OpenITP's first round of 2013 project funding is now open for proposals!

Deadline: 31 March 2013

OpenITP project grants are meant to support specific technical efforts
to improve users' ability to circumvent censorship and surveillance on
the Internet. Technical doesn't have to mean software or hardware --
for example, we also consider efforts to improve user experience through
translation, testing, projects to improve documentation, meetings that
get developers together in person to solve specific problems, etc. The
main thing we're looking for is that your proposed project is finite
(e.g. has a deadline, is scoped) and contributes to OpenITP's core
mission of enabling freedom of communication on the Internet.

We're interested in all good proposals, but note that we're especially
receptive to proposals that improve user experience (UX) and to proposals
involving translation (of both software and documentation). Don't take that
as a filter, though: if you have a good proposal that's not about UX or
translation, we still want to receive it.

While our grants don't have a hard limit, they tend to be in the
$5k-$30k USD range: enough to fund a specific piece of work, or to
provide seed funding for a new idea, but not enough to be a primary
long-term funding source. Therefore we try not to burden applicants with
a lot of bureaucratic overhead and paperwork to apply for a grant. It's
enough to send us a brief description of what you have in mind, and
point to public URLs for further details. Since we only fund open source
work, we expect that most proposals we receive will already have been
discussed in publicly-archived forums anyway, and perhaps written up on
a public web page -- though there may be exceptions, such as projects
that are becoming open source but aren't all the way there yet. In any
case, we're comfortable clicking on links and reading stuff on the Web.
You're not required to package everything up in one PDF to make a
proposal. Just tell us what you want to do, make it easy for us to find
what we need to find, and we'll take it from there. We'll ask you
questions as we have them.

The page also includes examples of things OpenITP funded in their last
round.  Please take a look!  It would be *amazing* if someone could use
this opportunity to help people read and contribute to Wikimedia safely.
-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Welcome Greg Grossmeier, Release Manager

2013-02-19 Thread Rob Lanphier
Hi everyone,

I'm excited to welcome Greg Grossmeier as our new Release Manager at
the Wikimedia Foundation.  Greg comes to us from Creative Commons,
where he served as Education Technology and Policy Coordinator, as
well as serving as an interim leader for their engineering group.
Prior to Creative Commons, Greg worked at the University of Michigan
Library on copyright issues.

In his spare time, Greg founded the Ubuntu LoCo team for Michigan
(LoCo==Local Community).

Greg lives here in San Francisco (down in Bernal Heights) with his
partner Carrie and 14 month old son Rowan.

Greg will be managing the deployment process for the Wikimedia
websites, focusing at first on improving release notes and outbound
communication, freeing up folks like Sam to focus on the engineering
aspects of the role.  He'll help our Bug Wrangler (Andre) figure out
how to deal with high-priority deployment-related issues; Andre will
continue to broadly manage the flow of all bugs, while Greg will
narrowly focus on very high-priority issues through fix deployment.
He'll also take over coordination of our deployment calendar[1], and
will likely be a little nosier than many of us have had the time to
be. Over time, Greg will look more holistically at our deployment
practice, and potentially lead a change over to a more continuous
deployment model.

Greg's email is g...@wikimedia.org and he is on Freenode as greg-g.

Please join me in welcoming Greg in his new role!

Rob

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Welcome Greg Grossmeier, Release Manager

2013-02-19 Thread maiki
On 02/19/2013 01:09 PM, Rob Lanphier wrote:

 Please join me in welcoming Greg in his new role!
 
 Rob


I've heard of this guy... he seems legit. ^_^

Ahem.

Yay for Greg! I am looking forward to better release notes!

maiki

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Welcome Greg Grossmeier, Release Manager

2013-02-19 Thread James Forrester
Welcome, Greg! Great to have you here.

J.


On 19 February 2013 13:09, Rob Lanphier ro...@wikimedia.org wrote:

 Hi everyone,

 I'm excited to welcome Greg Grossmeier as our new Release Manager at
 the Wikimedia Foundation.  Greg comes to us from Creative Commons,
 where he served as Education Technology and Policy Coordinator, as
 well as serving as an interim leader for their engineering group.
 Prior to Creative Commons, Greg worked at the University of Michigan
 Library on copyright issues.

 In his spare time, Greg founded the Ubuntu LoCo team for Michigan
 (LoCo==Local Community).

 Greg lives here in San Francisco (down in Bernal Heights) with his
 partner Carrie and 14 month old son Rowan.

 Greg will be managing the deployment process for the Wikimedia
 websites, focusing at first on improving release notes and outbound
 communication, freeing up folks like Sam to focus the engineering
 aspects of the role.  He'll help our Bug Wrangler (Andre) figure out
 how to deal with high priority deployment-related issues; Andre will
 continue to broadly manage the flow of all bugs, while Greg will
 narrowly focus on very high priority issues through fix deployment.
 He'll also take over coordination of our deployment calendar[1], and
 will likely be a little nosier than many of us have had the time to
 do. Over time, Greg will look more holistically at our deployment
 practice, and potentially lead a change over to a more continuous
 deployment model.

 Greg's email is g...@wikimedia.org and is on Freenode as greg-g.

 Please join me in welcoming Greg in his new role!

 Rob

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
James D. Forrester
Product Manager, VisualEditor
Wikimedia Foundation, Inc.

jforres...@wikimedia.org | @jdforrester
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Welcome Greg Grossmeier, Release Manager

2013-02-19 Thread Marc A. Pelletier

On 02/19/2013 04:09 PM, Rob Lanphier wrote:

I'm excited to welcome Greg Grossmeier as our new Release Manager at
the Wikimedia Foundation.


Welcome!

-- Marc


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Welcome Greg Grossmeier, Release Manager

2013-02-19 Thread Arthur Richards
welcome Greg!


On Tue, Feb 19, 2013 at 2:24 PM, Marc A. Pelletier m...@uberbox.org wrote:

 On 02/19/2013 04:09 PM, Rob Lanphier wrote:

 I'm excited to welcome Greg Grossmeier as our new Release Manager at
 the Wikimedia Foundation.


 Welcome!

 -- Marc



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Congrats Yurik and Tyler Romeo, new core maintainers

2013-02-19 Thread Sumana Harihareswara
Just wanted to note that volunteers Yuri Astrakhan (yurik) and Tyler
Romeo (Parent5446) now have +2 rights in MediaWiki core and all
MediaWiki extensions.  They can thus give binding reviews to your
changesets.

You might remember Yuri from API work years ago and from his web API
proposals and work recently.  Parent5446 has been submitting changesets
since July 2012 and was reporting bugs as long ago as 2008, and cares a
lot about authentication and security.

Thank you both for your work past, present, and future!

Currently requesting consensus for maintainership in core: hoo (Hoo
man).
https://www.mediawiki.org/wiki/Git/Gerrit_project_ownership#Hoo_.28Marius_Hoch.29_for_core

Several other folks have gotten +2 on specific extensions recently, as
you can see in
https://www.mediawiki.org/wiki/Git/Gerrit_project_ownership/Archive .
Right now I am emailing wikitech-l when we have a new core maintainer
but not for the more frequent new extension maintainers; hope that makes
sense to list inhabitants. :-)

-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Welcome Greg Grossmeier, Release Manager

2013-02-19 Thread Brian Wolff
Welcome greg!

What does this mean (if anything) for tarball releases?

-bawolff
On 2013-02-19 5:30 PM, Arthur Richards aricha...@wikimedia.org wrote:

 welcome Greg!


 On Tue, Feb 19, 2013 at 2:24 PM, Marc A. Pelletier m...@uberbox.org
 wrote:

  On 02/19/2013 04:09 PM, Rob Lanphier wrote:
 
  I'm excited to welcome Greg Grossmeier as our new Release Manager at
  the Wikimedia Foundation.
 
 
  Welcome!
 
  -- Marc
 
 
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 



 --
 Arthur Richards
 Software Engineer, Mobile
 [[User:Awjrichards]]
 IRC: awjr
 +1-415-839-6885 x6687
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Who is responsible for accepting backported patch sets for maintained versions?

2013-02-19 Thread Sumana Harihareswara
On 02/10/2013 06:12 AM, Siebrand Mazeland (WMF) wrote:
 Today I submitted a few patch sets of master to be backported to 1.19 and
 1.20[1,2]. I asked Niklas to review and merge them. He then replied that he
 thought the Release manager should merge them at a convenient time.
 
 Our MediaWiki.org page Version lifecycle[3] mentions the role release
 manager twice. There however seems to no longer be anyone who formally has
 this role.
 
 In my opinion, there could be two people that have it, based on their
 recent actions:
 * Mark Hershberger, because he made 1.20 happen.
 * Chris Steipp, because he backports security fixes and then releases
 updated point releases that also contain the relevant security fixes he
 made and approved for master/Wikimedia.
 
 My immediate question is: Who can and will review and approve the 8 patch
 sets I submitted for backporting?
 
 My longer term question is: Who is MediaWiki's release manager, and what
 can we expect of the person who has that role?
 
 [1]
 https://gerrit.wikimedia.org/r/#/q/status:open+project:mediawiki/core+branch:REL1_19,n,z
 [2]
 https://gerrit.wikimedia.org/r/#/q/status:open+project:mediawiki/core+branch:REL1_20,n,z
 [3] https://www.mediawiki.org/wiki/Version_lifecycle
 
 Cheers!
 

I think the answer is now that Greg Grossmeier fills the role of
MediaWiki's release manager so he will have to answer this. :-)

-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Code Review Dashboards of other users in gerrit

2013-02-19 Thread Krinkle
On Feb 19, 2013, at 12:13 AM, Krenair kren...@gmail.com wrote:
 
 On 18/02/13 23:08, hoo wrote:
 Hello,
 
 after the last gerrit update I'm no longer able to visit the Code Review
 Dashboards of other gerrit users in case I don't know their user ids. If
 I do it's fine (eg. https://gerrit.wikimedia.org/r/#/dashboard/50 is
 mine).
 Is there a way to get to these dashboards or at least get to know the
 user id of an user? Those dashboards gave a rather good overview of what
 a user is currently doing and I want them back...
 
 Cheers,
 
 Marius Hoch (hoo)
 
 
 Inspect your browser's calls to 
 https://gerrit.wikimedia.org/r/gerrit_ui/rpc/ChangeDetailService. It returns 
 loads of info in JSON format, including the IDs of the users on the page.
 
 Alex Monk
 


I think he means whether there is a usable way (from the GUI) to get to a
user's dashboard.

Previously this was achieved by simply clicking the username on any page where 
it is mentioned.

-- Krinkle


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Welcome Greg Grossmeier, Release Manager

2013-02-19 Thread Andre Klapper
On Tue, 2013-02-19 at 13:09 -0800, Rob Lanphier wrote:
 I'm excited to welcome Greg Grossmeier as our new Release Manager at
 the Wikimedia Foundation.

Welcome Greg (and see you next week)!

andre
-- 
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mediawiki's access points and mw-config

2013-02-19 Thread Platonides
On 18/02/13 18:53, Waldir Pimenta wrote:
 It somewhat breaks the pattern, considering that all the other access
 points (and their corresponding php5 files) are located in the root. So
 that leaves only overrides.php, and I'm not sure why it was kept in
 mw-config, considering that (quoting Platonides) the installer used to be
 in the config folder, until the rewrite, which *moved the classes* to
 includes/installer (emphasis mine). If the classes were moved to
 includes/installer, why did those of overrides.php remain?

Read the beginning of overrides.php:
 <?php
 /**
  * MediaWiki installer overrides.
  * Modify this file if you are a packager who needs to modify the behavior of the MediaWiki installer.
  * Altering it is preferred over changing anything in /includes.


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using wiki pages as databases

2013-02-19 Thread Platonides
On 19/02/13 13:56, Tyler Romeo wrote:
 So unfortunately I don't have a clear idea of what the problem is,
 primarily because I don't know anything about the Parser and its inner
 workings, but as far as having all the data in one page, here's something.
 Maybe this is a bad idea, but how about having a PHP-array content type. In
 other words, MyNamespace:MyPage would render the entire data structure, but
 MyNamespace:MyPage/index/test/0 would take $arr['index']['test'][0]. In the
 database, it would be stored as individual sub-pages, and leaf sub-pages
 would render exactly like a normal page would, but non-leaf pages would
 build the array from all child sub-pages and display it to the user. Would
 this solve the problem? Because if so, I've put some thought into it and
 would be willing to maybe draft an extension giving such a capability.

You can already use subpages to store data. Access is then O(1). The
problem is that you then have one page per entry.


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Welcome Greg Grossmeier, Release Manager

2013-02-19 Thread Platonides
Welcome Greg!



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Who is responsible for accepting backported patch sets for maintained versions?

2013-02-19 Thread Mark A. Hershberger
On Tue 19 Feb 2013 04:39:25 PM EST, Sumana Harihareswara wrote:
 My longer term question is: Who is MediaWiki's release manager, and what
 can we expect of the person who has that role?

 I think the answer is now that Greg Grossmeier fills the role of
 MediaWiki's release manager so he will have to answer this. :-)

This subject has come up a couple of times in the past week so I look 
forward to working with Greg to implement some policy around MediaWiki 
releases -- especially the point releases for 1.19, the LTS release.  
There is a lot to discuss and I look forward to those conversations.

Mark.

--
http://hexmode.com/

There is no path to peace. Peace is the path.
   -- Mahatma Gandhi, Non-Violence in Peace and War


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Welcome Greg Grossmeier, Release Manager

2013-02-19 Thread Sumana Harihareswara
On 02/19/2013 04:09 PM, Rob Lanphier wrote:
 Greg will be managing the deployment process for the Wikimedia
 websites, focusing at first on improving release notes and outbound
 communication, freeing up folks like Sam to focus the engineering
 aspects of the role.  He'll help our Bug Wrangler (Andre) figure out
 how to deal with high priority deployment-related issues; Andre will
 continue to broadly manage the flow of all bugs, while Greg will
 narrowly focus on very high priority issues through fix deployment.
 He'll also take over coordination of our deployment calendar[1], and
 will likely be a little nosier than many of us have had the time to
 do. Over time, Greg will look more holistically at our deployment
 practice, and potentially lead a change over to a more continuous
 deployment model.

This is great, and I look forward to faster and higher-quality
deployments!  Welcome, Greg.

-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using wiki pages as databases

2013-02-19 Thread Tim Starling
I wrote:
 The performance of #invoke should be OK for modules up to
 $wgMaxArticleSize (2MB). Whether the edit interface is usable at such
 a size is another question.

The Wiktionary folk were gnashing their teeth today when they
discovered that loading a 742KB module 1200 times in a single page
does in fact take a long time, and trips the CPU limit after about 450
invocations. So, sorry for raising expectations about that.

-- Tim Starling



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Fwd: [Wmfall] Luis Villa joins WMF as Deputy General Counsel

2013-02-19 Thread Sumana Harihareswara
Luis is also a developer so I wanted you to hear about this. :-)

-Sumana


 Original Message 
Subject: [Wmfall] Luis Villa joins WMF as Deputy General Counsel
Date: Tue, 19 Feb 2013 15:02:07 -0800
From: Geoff Brigham gbrig...@wikimedia.org
To: Staff All wmf...@lists.wikimedia.org

Hi everyone,

I’m simply thrilled to welcome Luis Villa to the Foundation as our new
Deputy General Counsel.

Thanks to Kat Walsh, I met Luis during my first months at the Foundation.
Kat loves Luis, and it is no wonder why. In addition to being a superb
lawyer, Luis is an open source developer, has worked with leaders in our
Internet legal circles, and has a great personality that embraces our
culture.

His most recent adventure took place at the Palo Alto office of Greenberg
Traurig, one of the top global law firms. There he worked with well-known
Internet lawyers like Ian Ballon and Heather Meeker. Luis focused on
technology transactions, helping clients create solutions to licensing
problems, with a particular emphasis on open source and software standards.
His clients included Mozilla, the Open Compute Project, and a variety of
clients large and small. Luis successfully defended Google in the
Oracle-Google/Android lawsuit, primarily working on the question of API
copyrightability. I hired Luis as outside counsel to work on a tough legal
matter for us, and his answers were on point, clear, and practical.

Luis’ first contact with free software came when he was in college at
Duke University. There he studied political science and computer science,
began using Linux, and helped triage Mozilla's bugzilla. A professor paid
him to play with Lego, resulting in brief maintainership of the GPL’d LegOS
operating system and co-authorship of the book Extreme Mindstorms.

After graduation, Luis worked at Ximian, a Linux desktop startup, doing
quality assurance and eventually managing the desktop team. As part of
that, he got heavily involved in the GNOME desktop project, becoming
bugmaster and then getting elected to the board of directors. After Ximian
was acquired, Luis became "geek in residence" at Harvard Law School's
Berkman Center. At Berkman, he translated from lawyer to geek, and managed,
maintained, and developed several software projects.

After Berkman, Luis started his legal ventures in life at Columbia Law
School, where he was Editor in Chief of the Science and Technology Law
Review, was awarded honors each year, and was co-recipient of the class
prize for excellence in intellectual property scholarship. His thesis dealt
with the use of software standards as part of antitrust enforcement.
Outside of class, he participated in the GPL revision process, worked in
the General Counsel's office at Red Hat, and developed a surprisingly
strong attachment to New York City.

After law school, Luis worked in the legal department at Mozilla, where his
major project was revising the Mozilla Public License. The license got over
a thousand words shorter, and gained stronger patent protections and
compatibility with the Apache and GPL licenses. Luis also worked on
privacy, contracts, standards bodies, and other issues.

Outside of work, Luis is an invited expert to the World Wide Web
Consortium's Patents and Standards Interest Group, and a board member and
chair of the Licensing Committee at the Open Source Initiative. He also
enjoys biking, photography, history, Duke basketball (men's and women's),
and eating.

Luis's first Wikipedia edit under his current user name dates to Feb. 2007.
Like any good pedant, he has also been making minor spelling and grammar
corrections anonymously for many years.

So, as you can tell, we are extremely excited about having Luis on our team
and wish him a warm welcome.

Cheers,

Geoff

-- 
Geoff Brigham
General Counsel
Wikimedia Foundation


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: [Wmfall] Luis Villa joins WMF as Deputy General Counsel

2013-02-19 Thread Quim Gil

On 02/19/2013 03:36 PM, Sumana Harihareswara wrote:

*I’m simply thrilled to welcome Luis Villa to the Foundation as our new
Deputy General Counsel.*



\o/  !

--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: [Wmfall] Luis Villa joins WMF as Deputy General Counsel

2013-02-19 Thread Luis Villa
Hah... I think calling me a developer is, at this point, a bit of a
stretch, but I look forward to working with the tech team :)


On Tue, Feb 19, 2013 at 3:36 PM, Sumana Harihareswara suma...@wikimedia.org
 wrote:

 Luis is also a developer so I wanted you to hear about this. :-)

 -Sumana


  Original Message 
 Subject: [Wmfall] Luis Villa joins WMF as Deputy General Counsel
 Date: Tue, 19 Feb 2013 15:02:07 -0800
 From: Geoff Brigham gbrig...@wikimedia.org
 To: Staff All wmf...@lists.wikimedia.org

 *Hi everyone, *
 * *
 *I’m simply thrilled to welcome Luis Villa to the Foundation as our new
 Deputy General Counsel.*
 *
 Thanks to Kat Walsh, I met Luis during my first months at the Foundation.
  Kat loves Luis, and it is no wonder why.  In addition to being a superb
 lawyer, Luis is an open source developer, has worked with leaders in our
 Internet legal circles, and has a great personality that embraces our
 culture.*
 *
 His most recent adventure took place at the Palo Alto office of Greenberg
 Traurig, one of the top  global law firms.  There he worked with well-known
 Internet lawyers like Ian Ballon and Heather Meeker.   Luis focused on
 technology transactions, helping clients create solutions to licensing
 problems, with a particular emphasis on open source and software standards.
 His clients included Mozilla, the Open Compute Project, and a variety of
 clients large and small.  Luis successfully defended Google in the
 Oracle-Google/Android lawsuit, primarily working on the question of API
 copyrightability. I hired Luis as outside counsel to work on a tough legal
 matter for us, and his answers were on point, clear, and practical. *
 *
 Luis’ first contact with free software came was when he was in college at
 Duke University. There he studied political science and computer science,
 began using Linux, and helped triage Mozilla's bugzilla. A professor paid
 him to play with Lego, resulting in brief maintainership of the GPL’d LegOS
 operating system and co-authorship of the book Extreme Mindstorms. *
 *
 After graduation, Luis worked at Ximian, a Linux desktop startup, doing
 quality assurance and eventually managing the desktop team. As part of
 that, he got heavily involved in the GNOME desktop project, becoming
 bugmaster and then getting elected to the board of directors. After Ximian
 was acquired, Luis became geek in residence at Harvard Law School's
 Berkman Center. At Berkman, he translated from lawyer to geek, and managed,
 maintained, and developed several software projects.*
 *
 After Berkman, Luis started his legal ventures in life at Columbia Law
 School, where he was Editor in Chief of the Science and Technology Law
 Review, was awarded honors each year, and was co-recipient of the class
 prize for excellence in intellectual property scholarship. His thesis dealt
 with the use of software standards as part of antitrust enforcement.
 Outside of class, he participated in the GPL revision process, worked in
 the General Counsel's office at Red Hat, and developed a surprisingly
 strong attachment to New York City.*
 *
 After law school, Luis worked in the legal department at Mozilla, where his
 major project was revising the Mozilla Public License. The license got over
 a thousand words shorter, and gained stronger patent protections and
 compatibility with the Apache and GPL licenses. Luis also worked on
 privacy, contracts, standards bodies, and other issues.*
 *
 Outside of work, Luis is an invited expert to the World Wide Web
 Consortium's Patents and Standards Interest Group, and a board member and
 chair of the Licensing Committee at the Open Source Initiative. He also
 enjoys biking, photography, history, Duke basketball (men's and women's),
 and eating.*
 *
 Luis's first Wikipedia edit under his current user name dates to Feb. 2007.
 Like any good pedant, he has also been making minor spelling and grammar
 corrections anonymously for many years.*
 *
 So, as you can tell, we are extremely excited about having Luis on our team
 and wish him a warm welcome. *
 * *
 *Cheers, *
 * *
 *Geoff*

 --
 Geoff Brigham
 General Counsel
 Wikimedia Foundation


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: [Wmfall] Luis Villa joins WMF as Deputy General Counsel

2013-02-19 Thread Brian Wolff
Wow. That's quite an impressive description!  :-)

-bawolff
On 2013-02-19 7:36 PM, Sumana Harihareswara suma...@wikimedia.org wrote:

 Luis is also a developer so I wanted you to hear about this. :-)

 -Sumana


  Original Message 
 Subject: [Wmfall] Luis Villa joins WMF as Deputy General Counsel
 Date: Tue, 19 Feb 2013 15:02:07 -0800
 From: Geoff Brigham gbrig...@wikimedia.org
 To: Staff All wmf...@lists.wikimedia.org

 *Hi everyone, *
 * *
 *I’m simply thrilled to welcome Luis Villa to the Foundation as our new
 Deputy General Counsel.*
 *
 Thanks to Kat Walsh, I met Luis during my first months at the Foundation.
  Kat loves Luis, and it is no wonder why.  In addition to being a superb
 lawyer, Luis is an open source developer, has worked with leaders in our
 Internet legal circles, and has a great personality that embraces our
 culture.*
 *
 His most recent adventure took place at the Palo Alto office of Greenberg
 Traurig, one of the top  global law firms.  There he worked with well-known
 Internet lawyers like Ian Ballon and Heather Meeker.   Luis focused on
 technology transactions, helping clients create solutions to licensing
 problems, with a particular emphasis on open source and software standards.
 His clients included Mozilla, the Open Compute Project, and a variety of
 clients large and small.  Luis successfully defended Google in the
 Oracle-Google/Android lawsuit, primarily working on the question of API
 copyrightability. I hired Luis as outside counsel to work on a tough legal
 matter for us, and his answers were on point, clear, and practical. *
 *
 Luis’ first contact with free software came was when he was in college at
 Duke University. There he studied political science and computer science,
 began using Linux, and helped triage Mozilla's bugzilla. A professor paid
 him to play with Lego, resulting in brief maintainership of the GPL’d LegOS
 operating system and co-authorship of the book Extreme Mindstorms. *
 *
 After graduation, Luis worked at Ximian, a Linux desktop startup, doing
 quality assurance and eventually managing the desktop team. As part of
 that, he got heavily involved in the GNOME desktop project, becoming
 bugmaster and then getting elected to the board of directors. After Ximian
 was acquired, Luis became geek in residence at Harvard Law School's
 Berkman Center. At Berkman, he translated from lawyer to geek, and managed,
 maintained, and developed several software projects.*
 *
 After Berkman, Luis started his legal ventures in life at Columbia Law
 School, where he was Editor in Chief of the Science and Technology Law
 Review, was awarded honors each year, and was co-recipient of the class
 prize for excellence in intellectual property scholarship. His thesis dealt
 with the use of software standards as part of antitrust enforcement.
 Outside of class, he participated in the GPL revision process, worked in
 the General Counsel's office at Red Hat, and developed a surprisingly
 strong attachment to New York City.*
 *
 After law school, Luis worked in the legal department at Mozilla, where his
 major project was revising the Mozilla Public License. The license got over
 a thousand words shorter, and gained stronger patent protections and
 compatibility with the Apache and GPL licenses. Luis also worked on
 privacy, contracts, standards bodies, and other issues.*
 *
 Outside of work, Luis is an invited expert to the World Wide Web
 Consortium's Patents and Standards Interest Group, and a board member and
 chair of the Licensing Committee at the Open Source Initiative. He also
 enjoys biking, photography, history, Duke basketball (men's and women's),
 and eating.*
 *
 Luis's first Wikipedia edit under his current user name dates to Feb. 2007.
 Like any good pedant, he has also been making minor spelling and grammar
 corrections anonymously for many years.*
 *
 So, as you can tell, we are extremely excited about having Luis on our team
 and wish him a warm welcome. *
 * *
 *Cheers, *
 * *
 *Geoff*

 --
 Geoff Brigham
 General Counsel
 Wikimedia Foundation


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using wiki pages as databases

2013-02-19 Thread Tyler Romeo

 You can already use subpages to store data. Access is then O(1) The
 problem is that then you have one page per entry.

I know. What I'm suggesting is an interface where the sub-pages aggregate
up the hierarchy, meaning you can still edit the main top-level page, and
the backend will simply update the sub-pages as appropriate.

*--*
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com


On Tue, Feb 19, 2013 at 5:52 PM, Platonides platoni...@gmail.com wrote:

 On 19/02/13 13:56, Tyler Romeo wrote:
  So unfortunately I don't have a clear idea of what the problem is,
  primarily because I don't know anything about the Parser and its inner
  workings, but as far as having all the data in one page, here's
 something.
  Maybe this is a bad idea, but how about having a PHP-array content type.
 In
  other words, MyNamespace:MyPage would render the entire data structure,
 but
  MyNamespace:MyPage/index/test/0 would take $arr['index']['test'][0]. In
 the
  database, it would be stored as individual sub-pages, and leaf sub-pages
  would render exactly like a normal page would, but non-leaf pages would
  build the array from all child sub-pages and display it to the user.
 Would
  this solve the problem? Because if so, I've put some thought into it and
  would be willing to maybe draft an extension giving such a capability.

 You can already use subpages to store data. Access is then O(1) The
 problem is that then you have one page per entry.


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Welcome Greg Grossmeier, Release Manager

2013-02-19 Thread Alolita Sharma
Welcome Greg! Glad to see our release engineering process becoming
stronger with your joining :-)

-Alolita

On Wed, Feb 20, 2013 at 4:44 AM, Sumana Harihareswara
suma...@wikimedia.org wrote:
 On 02/19/2013 04:09 PM, Rob Lanphier wrote:
 Greg will be managing the deployment process for the Wikimedia
 websites, focusing at first on improving release notes and outbound
 communication, freeing up folks like Sam to focus the engineering
 aspects of the role.  He'll help our Bug Wrangler (Andre) figure out
 how to deal with high priority deployment-related issues; Andre will
 continue to broadly manage the flow of all bugs, while Greg will
 narrowly focus on very high priority issues through fix deployment.
 He'll also take over coordination of our deployment calendar[1], and
 will likely be a little nosier than many of us have had the time to
 do. Over time, Greg will look more holistically at our deployment
 practice, and potentially lead a change over to a more continuous
 deployment model.

 This is great, and I look forward to faster and higher-quality
 deployments!  Welcome, Greg.

 --
 Sumana Harihareswara
 Engineering Community Manager
 Wikimedia Foundation

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using wiki pages as databases

2013-02-19 Thread Victor Vasiliev
On 02/19/2013 06:21 PM, Tim Starling wrote:
 The Wiktionary folk are gnashing their teeth today when they
 discovered that in fact, loading a 742KB module 1200 times in a single
 page does in fact take a long time, and it trips the CPU limit after
 about 450 invocations . So, sorry for raising expectations about that.
 
 -- Tim Starling
 

Aren't modules which are already loaded cached, so if they load it 1200
times on a single page, how does it manage to affect CPU time that badly?

-- Victor.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using wiki pages as databases

2013-02-19 Thread Tim Starling
On 20/02/13 15:07, Victor Vasiliev wrote:
 On 02/19/2013 06:21 PM, Tim Starling wrote:
 The Wiktionary folk are gnashing their teeth today when they
 discovered that in fact, loading a 742KB module 1200 times in a single
 page does in fact take a long time, and it trips the CPU limit after
 about 450 invocations . So, sorry for raising expectations about that.

 -- Tim Starling

 
 Aren't modules which are already loaded cached, so if they load it 1200
 times on a single page, how does it manage to affect CPU time that badly?

Execution of the module chunk seems to be the main reason. I
benchmarked it locally at 10.6ms, so 450 of those would be 4.8s.

Lua has a lot of O(N) work to do when a large table literal is
executed. I'm experimenting with using large string literals instead:

https://en.wiktionary.org/w/index.php?title=Module:Languages_string_db&action=edit

That module takes about 2us for module chunk execution, when I run it
locally, and around 30us for each lookup in a tight loop on the server
side. But when I use it in a large article, it seems to use about
1.4ms per #invoke, so maybe there's still some overhead that needs to
be tracked down.

The idea of storing a database in a large string literal could be made
to be fairly efficient and user-friendly if a helper module was
written to do parsing and a binary search.
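
Purely as a sketch of that helper idea (the fixed-width, sorted record
layout below is an assumption made for illustration; it is not how
Module:Languages_string_db is actually laid out):

  -- Binary search over a string-literal "database" of fixed-width records,
  -- sorted by key: here, an 8-byte space-padded key plus a 24-byte value.
  local KEY_LEN, REC_LEN = 8, 32

  local function find( db, key )
      -- Assumes keys are at most KEY_LEN bytes; pad to the fixed width.
      key = key .. string.rep( ' ', KEY_LEN - #key )
      local lo, hi = 1, #db / REC_LEN
      while lo <= hi do
          local mid = math.floor( ( lo + hi ) / 2 )
          local offset = ( mid - 1 ) * REC_LEN
          local k = string.sub( db, offset + 1, offset + KEY_LEN )
          if k == key then
              -- Strip the value's trailing padding before returning it.
              local v = string.sub( db, offset + KEY_LEN + 1, offset + REC_LEN )
              return ( string.gsub( v, ' +$', '' ) )
          elseif k < key then
              lo = mid + 1
          else
              hi = mid - 1
          end
      end
      return nil
  end

  return { find = find }

That keeps chunk execution cheap (the data is a single string constant)
while each lookup costs O(log N) substring comparisons.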

-- Tim Starling


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l