Unfortunately I don't have a clear idea of what the problem is, primarily
because I don't know anything about the Parser and its inner workings, but
as far as having all the data in one page goes, here's a thought. Maybe
this is a bad idea, but how about a PHP-array content type? In other
words, MyNamespace:MyPage would render the entire data structure, but
MyNamespace:MyPage/index/test/0 would take $arr['index']['test'][0]. In
the database, the data would be stored as individual sub-pages: leaf
sub-pages would render exactly like a normal page, while non-leaf pages
would build the array from all of their child sub-pages and display it to
the user. Would this solve the problem? If so, I've put some thought into
it and would be willing to draft an extension providing such a capability.
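
To make the idea concrete, here is a rough sketch of how the lookup could
work (the function name and title handling are purely hypothetical, for
illustration only, not an existing MediaWiki API):

// Hypothetical: map a sub-page path onto keys of the stored array,
// e.g. "MyPage/index/test/0" -> $arr['index']['test'][0]
function resolveSubpagePath( array $data, $subpageTitle ) {
	$parts = explode( '/', $subpageTitle );
	array_shift( $parts ); // drop the base page name
	$value = $data;
	foreach ( $parts as $key ) {
		if ( !is_array( $value ) || !array_key_exists( $key, $value ) ) {
			return null; // no such sub-page
		}
		$value = $value[$key];
	}
	// A leaf comes back as a scalar and would render like a normal page;
	// a non-leaf comes back as the array built from its child sub-pages.
	return $value;
}

$arr = array( 'index' => array( 'test' => array( 'zero', 'one' ) ) );
var_dump( resolveSubpagePath( $arr, 'MyPage/index/test/0' ) ); // string(4) "zero"

The point is just that the mapping from sub-page titles to array keys is
mechanical, so storage and rendering could be handled per sub-page.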

*--*
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com


On Tue, Feb 19, 2013 at 7:27 AM, Tim Starling <tstarl...@wikimedia.org> wrote:

> On 19/02/13 21:11, MZMcBride wrote:
> > Hi.
> >
> > In the context of <https://bugzilla.wikimedia.org/show_bug.cgi?id=10621>,
> > the concept of using wiki pages as databases has come up. We're already
> > beginning to see this:
> >
> > * https://en.wiktionary.org/wiki/Module:languages (over 30,000 lines)
> > * https://en.wikipedia.org/wiki/Module:Convertdata (over 7,400 lines)
> >
> > At large enough sizes, the in-browser syntax highlighting is currently
> > problematic.
>
> We can disable syntax highlighting over some size.
>
> > But it's also becoming clear that the larger underlying
> > problem is that using a single request wiki page as a database isn't
> > really scalable or sane.
>
> The performance of #invoke should be OK for modules up to
> $wgMaxArticleSize (2MB). Whether the edit interface is usable at such
> a size is another question.
>
> > (ParserFunction #switch's performance used to prohibit most ideas of using
> > a wiki page as a database, as I understand it.)
>
> Both Lua and #switch have O(N) time order in this use case, but the
> constant you multiply by N is hundreds of times smaller for Lua.
>
> > Has any thought been given to what to do about this? Will it require
> > manually paginating the data over collections of wiki pages? Will this be
> > something to use Wikidata for?
>
> Ultimately, I would like it to be addressed in Wikidata. In the
> meantime, multi-megabyte datasets will have to be split up, for
> $wgMaxArticleSize if nothing else.
>
> -- Tim Starling
>
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
