Hi,
There is a note here:
http://www.mediawiki.org/wiki/Extension:ParserFunctions saying you
should use a different version of ParserFunctions for 1.15.1. I'm not
at all sure what that actually means; but the problem definitely seems
to be within ParserFunctions ...
Robert
On Tue, Feb 16, 2010 at 10:42 PM, Eric Sun e...@cs.stanford.edu wrote:
Even after setting $wgUseTidy = true, many of my pages show the error
"Expression error: Missing operand for" at the bottom of the
References section.
This is a ParserFunctions-related problem. It looks like an error
message produced by the ParserFunctions package. I've never seen this
on en.wikipedia.org so I wonder if some customization
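That "Missing operand" text is what the #expr parser function emits when an operator is left without one of its operands, typically because an empty template parameter was substituted into the expression. A minimal wikitext sketch (the parameter name "num" is hypothetical):

```
{{#expr: {{{num|}}} + 1 }}
```

If {{{num}}} is passed empty (or omitted with an empty default), the parser evaluates {{#expr: + 1}} and renders "Expression error: Missing operand for +."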
On Sun, Feb 14, 2010 at 7:34 PM, Marco Schuster
ma...@harddisk.is-a-geek.org wrote:
What about turning wgUseTidy off for some time?
The doctype that we serve is XHTML, and various AJAX tools rely on
being able to parse the DOM tree as an XML document. But there are
certain valid wikitext
On Mon, Feb 15, 2010 at 2:19 PM, Carl (CBM) cbm.wikipe...@gmail.com wrote:
I hope that, before the doctype is changed to html5, a substantial
grace period is given for people to change to an HTML5 parser in their
javascript code.
We will continue with well-formed XML output for the foreseeable
Thanks Petr.
I installed the appropriate versions of the Parser hook extensions
that look relevant:
CategoryTree (Version r48218)
CharInsert (Version r36357)
Cite (Version r47190)
InputBox (Version r42791)
ParserFunctions (Version 1.1.1)
I'm using MediaWiki 1.15.1 and I imported the dump using xml2sql.
That solved the problem. Thanks!
On Sun, Feb 14, 2010 at 4:00 AM, Robert Ullmann rlullm...@gmail.com wrote:
Hi,
On Sun, Feb 14, 2010 at 11:03 AM, Eric Sun e...@cs.stanford.edu wrote:
I'm using MediaWiki 1.15.1 and I imported the dump using xml2sql.
Most enwiki pages render correctly, but a
On Sun, Feb 14, 2010 at 1:00 PM, Robert Ullmann rlullm...@gmail.com wrote:
Are you using $wgUseTidy? It is an HTML cleanup process that is always
enabled on WMF projects. Since it is there, template creators often
miss closing spans and other things, or leave in extra close tags, and
never
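Enabling Tidy is a one-line setting; a minimal sketch, assuming the wiki's LocalSettings.php sits in the current directory:

```shell
# Turn on MediaWiki's Tidy-based HTML cleanup (always enabled on WMF wikis).
# Assumes LocalSettings.php is in the current directory; adjust the path
# to your installation.
echo '$wgUseTidy = true;' >> LocalSettings.php
```

The `>>` append keeps the rest of the configuration intact; Tidy itself must also be installed for the setting to take effect.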
On Sun, Feb 14, 2010 at 7:34 PM, Marco Schuster
ma...@harddisk.is-a-geek.org wrote:
What about turning wgUseTidy off for some time? Maybe some night hours... so
that our template magicians are forced to clean up the templates and the
other crap buried deep in the wikitext.
They
On Mon, Feb 15, 2010 at 2:30 AM, Aryeh Gregor
simetrical+wikil...@gmail.com wrote:
On Sun, Feb 14, 2010 at 7:34 PM, Marco Schuster
ma...@harddisk.is-a-geek.org wrote:
What about turning wgUseTidy off for some time? Maybe some night hours...
so that our
On Sun, Feb 14, 2010 at 8:40 PM, Marco Schuster
ma...@harddisk.is-a-geek.org wrote:
Why? Why must software take care of the crap that users do? Either we force
them to write proper code, or they never will.
So they never will. So what? They're supposed to be writing an
encyclopedia, they're
REMOVE ME FROM THIS MAILING LIST!!
-----Original Message-----
From: Eric Sun [e...@cs.stanford.edu]
Date: 02/08/2010 12:45 AM
To: Wikimedia developers wikitech-l@lists.wikimedia.org
Subject: Re: [Wikitech-l] importing enwiki into local database
Note
On Mon, Feb 8, 2010 at 10:57 AM, Stefano Ronzoni endya...@excite.com wrote:
REMOVE ME FROM THIS MAILING LIST!!
You can remove yourself:
https://lists.wikimedia.org/mailman/options/wikitech-l/endya...@excite.com
--
Casey Brown
Cbrown1023
I stripped out the <redirect /> tags and imported enwiki using xml2sql,
but none of the templates rendered correctly--for example, navigating
to /The_Matrix results in a page with lots of MediaWiki source like
{{#if: |This {{#ifeq:||article|page}} is about . }}For {{#if:the
series|the series|other
Yes, it was safe in my case (import of Russian and English Wiktionary).
See http://meta.wikimedia.org/wiki/Talk:Xml2sql
and an example of a script or shell command to strip out the <redirect /> tags.
-- Andrew.
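The script itself isn't reproduced in the thread; as a rough sketch, a one-line sed filter can drop the self-closing <redirect /> elements before the dump is fed to xml2sql (the dump filenames here are examples, not from the thread):

```shell
# Remove self-closing <redirect ... /> elements, which xml2sql does not
# understand, before importing. Input/output filenames are examples.
sed 's|<redirect[^>]*/>||g' enwiki-pages-articles.xml > enwiki-stripped.xml
```

This only handles the self-closing form used in the dumps; validate the result (or spot-check a few pages) before running the full import.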
On Fri, Feb 5, 2010 at 6:38 AM, Eric Sun e...@cs.stanford.edu wrote:
Would it be safe to strip
On Thu, Feb 4, 2010 at 9:12 PM, Eric Sun e...@cs.stanford.edu wrote:
Hi,
I saw this thread back in October where someone was having trouble
importing the English Wikipedia XML dump:
http://lists.wikimedia.org/pipermail/wikitech-l/2009-October/045594.html
The thread back in October seemed to
I am still able to import the dumps using the old mwDumper (modified to fix
the contributor); xml2sql also works and is quite fast. importDump.php
continues after it breaks, I think.
bilal
--
Verily, with hardship comes ease.
On Thu, Feb 4, 2010 at 9:24 PM, Chad innocentkil...@gmail.com wrote:
Would it be safe to strip out the <redirect /> tags from the XML and
reimport, or will that cause other problems?
Thanks,
Eric