[Wikitech-l] Introduce phpQuery into MediaWiki?

2011-01-03 Thread Philip Tzou
According to its website, phpQuery is a server-side, chainable, CSS3
selector driven Document Object Model (DOM) API based on jQuery JavaScript
Library.

I feel it will be very convenient if we introduce such jquery-like tools
into MediaWiki since we do have the need to parse HTML text. For example, I
can replace the awful regex part of LanguageConverter::autoConvert with
phpQuery.

So I want to ask is it possible to introduce phpQuery into MediaWiki?

sincerely,

Philip Tzou
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Big problem to solve: good WYSIWYG on WMF wikis

2011-01-03 Thread Andreas Jonsson
2010-12-29 08:33, Andrew Dunbar skrev:
 I've thought a lot about this too. It certainly is not any type of
 standard grammar. But on the other hand it is a pretty common kind of
 nonstandard grammar. I call it a recursive text replacement grammar.

 Perhaps this type of grammar has some useful characteristics we can
 discover and document. It may be possible to follow the code flow and
 document each text replacement in sequence as a kind of parser spec
 rather than trying and failing again to shoehorn it into a standard
 LALR grammar.

 If it is possible to extract such a spec it would then be possible to
 implement it in other languages.

 Some research may even find that is possible to transform such a
 grammar deterministically into an LALR grammar...

 But even if not I'm certain it would demystify what happens in the
 parser so that problems and edge cases would be easier to locate.
From my experience of implementing a wikitext parser, I would say that
it might be possible to transform wikitext to a token stream that is
possible to parse with a LALR parser.  My implementation
(http://svn.wikimedia.org/svnroot/mediawiki/trunk/parsers/libmwparser)
uses Antlr (which is an LL parser generator) and only relies on
context-sensitive parsing (Antlr's semantic predicates) for parsing
apostrophes (bold and italics), and this might be possible to solve in
a different way.  The rest of the complex cases are handled by the
lexical analyser, which produces a well-behaved token stream that can
be parsed relatively straightforwardly.

My implementation is not 100% compatible, but I think that a 100%
compatible parser is not desirable since the most exotic border cases
would probably be characterized as bugs anyway (e.g. [[Link|table
class=]]).  But I think that the basic idea can be used to produce
a sufficiently compatible parser.
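To make the token-stream idea concrete, here is a minimal illustrative sketch (not taken from libmwparser; the token names and the tiny grammar subset are invented) of a lexer that turns one line of wikitext into a flat, well-behaved token stream that a conventional parser could consume:

```python
import re

# Toy lexer: emits a flat token stream for heading markers, apostrophe
# runs (bold/italic), and plain text. Real wikitext is far messier.
TOKEN_RE = re.compile(r"(={2,6})|('{2,3})|([^=']+)")

def tokenize(line):
    tokens = []
    for eq, apo, text in TOKEN_RE.findall(line):
        if eq:
            tokens.append(("HEADING_MARKER", eq))
        elif apo:
            tokens.append(("BOLD" if len(apo) == 3 else "ITALIC", apo))
        else:
            tokens.append(("TEXT", text))
    return tokens

print(tokenize("== Heading =="))
print(tokenize("'''bold''' text"))
```

The point is only that once the lexer has absorbed the context sensitivity, the stream itself is regular enough for an LALR or LL parser to handle.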

Best Regards,

/Andreas




Re: [Wikitech-l] SpecialPages and Related users and titles

2011-01-03 Thread Ilmari Karonen
On 01/03/2011 07:23 AM, MZMcBride wrote:

 I assume you're referring to these revisions:
 * http://www.mediawiki.org/wiki/Special:Code/MediaWiki/79398
 * http://www.mediawiki.org/wiki/Special:Code/MediaWiki/79399

 It looks like a nice usability fix. :-)  (Now to get Special:MovePage turned
 into ?action=move)

I could see arguments for going the other way too.  For example, writing 
[[Special:History/Page|history]] would be _so_ much more convenient than 
<span class="plainlinks">[{{fullurl:Page|action=history}} history]</span> 
(and very few people outside this list probably even know that one can 
do the latter).

But I do generally agree that the lack of a clear and logical 
distinction between actions and special pages is ugly and sometimes 
confusing (...not to even mention Special:Search...).

-- 
Ilmari Karonen



Re: [Wikitech-l] SpecialPages and Related users and titles

2011-01-03 Thread Daniel Friesen
On 11-01-03 03:59 AM, Ilmari Karonen wrote:
 On 01/03/2011 07:23 AM, MZMcBride wrote:
 I assume you're referring to these revisions:
 * http://www.mediawiki.org/wiki/Special:Code/MediaWiki/79398
 * http://www.mediawiki.org/wiki/Special:Code/MediaWiki/79399

 It looks like a nice usability fix. :-)  (Now to get Special:MovePage turned
 into ?action=move)
 I could see arguments for going the other way too.  For example, writing
 [[Special:History/Page|history]] would be _so_ much more convenient than
 <span class="plainlinks">[{{fullurl:Page|action=history}} history]</span>
 (and very few people outside this list probably even
 know that one can do the latter).

 But I do generally agree that the lack of a clear and logical
 distinction between actions and special pages is ugly and sometimes
 confusing (...not to even mention Special:Search...).
I remember seeing some discussion somewhere about someone putting 
together alias special pages like [[Special:Edit/page]] for some of the 
page actions. I can't remember where that was.

Here's something Nikerabbit pointed out.
He was thinking that Special:Contributions should also include a 
userpage and user talk tab.

Initially I didn't do this because it wasn't a straight related title. A 
user has both a userpage and a talkpage and it doesn't make sense for 
Special:Contributions to include an edit/move/etc... tab for either page 
since it's not linked to a specific title but the user itself.
Hence, to give Special:Contributions those tabs, we would have to turn 
the namespace section into a Userpage / Usertalk / Specialpage triple 
set of tabs on special pages, adding a condition into that section of 
code which till now has been given the pattern of never outputting any 
tab other than specialpage.

-- 
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]




Re: [Wikitech-l] Introduce phpQuery into MediaWiki?

2011-01-03 Thread Brion Vibber
phpQuery itself builds on the DOM module already in PHP, so be aware that
using it for this purpose is equivalent to using the DOM & XPath functions
already available.

For one thing this means that HTML will have to be run through the libxml2
HTML parser (which I have found is very sketchy with perfectly legal implied
close tags and such). In addition to memory and performance concerns of
parsing the whole document into a DOM tree and reserializing it, you might
not get back the structure you put in... hopefully no surprises but keep an
eye out.
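The implied-close-tag hazard is easy to demonstrate. The sketch below uses Python's permissive stdlib parser purely as an analogue of what a lenient HTML tokenizer reports (PHP/libxml2 differs in detail): perfectly legal markup with implied `</li>` tags yields an event stream with no close events, so a naive tree builder would mis-nest the list.

```python
from html.parser import HTMLParser

# Record start/end tag events for a fragment with implied </li> tags.
class TagEvents(HTMLParser):
    def __init__(self):
        super().__init__()
        self.events = []
    def handle_starttag(self, tag, attrs):
        self.events.append(("start", tag))
    def handle_endtag(self, tag):
        self.events.append(("end", tag))

p = TagEvents()
p.feed("<ul><li>one<li>two</ul>")
print(p.events)
# No ("end", "li") events appear: the parser does not infer the implied
# close tags, so reserializing a naively built tree can change structure.
```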

-- brion

On Jan 3, 2011 1:49 AM, Philip Tzou philip@gmail.com wrote:

 According to its website, phpQuery is a server-side, chainable, CSS3
 selector driven Document Object Model (DOM) API based on jQuery JavaScript
 Library.

 I feel it will be very convenient if we introduce such jquery-like tools
 into MediaWiki since we do have the need to parse HTML text. For example,
I
 can replace the awful regex part of LanguageConverter::autoConvert with
 phpQuery.

 So I want to ask is it possible to introduce phpQuery into MediaWiki?

 sincerely,

 Philip Tzou


Re: [Wikitech-l] Missing Section Headings

2011-01-03 Thread Platonides
Aryeh Gregor wrote:
 On Fri, Dec 31, 2010 at 6:25 PM, Krinkle krinklem...@gmail.com wrote:
 I doubt the addition of overflow:hidden has this consequence since
 that has been broadly tested
 in all kinds of browsers and has been default on several wikis for a
 long while.
 
 IIRC, overflow: hidden does indeed cause this issue on at least one
 very old browser, maybe IE5/Mac.  I vaguely recall it being tried but
 reverted sometime several years ago.  If memory serves and IE5/Mac is
 the issue, of course, we should just ignore it, because it would be
 crazy to try supporting it.

He is indeed using IE5 for Mac.

I guess that adding

h1, h2, h3, h4, h5, h6 {
    overflow: visible;
}

to his monobook.css will fix it.
Do we have some CSS trick to target IE5 Mac?
Removing the headers isn't too friendly, so if there's an easy fix, I
would apply it.
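For reference, one widely documented trick is the "IE5/Mac band pass filter", which exploits a comment-parsing quirk (IE5/Mac treats a backslash before `*/` as an escape) so that only that browser sees the enclosed rules. This is from memory and would need testing against the actual browser:

```css
/* Rules between the two filter comments are seen only by IE5/Mac. */
/*\*//*/
h1, h2, h3, h4, h5, h6 {
    overflow: visible;
}
/**/
```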




Re: [Wikitech-l] Missing Section Headings

2011-01-03 Thread Brion Vibber
The fix is to stop using IE for Mac -- it's been unmaintained for almost a
decade and is wildly broken in so many ways that it's pretty much useless
for everyday web browsing.

-- brion
 On Jan 3, 2011 9:58 AM, Platonides platoni...@gmail.com wrote:


Re: [Wikitech-l] Missing Section Headings

2011-01-03 Thread Brion Vibber
On Jan 3, 2011 10:09 AM, Chad innocentkil...@gmail.com wrote:

 On Mon, Jan 3, 2011 at 1:06 PM, Brion Vibber br...@pobox.com wrote:
  The fix is to stop using IE for Mac -- it's been unmaintained for almost
  a decade and is wildly broken in so many ways that it's pretty much
  useless for everyday web browsing.
 
  -- brion

 Fwiw, it still runs in Snow Leopard ;-)

That's how I know it's so awful! ;)

-- brion


 -Chad



Re: [Wikitech-l] Missing Section Headings

2011-01-03 Thread Fred Bauder
 The fix is to stop using IE for Mac -- it's been unmaintained for almost a
 decade and is wildly broken in so many ways that it's pretty much useless
 for everyday web browsing.

 -- brion
  On Jan 3, 2011 9:58 AM, Platonides platoni...@gmail.com wrote:

Actually, it is quite useful for downloading .svg files, as it gives a
message that it doesn't know how to display them and offers an option to
save the file.

I apologize for referring Marc to this list. I could have caught that
myself if I had tried my IE 5.2 for Mac on a Wikipedia page.

Fred Bauder





Re: [Wikitech-l] Missing Section Headings

2011-01-03 Thread Krinkle
 Actually, it is quite useful for downloading .svg files, as it gives a
 message that it doesn't know how to display them and offers an option to
 save the file.

 I apologize for referring Marc to this list. I could have caught that
 myself if I had tried my IE 5.2 for Mac on a Wikipedia page.

 Fred Bauder

You can right-click and choose "Save As…" or something similar in any
other browser that does support SVG files.
No need for IE5 for Mac.

--
Krinkle



Re: [Wikitech-l] What would be a perfect wiki syntax? (Re: WYSIWYG)

2011-01-03 Thread Ryan Kaldari
The perfect wiki syntax would be XML (at least behind the scenes). Then 
people could use whatever syntax they want and have it easily translated 
via XSLT.

Ryan Kaldari

On 1/1/11 9:51 AM, lampak wrote:
 I've been following the discussion and as I can see it's already become
 rather unproductive*. So I hope my cutting in will not be very much out
 of place (even if I don't really know what I'm talking about).

 Many people here have stated the main reason why a WYSIWYG editor is not
 feasible is the current wikitext syntax.

 What's actually wrong with it?

 The main thing I can think of is the fact one template may include an
 opening of a table etc. and another one a closing (e.g. {{col-begin}},
 {{col-end}}). It makes it impossible to isolate the template from the
 rest of the article - draw a frame around it, say "this box here is a
 template".

 It could be fixed by forbidding leaving unclosed tags in templates. As a
 replacement, a kind of foreach loop could be introduced to iterate
 through an unspecified number of arguments.

 Lack of standardisation has also been mentioned. Something else?

 I've tried to think how a perfect parser should work. Most of this has
 been already mentioned. I think it should work in two steps: first
 tokenise the code and transform it into an intermediate tree structure like
  * paragraph
    title:
      * plain text: Section 1
    content:
      * plain text: foo
      * bold text:
        * plain text: bar
      * template
        name: Infobox
        * argument
          name: last name
          value:
            * plain text: Shakespear
 and so on. Then this structure could be transformed into a) HTML for
 display, b) JSON for the WYSIWYG editor. Thanks to this, you wouldn't
 need to write a whole new JS parser. The editor would get a half-ready
 product. The JS code would need to be able to: a) transform this
 structure into HTML, b) modify the structure, c) transform this
 structure back into wikitext.
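The two serializers could be tiny. Here is a sketch with an invented node shape (purely illustrative, not a proposal for the actual intermediate format):

```python
import json

# One intermediate node, two outputs: HTML for display and JSON for
# a WYSIWYG editor. The node shape here is invented for illustration.
tree = {"type": "bold", "children": [{"type": "text", "value": "bar"}]}

def to_html(node):
    if node["type"] == "text":
        return node["value"]
    inner = "".join(to_html(child) for child in node["children"])
    return "<b>%s</b>" % inner if node["type"] == "bold" else inner

print(to_html(tree))
print(json.dumps(tree))  # same tree, handed to the editor as-is
```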

 But I guess it's more realistic to write a new JS parser than to write a
 new PHP parser. The former can start as a stub, the latter would need to
 be fully operational from the beginning.

 Stephanie's suggestions are also interesting.

 lampak

 * (except the WYSIWTF, of course)




[Wikitech-l] JavaScript access to uploaded file contents: SVGEdit gadget needs ApiSVGProxy or CORS

2011-01-03 Thread Brion Vibber
I did a little more hacking on the SVGEdit extension this weekend:

http://www.mediawiki.org/wiki/Extension:SVGEdit

The extension now uses SVG-edit's iframe embedding API, which lets us host
the actual editor widget on a separate domain from MediaWiki. This also
means that it's a short step to being able to slap together the smaller
MediaWiki-side JS/CSS code as a gadget, which could be deployed by wiki
admins without requiring system-level access to install the extension:

http://code.google.com/p/svg-edit/issues/detail?id=747

The primary holdup to being able to deploy it to Wikimedia sites is that
scripts running in the MediaWiki context won't have direct access to the
contents of files on upload.wikimedia.org. That means we can't load the
current version of the file into the editor, which brings things to a nice
halt. :(


My SVGEdit wrapper code is currently using the ApiSVGProxy extension to read
SVG files via the local MediaWiki API. This seems to work fine locally, but
it's not enabled on Wikimedia sites, and likely won't be generally around;
it looks like Roan threw it together as a test, and I'm not sure if
anybody's got plans on keeping it up or merging to core.

Since ApiSVGProxy serves SVG files directly out on the local domain as their
regular content type, it potentially has some of the same safety concerns as
img_auth.php and local hosting of upload files. If that's a concern
preventing rollout, would alternatives such as wrapping the file data &
metadata into a JSON structure be acceptable?


Alternately, we could look at using HTTP access control headers on
upload.wikimedia.org, to allow XMLHTTPRequest in newer browsers to make
unauthenticated requests to upload.wikimedia.org and return data directly:

https://developer.mozilla.org/En/HTTP_Access_Control

That would allow the front-end code to just pull the destination URLs from
imageinfo and fetch the image data directly. It also has the advantage that
it would work for non-SVG files; advanced HTML5 image editing tools using
canvas could benefit from being able to load and save PNG and JPEG images as
well.

https://bugzilla.wikimedia.org/show_bug.cgi?id=25886 requests this for
bits.wikimedia.org (which carries the stylesheets and such).
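Per enable-cors.org, the server-side change itself is small. A hypothetical mod_headers snippet (the exact origin policy appropriate for upload.wikimedia.org would of course need review before deployment):

```apache
# Allow cross-origin XHR reads of public files (requires mod_headers).
<IfModule mod_headers.c>
    Header set Access-Control-Allow-Origin "*"
</IfModule>
```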


In the meantime I'll probably work around it with an SVG-to-JSONP proxy on
toolserver for the gadget, which should get things working while we sort it
out.

-- brion vibber (brion @ pobox.com)


Re: [Wikitech-l] Missing Section Headings

2011-01-03 Thread Marco Schuster
On Mon, Jan 3, 2011 at 6:59 PM, Platonides platoni...@gmail.com wrote:
 He is indeed using IE5 for Mac.

 I guess that adding
 h1, h2, h3, h4, h5, h6 {
    overflow: visible;
 }

 to his monobook.css will fix it.
 Do we have some CSS trick to target IE5 Mac?
 Removing the headers isn't too friendly, so if there's an easy fix, I
 would apply it.
There is some CSS conditional stuff for IE5/Mac; didn't we have a CSS
fix file especially for this browser?

Marco

-- 
VMSoft GbR
Nabburger Str. 15
81737 München
Geschäftsführer: Marco Schuster, Volker Hemmert
http://vmsoft-gbr.de


Re: [Wikitech-l] Missing Section Headings

2011-01-03 Thread Chad
On Mon, Jan 3, 2011 at 3:22 PM, Marco Schuster
ma...@harddisk.is-a-geek.org wrote:
 There is some CSS conditional stuff for IE5/Mac, didn't we have a CSS
 fix file especially for this browser?


Yes there was such a file. I deleted it because it wasn't
being used and it didn't actually fix anything.

-Chad



Re: [Wikitech-l] SpecialPages and Related users and titles

2011-01-03 Thread Aryeh Gregor
On Mon, Jan 3, 2011 at 12:23 AM, MZMcBride z...@mzmcbride.com wrote:
 It looks like a nice usability fix. :-)  (Now to get Special:MovePage turned
 into ?action=move)

I'd do the opposite -- stop using actions other than view, and move
everything to special pages.  (Of course we'd still support the
old-style URLs forever for compat, just not generate them anywhere.)
The set of things that are done by actions is small, fixed, and
incoherent: edit, history, delete, protect, watch; but not move,
undelete, export, logs, related changes.  The distinction is
historical -- I assume most or all of the action-based ones came about
before we had such a thing as special pages.  It would be cleaner if
we only had special pages for doing non-view actions.


Re: [Wikitech-l] Introduce phpQuery into MediaWiki?

2011-01-03 Thread Aryeh Gregor
On Mon, Jan 3, 2011 at 11:59 AM, Brion Vibber br...@pobox.com wrote:
 phpQuery itself builds on the DOM module already in PHP, so be aware that
 using it for this purpose is equivalent to using the DOM & XPath functions
 already available.

 For one thing this means that HTML will have to be run through the libxml2
 HTML parser (which I have found is very sketchy with perfectly legal implied
 close tags and such). In addition to memory and performance concerns of
 parsing the whole document into a DOM tree and reserializing it, you might
 not get back the structure you put in... hopefully no surprises but keep an
 eye out.

In theory, this problem should go away in a few years when everyone
converges on HTML5 parsing.  I think you can get a PHP HTML5 parser,
which is compatible with browser parsing, but the performance probably
isn't so good, and I don't know how well-maintained it is.
("Compatible with browser parsing" means identical to Firefox 4 and
WebKit nightly parsing, and compatible enough with how they used to
parse things that no appreciable number of sites have broken in the
new browser versions.)

That said, we do generally output well-formed XML or something quite
close to it, so the cases where PHP's DOM library will do something
unexpected should be reasonably limited.

I thought we had compatibility problems with users who didn't have the
DOM module installed, including default RHEL5 configuration IIRC?  Or
was that something else?



Re: [Wikitech-l] JavaScript access to uploaded file contents: SVGEdit gadget needs ApiSVGProxy or CORS

2011-01-03 Thread Aryeh Gregor
On Mon, Jan 3, 2011 at 3:22 PM, Brion Vibber br...@pobox.com wrote:
 Since ApiSVGProxy serves SVG files directly out on the local domain as their
 regular content type, it potentially has some of the same safety concerns as
  img_auth.php and local hosting of upload files. If that's a concern
  preventing rollout, would alternatives such as wrapping the file data &
  metadata into a JSON structure be acceptable?

Would it be enough to serve it with Content-Disposition: attachment?
I'd think that should block all direct use but still allow XHR to work
(although I'm not totally sure).



Re: [Wikitech-l] JavaScript access to uploaded file contents: SVGEdit gadget needs ApiSVGProxy or CORS

2011-01-03 Thread Neil Kandalgaonkar
On 1/3/11 12:22 PM, Brion Vibber wrote:

 Alternately, we could look at using HTTP access control headers on
 upload.wikimedia.org, to allow XMLHTTPRequest in newer browsers to make
 unauthenticated requests to upload.wikimedia.org and return data directly:

 https://developer.mozilla.org/En/HTTP_Access_Control

 That would allow the front-end code to just pull the destination URLs from
 imageinfo and fetch the image data directly.

Yes. I have no trouble only enabling this for modern browsers, with just 
Apache config. SVG isn't even available on any version of IE in general 
use, including IE8.

This doesn't seem to be terribly hard to config in Apache. Looks like 
something Commons should be doing generally for its image servers.

   http://enable-cors.org/

Michael Dale is the expert on proxying though, and it has a more legit 
use case for simpler uploads and searching. Any thoughts, Michael?


-- 
Neil Kandalgaonkar ne...@wikimedia.org



Re: [Wikitech-l] JavaScript access to uploaded file contents: SVGEdit gadget needs ApiSVGProxy or CORS

2011-01-03 Thread Brion Vibber
On Mon, Jan 3, 2011 at 1:24 PM, Neil Kandalgaonkar ne...@wikimedia.org wrote:

 On 1/3/11 12:22 PM, Brion Vibber wrote:

  Alternately, we could look at using HTTP access control headers on
 upload.wikimedia.org, to allow XMLHTTPRequest in newer browsers to make
 unauthenticated requests to upload.wikimedia.org and return data
 directly:

 https://developer.mozilla.org/En/HTTP_Access_Control

 That would allow the front-end code to just pull the destination URLs from
 imageinfo and fetch the image data directly.


 Yes. I have no trouble only enabling this for modern browsers, with just
 Apache config. SVG isn't even available on any version of IE in general use,
 including IE8.


Note that SVGEdit does work on IE9 preview, if using the latest editor code!

-- brion


Re: [Wikitech-l] What would be a perfect wiki syntax? (Re: WYSIWYG)

2011-01-03 Thread George Herbert
On Sun, Jan 2, 2011 at 6:28 AM, Jay Ashworth j...@baylink.com wrote:
 [...]
 This has been done a dozen times in the last 5 years, lampak.  The short
 version, as much as *I* am displeased with the fact that we'll never have
 *bold*, /italic/ and _underscore_, is that the installed base, both of
 articles and editors, means that Mediawikitext will never change.


That we've multiply concluded that it will never change doesn't mean
it won't; as a thought exercise, as I suggested in OtherThread, we
should consider negating that conclusion and seeing what happens.


-- 
-george william herbert
george.herb...@gmail.com



Re: [Wikitech-l] SpecialPages and Related users and titles

2011-01-03 Thread Ilmari Karonen
On 01/03/2011 11:09 PM, Aryeh Gregor wrote:
 On Mon, Jan 3, 2011 at 12:23 AM, MZMcBridez...@mzmcbride.com  wrote:
 It looks like a nice usability fix. :-)  (Now to get Special:MovePage turned
 into ?action=move)

 I'd do the opposite -- stop using actions other than view, and move
 everything to special pages.  (Of course we'd still support the
 old-style URLs forever for compat, just not generate them anywhere.)
 The set of things that are done by actions is small, fixed, and
 incoherent: edit, history, delete, protect, watch; but not move,
 undelete, export, logs, related changes.  The distinction is
 historical -- I assume most or all of the action-based ones came about
 before we had such a thing as special pages.  It would be cleaner if
 we only had special pages for doing non-view actions.

I think we've had both actions and special pages from the beginning 
(well, since r403 at least).  I suspect it's just that adding new 
special pages was easier than adding new actions, so people tended to 
pick the path of least resistance.

-- 
Ilmari Karonen



Re: [Wikitech-l] What would be a perfect wiki syntax? (Re: WYSIWYG)

2011-01-03 Thread Rob Lanphier
On Mon, Jan 3, 2011 at 4:59 PM, George Herbert george.herb...@gmail.com wrote:
 That we've multiply concluded that it will never change doesn't mean
 it won't; as a thought exercise, as I suggested in OtherThread, we
 should consider negating that conclusion and seeing what happens.

Agreed.  I think part of the problem in the past is that the
conversation generally focused on the actual syntax, and not enough on
the incremental changes that we can make to MediaWiki to make this
happen.

If, for example, we can build some sort of per-revision indicator of
markup language (sort of similar to mime type) which would let us
support multiple parsers on the same wiki, then it would be possible
to build alternate parsers that people could try out on a per-article
basis (and more importantly, revert if it doesn't pan out).  The
thousands of MediaWiki installs could try out different syntax
options, and maybe a clear winner would emerge.
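A minimal sketch of such dispatch-by-format (all names here are invented; nothing like this exists in MediaWiki core):

```python
# Map a per-revision markup-format tag (analogous to a MIME type) to a
# parser implementation; unknown formats fall back to the legacy parser.
PARSERS = {}

def register(fmt):
    def decorator(fn):
        PARSERS[fmt] = fn
        return fn
    return decorator

@register("wikitext/legacy")
def parse_legacy(text):
    return "<p>%s</p>" % text  # placeholder, not a real parser

def render(revision):
    fmt = revision.get("format", "wikitext/legacy")
    parser = PARSERS.get(fmt, parse_legacy)
    return parser(revision["text"])

print(render({"text": "hello", "format": "wikitext/legacy"}))
```

Reverting an experiment then amounts to re-saving a page under the old format tag; old revisions keep rendering with the parser they were written for.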

Rob



Re: [Wikitech-l] What would be a perfect wiki syntax? (Re: WYSIWYG)

2011-01-03 Thread Chad
On Mon, Jan 3, 2011 at 8:41 PM, Rob Lanphier ro...@wikimedia.org wrote:
 If, for example, we can build some sort of per-revision indicator of
 markup language (sort of similar to mime type) which would let us
 support multiple parsers on the same wiki, then it would be possible
 to build alternate parsers that people could try out on a per-article
 basis (and more importantly, revert if it doesn't pan out).  The
 thousands of MediaWiki installs could try out different syntax
 options, and maybe a clear winner would emerge.


Or you end up supporting 5 different parsers that people like
for slightly different reasons :)

-Chad


Re: [Wikitech-l] What would be a perfect wiki syntax? (Re: WYSIWYG)

2011-01-03 Thread Rob Lanphier
On Mon, Jan 3, 2011 at 5:54 PM, Chad innocentkil...@gmail.com wrote:
 On Mon, Jan 3, 2011 at 8:41 PM, Rob Lanphier ro...@wikimedia.org wrote:
 If, for example, we can build some sort of per-revision indicator of
 markup language (sort of similar to mime type) which would let us
 support multiple parsers on the same wiki, then it would be possible
 to build alternate parsers that people could try out on a per-article
 basis (and more importantly, revert if it doesn't pan out).  The
 thousands of MediaWiki installs could try out different syntax
 options, and maybe a clear winner would emerge.

 Or you end up supporting 5 different parsers that people like
 for slightly different reasons :)

Yup, that would definitely be a strong possibility without a
disciplined approach.  However, done correctly, killing off fringe
parsers on a particular wiki would be fairly easy to do.  Just because
the underlying wiki engine allows for 5 different parsers, doesn't
mean a particular wiki would need to allow the creation of new pages
or new revisions using any of the 5.  If we build the tools that allow
admins some ability to constrain the choices, it doesn't have to get
too out of hand on a particular wiki.

If we were to go down this development path, we'd need to commit ahead
of time to be pretty stingy about what we bless as a supported
parser, and brutal about killing off support for outdated parsers.

Rob



Re: [Wikitech-l] Big problem to solve: good WYSIWYG on WMF wikis

2011-01-03 Thread Andrew Dunbar
On 3 January 2011 21:54, Andreas Jonsson andreas.jons...@kreablo.se wrote:
 2010-12-29 08:33, Andrew Dunbar skrev:
 I've thought a lot about this too. It certainly is not any type of
 standard grammar. But on the other hand it is a pretty common kind of
 nonstandard grammar. I call it a recursive text replacement grammar.

 Perhaps this type of grammar has some useful characteristics we can
 discover and document. It may be possible to follow the code flow and
 document each text replacement in sequence as a kind of parser spec
 rather than trying and failing again to shoehorn it into a standard
 LALR grammar.

 If it is possible to extract such a spec it would then be possible to
 implement it in other languages.

 Some research may even find that is possible to transform such a
 grammar deterministically into an LALR grammar...

  But even if not I'm certain it would demystify what happens in the
  parser so that problems and edge cases would be easier to locate.

 From my experience of implementing a wikitext parser, I would say that
 it might be possible to transform wikitext to a token stream that is
 possible to parse with a LALR parser.  My implementation
 (http://svn.wikimedia.org/svnroot/mediawiki/trunk/parsers/libmwparser)
  uses Antlr (which is an LL parser generator) and only relies on
  context-sensitive parsing (Antlr's semantic predicates) for parsing
  apostrophes (bold and italics), and this might be possible to solve in
  a different way.  The rest of the complex cases are handled by the
  lexical analyser, which produces a well-behaved token stream that can
  be parsed relatively straightforwardly.

 My implementation is not 100% compatible, but I think that a 100%
 compatible parser is not desirable since the most exotic border cases
 would probably be characterized as bugs anyway (e.g. [[Link|table
 class=]]).  But I think that the basic idea can be used to produce
 a sufficiently compatible parser.

In that case what is needed is to hook your parser into our current code
and get it to create output, if you have not done that already. Then you
will want to run the existing parser tests on it. Then you will want to
run both parsers over a large sample of existing Wikipedia articles (make
sure you use the same revisions for both parsers!) and run them through
diff. Then we'll have a decent idea of whether there are any edge cases
you didn't spot or whether any of them are exploited in template magic.
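In outline, the comparison step could look like this (the two parser functions are stand-ins, not real implementations):

```python
# Run two parser implementations over the same revisions and collect
# the inputs on which their output diverges.
def legacy_parse(text):
    return "<p>%s</p>" % text  # stand-in for the existing parser

def new_parse(text):
    return "<p>%s</p>" % text  # stand-in for the candidate parser

revisions = ["foo", "''bar''", "== baz =="]
divergent = [r for r in revisions if legacy_parse(r) != new_parse(r)]
print("divergent revisions:", divergent)  # ideally an empty list
```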

Let us know the results!

Andrew Dunbar (hippietrail)


 Best Regards,

 /Andreas




Re: [Wikitech-l] Does anybody have the 20080726 dump version?

2011-01-03 Thread Monica shu
File download is done. Thanks Anthony for sharing the data and Huib for
providing the server.


On Sat, Jan 1, 2011 at 7:38 AM, Anthony wikim...@inbox.org wrote:

 File transfer is done.  Thanks for helping with the transfer.

 Anthony

 On Fri, Dec 31, 2010 at 8:28 AM, Huib Laurens sterke...@gmail.com wrote:
  If it fails I can give you access in other ways; it's a dedicated
  server that doesn't have a job right now...
 
  2010/12/31, Anthony wikim...@inbox.org:
  On Fri, Dec 31, 2010 at 1:47 AM, Huib Laurens sterke...@gmail.com
 wrote:
  Okay, I emailed to Anthony how he can upload it.
 
  Transfer is in progress.  ETA about 10 hours.  md5sum is
  30c9b48de3ede527289bcdb810126723
 
  Hopefully there aren't any problems, as I'm not quite sure how to
  resume an upload over FTP.
 
 
 
  --
  Verzonden vanaf mijn mobiele apparaat
 
  Regards,
  Huib Abigor Laurens
 
 
 
  Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate
 
 




[Wikitech-l] MediaWiki security release 1.16.1

2011-01-03 Thread Tim Starling

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

I would like to announce the release of MediaWiki 1.16.1, which is a
security and maintenance release.

Wikipedia user PleaseStand pointed out that MediaWiki has no
protection against clickjacking. With user or site JavaScript or CSS
enabled, clickjacking can lead to cross-site scripting (XSS), and thus
full compromise of the wiki account of any user who visits a malicious
external site. Clickjacking affects all previous versions of MediaWiki.

Our fix involves denying framing on all pages except normal page views
and a few selected special pages. To be protected, all users need to
use a browser which supports X-Frame-Options. For information about
supported browsers, see:

https://developer.mozilla.org/en/the_x-frame-options_response_header
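The mechanism itself is just a response header. A minimal sketch of the shape of the fix (Python, purely illustrative; MediaWiki's actual implementation is in PHP, and the exempt prefixes here are invented for the example):

```python
# Add "X-Frame-Options: DENY" to every response except exempted normal
# page views, mirroring the shape of the fix described above.
# (Illustrative sketch; the exempt prefixes are made up.)
def apply_frame_guard(path, headers, exempt_prefixes=("/wiki/",)):
    if any(path.startswith(p) for p in exempt_prefixes):
        return headers
    return headers + [("X-Frame-Options", "DENY")]

print(apply_frame_guard("/w/index.php?title=Special:Upload", []))
# [('X-Frame-Options', 'DENY')]
```

A browser that honours the header then refuses to render those pages inside a frame, which is what defeats the clickjacking overlay.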

For more information about this vulnerability and the related patch, see:

https://bugzilla.wikimedia.org/show_bug.cgi?id=26561

Other changes in MediaWiki 1.16.1:

* (bug 24981) Allow extensions to access SpecialUpload variables again
* (bug 24724) list=allusers was out by 1 (shows total users - 1)
* (bug 24166) Fixed API error when using rvprop=tags
* For wikis using French as a content language, Special:Téléchargement
works again as an alias for Special:Upload.
* (bug 25167) Correctly load JS fixes for IE6 (fixing a regression in
1.16.0)
* (bug 25248) Fixed paraminfo errors in certain API modules.
* The installer now has improved handling for situations where
safe_mode is active or exec() and similar functions are disabled.
* (bug 19593) Specifying --server now works for all maintenance
scripts.
* Fixed a register_globals issue involving $wgLicenseTerms.

Full release notes:
http://svn.wikimedia.org/svnroot/mediawiki/tags/REL1_16_1/phase3/RELEASE-NOTES

**
Download:
http://download.wikimedia.org/mediawiki/1.16/mediawiki-1.16.1.tar.gz

Patch to previous version (1.16.0), without interface text:
http://download.wikimedia.org/mediawiki/1.16/mediawiki-1.16.1.patch.gz
Interface text changes:
http://download.wikimedia.org/mediawiki/1.16/mediawiki-i18n-1.16.1.patch.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.16/mediawiki-1.16.1.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.16/mediawiki-1.16.1.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.16/mediawiki-i18n-1.16.1.patch.gz.sig

Public keys:
https://secure.wikimedia.org/keys.html

-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.10 (GNU/Linux)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org/

iEYEARECAAYFAk0ixHAACgkQgkA+Wfn4zXmOcgCePqvDrlaw1FZLbtOfx/3tEIID
GQkAn3eSSdTbBCOqXLvXNiG4Vm0kXl7r
=haR1
-----END PGP SIGNATURE-----



Re: [Wikitech-l] What would be a perfect wiki syntax? (Re: WYSIWYG)

2011-01-03 Thread Alex Brollo
2011/1/4 Rob Lanphier ro...@robla.net

 On Mon, Jan 3, 2011 at 5:54 PM, Chad innocentkil...@gmail.com wrote:
  On Mon, Jan 3, 2011 at 8:41 PM, Rob Lanphier ro...@wikimedia.org
 wrote:
  If, for example, we can build some sort of per-revision indicator of
  markup language (sort of similar to mime type) which would let us
  support multiple parsers on the same wiki, then it would be possible
  to build alternate parsers that people could try out on a per-article
  basis (and more importantly, revert if it doesn't pan out).  The
  thousands of MediaWiki installs could try out different syntax
  options, and maybe a clear winner would emerge.
 
  Or you end up supporting 5 different parsers that people like
  for slightly different reasons :)

 Yup, that would definitely be a strong possibility without a
 disciplined approach.  However, done correctly, killing off fringe
 parsers on a particular wiki would be fairly easy to do.  Just because
 the underlying wiki engine allows for 5 different parsers, doesn't
 mean a particular wiki would need to allow the creation of new pages
 or new revisions using any of the 5.  If we build the tools that allow
 admins some ability to constrain the choices, it doesn't have to get
 too out of hand on a particular wiki.
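The per-revision indicator plus the admin-constrained choice could be sketched as a small registry (hypothetical names; nothing like this exists in MediaWiki today):

```python
class ParserRegistry:
    """Map a per-revision markup-model tag to a parser callable, with a
    wiki-level allow list constraining which models new edits may use."""
    def __init__(self, allowed_for_new_edits):
        self._parsers = {}
        self._allowed = set(allowed_for_new_edits)

    def register(self, model, parser):
        self._parsers[model] = parser

    def parse(self, model, text):
        # Old revisions stay renderable even for retired models.
        return self._parsers[model](text)

    def check_new_edit(self, model):
        # New revisions are restricted to the models the admins bless.
        if model not in self._allowed:
            raise ValueError(f"markup model {model!r} not enabled here")
```

An admin could then shrink `allowed_for_new_edits` to kill off a fringe parser for new revisions while every old revision keeps rendering with the parser it was written for.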

 If we were to go down this development path, we'd need to commit ahead
 of time to be pretty stingy about what we bless as a supported
 parser, and brutal about killing off support for outdated parsers.

 Rob

