Re: [Wikitech-l] Data Processing

2009-09-28 Thread 李琴
Why can one page_title have two different page_ids? For example, the page_title
'USA' has two page_ids, '98937' and '112696'.
 


Thanks

   vanessa 
lee
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Data Processing

2009-09-28 Thread Max Semenik
On 28.09.2009, 16:54 李琴 wrote:

 Why can one page_title have two different page_ids? For example, the page_title
 'USA' has two page_ids, '98937' and '112696'.
  

Because they have different page_namespace ;)




Re: [Wikitech-l] Software updates Wednesday morning

2009-09-28 Thread Robert Ullmann
Hi,

On Wed, Sep 23, 2009 at 8:28 PM, Platonides platoni...@gmail.com wrote:
 Aryeh Gregor wrote:

 I've been meaning to investigate this, but haven't found the time yet.
  Have you come up with a minimal test case, or filed a bug with
 Mozilla?  I'd be willing to look at this if I get the time, but I
 don't know how soon I will get the time, so it would help if someone
 else tried to debug it.  Does it occur if you turn off JavaScript
 and/or CSS?

 I tried with Firefox 3.5.2 on XP and still couldn't reproduce it.
 Reloading the page produced a CPU spike, but nothing that led me to
 attribute it to the reported behavior rather than normal rendering.
 I wonder if there might be an extension/badware checking all abbr to
 include ads on relevant keywords.

Got it; it had to be something not everyone was using (else everyone
would be screaming), so I went back through my standard set of stuff
(several font sets, other stuff; it wasn't Java/JS/CSS).

On the wiktionary (and for some things on the 'pedia), Ruby support is
useful, so I have the extension. But it also has the serious
mis-feature of trying to improve on abbr tags, and a serious bug
when there are lots of them.

The fix is to disable the Ruby extension, or edit about:config and
change rubysupport.expand.list to remove abbr.

Should be noted somewhere (and reported as a bug on the extension, but
I have no idea where to do that).

Thanks for your help, Robert


[Wikitech-l] js2 coding style for html output

2009-09-28 Thread Michael Dale
My attachment did not make it into the JS2 design thread... and that
thread is in summary mode, so here is a new post around the html output
question. Which of the following constructions is easier to read and
understand? Is there some tab delimitation format we should use to make
the jquery builder format easier? Are performance considerations
relevant? (Email is probably a bad context for comparison, since tabs
will get messy and there is no syntax highlighting.)

Tim suggested that in a security review context the dojBuild-type html
output is more straightforward to review.

I think both are useful, and I like jquery-style building of html since
it gives you direct syntax errors rather than html parse errors, which
are not as predictable across browsers. But sometimes, performance-wise
or from a quick get-it-working perspective, it's easier to write out an
html string. Also, I think tabbed html is a bit easier on the eyes for
someone who has dealt a lot with html.

Something that's not fun about jquery style is that there are many ways to
build the same html string, using .wrap or any of a dozen other jquery
html manipulation functions ... so the same html could be structured
very differently in the code. Furthermore, a jquery chain can get pretty
long or be made up of lots of other vars, potentially making it tricky
to rearrange things or identify what html is coming from where.

But perhaps that could be addressed by having jquery html construction
conventions (or a wrapper that mirrors our php-side html construction
conventions?).
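Purely as an illustration of that wrapper idea (the names below are mine, not an existing MediaWiki API), a tiny builder mirroring the php-side Html::element() convention of element( tag, attributes, text ) might look like:

```javascript
// Hypothetical sketch: escape once, in one place, instead of in every
// hand-concatenated html string.
function htmlEscape( s ){
	return String( s )
		.replace( /&/g, '&amp;' )
		.replace( /</g, '&lt;' )
		.replace( />/g, '&gt;' )
		.replace( /"/g, '&quot;' );
}

// Build '<tag attr="…">text</tag>' with all attribute values and text escaped.
function element( tag, attrs, text ){
	var out = '<' + tag;
	for ( var name in attrs ){
		out += ' ' + name + '="' + htmlEscape( attrs[ name ] ) + '"';
	}
	return out + '>' + htmlEscape( text ) + '</' + tag + '>';
}
```

The win over raw string concatenation is that escaping is centralized, while the output is still a plain string you can append in one dom operation.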

In general I have used the html output style, but did not really think
about it a priori, and I am open to transitioning to more jquery-style
output.

Here is the html; you can copy and paste this in... On my system, a Firefox
nightly, the string builder hovers around 20ms while the jquery builder
hovers around 150ms (hard to say what would be a good target number of dom
actions or what is a fair test...). jquery could, for example, output to
a variable instead of directly to the dom, shaving 10ms or so, and many
other tweaks are possible.

<html>
<head>
<title>jQuery vs str builder</title>
<script type="text/javascript"
src="http://jqueryjs.googlecode.com/files/jquery-1.3.2.min.js"></script>
<script type="text/javascript">
var repetCount = 200;
function runTest( mode ){
    $('#cat').html('');
    var t0 = new Date().getTime();
    if( mode == 'str' ){
        doStrBuild();
    }else{
        dojBuild();
    }
    $('#rtime').html( (new Date().getTime() - t0) + 'ms' );
}

function doStrBuild(){
    var o = '';
    for( var i = 0; i < repetCount; i++ ){
        o += '<span id="' + escape(i) + '" class="fish">' +
            '<p class="dog" rel="foo">' +
            escape(i) +
            '</p>' +
            '</span>';
    }
    $('#cat').append(o);
}
function dojBuild(){
    for( var i = 0; i < repetCount; i++ ){
        $('<span/>')
            .attr({
                'id': i,
                'class': 'fish'
            })
            .append( $('<p/>')
                .attr({
                    'class': 'dog',
                    'rel': 'foo'
                })
                .text( i )
            ).appendTo('#cat');
    }
}
</script>
</head>
<body>
<h3>jQuery vs dom insert</h3>
Run Time: <span id="rtime"></span><br>
<a onClick="runTest('str');" href="#">Run Str</a><br>
<a onClick="runTest('dom');" href="#">Run jQuery</a><br>
<br>
<div id="cat"></div>

</body>
</html>

--michael



Re: [Wikitech-l] js2 coding style for html output

2009-09-28 Thread Tei
On Mon, Sep 28, 2009 at 6:44 PM, Michael Dale md...@wikimedia.org wrote:
..
 I think both are useful and I like jquery style building of html since
 it gives you direct syntax errors rather than html parse errors which
 are not as predictable across browsers. But sometimes performance wise
 or from a quick get it working perspective its easier to write out an
 html string. Also I think tabbed html is a bit easier on the eyes for
 someone that has dealt a lot with html.

Probably not the intent of your message, but your first and second
examples can be mixed:

function dojBuild2(){
    var box = document.createElement("div");
    for( var i = 0; i < repetCount; i++ ){
        var thing = document.createElement("span");
        thing.innerHTML = '<span id="' + escape(i) + '" class="fish">' +
            '<p class="dog" rel="foo">' +
            escape(i) +
            '</p>' +
            '</span>';

        box.appendChild(thing);
    }

    document.getElementById("cat").appendChild(box);
}

What I think we have here is that $('#cat') is expensive, and runs
inside a loop in dojBuild.

Since your post is about coding style, and not performance (and not
about the particular speed of this style), feel free to ignore this post.

-- 
--
ℱin del ℳensaje.


Re: [Wikitech-l] js2 coding style for html output

2009-09-28 Thread Michael Dale
[snip]
 what I think we have here, is that  $('#cat') is expensive, and run
 inside a loop in dojBuild
   
You can build and append in the jquery version and it only shaves 10ms;
i.e., the following still incurs the jquery html-building function call costs:

function dojBuild(){
    var o = '';
    for( var i = 0; i < repetCount; i++ ){
        o += $('<span/>')
            .attr({
                'id': i,
                'class': 'fish'
            })
            .append( $('<p/>')
                .attr({
                    'class': 'dog',
                    'rel': 'foo'
                })
                .text( i )
            ).html();
    }
    $('#cat').append(o);
}




Re: [Wikitech-l] Software updates Wednesday morning

2009-09-28 Thread Brion Vibber
On 9/28/09 7:57 AM, Robert Ullmann wrote:
 On the wiktionary (and for some things on the 'pedia), Ruby support is
 useful, so I have the extension. But it also has the serious
 mis-feature of trying to improve on abbr tags, and a serious bug
 when there are lots of them.

 Fix is to disable the Ruby extension, or edit about:config and
 change rubysupport.expand.list to remove abbr.

 Should be noted somewhere (and reported as a bug on the extension, but
 I have no idea where to do that).

If it's this extension, it looks like there's a support board or you can 
e-mail the author directly with bug reports; links are on this page:

http://piro.sakura.ne.jp/xul/_rubysupport.html.en

And I'll have to try that ext out sometime... wacky Japanese ruby fun! :D

-- brion



[Wikitech-l] Announce: Brion moving to StatusNet

2009-09-28 Thread Brion Vibber
I'd like to share some exciting news with you all... After four awesome
years working for the Wikimedia Foundation full-time, next month I'm
going to be starting a new position at StatusNet, leading development on
the open-source microblogging system which powers identi.ca and other sites.

I've been contributing to StatusNet (formerly Laconica) as a user, bug
reporter, and patch submitter since 2008, and I'm really excited at the
opportunity to get more involved in the project at this key time as we
gear up for a 1.0 release, hosted services, and support offerings.

StatusNet was born in the same free-culture and free-software community
that brought me to Wikipedia; many of you probably already know founder
Evan Prodromou from his longtime work in the wiki community, launching
the awesome Wikitravel and helping out with MediaWiki development on
various fronts. The big idea driving StatusNet is rebalancing power in
the modern social web -- pushing data portability and open protocols to
protect your autonomy from siloed proprietary services... People need
the ability to control their own presence on the web instead of hoping
Facebook or Twitter will always treat them the way they want.

This does unfortunately mean that I'll have less time for MediaWiki as
I'll be leaving my position as Wikimedia CTO sooner than originally
anticipated, but that doesn't mean I'm leaving the Wikimedia community
or MediaWiki development!

Just as I was in the MediaWiki development community before Wikimedia
hired me, you'll all see me in the same IRC channels and on the same
mailing lists... I know this is also a busy time with our fundraiser
coming up and lots of cool ongoing developments, so to help ease the
transition I've worked out a commitment to come into the WMF office one
day a week through the end of December to make sure all our tech staff
has a chance to pick my brain as we smooth out the code review processes
and make sure things are as well documented as I like to think they are. ;)

We've got a great tech team here at Wikimedia, and we've done so much
with so little over the last few years. A lot of really good work is
going on now, modernizing both our infrastructure and our user
interface... I have every confidence that Wikipedia and friends will
continue to thrive!

I'll start full-time at StatusNet on October 12. My key priorities until
then are getting some of our key software rollouts going, supporting the
Usability Initiative's next scheduled update and getting a useful but
minimally-disruptive Flagged Revisions configuration going on English
Wikipedia. I'm also hoping to make further improvements to our code
review process, based on my experience with our recent big updates as
well as the git-based workflow we're using at StatusNet -- I've got a
lot of great ideas for improving the CodeReview extension...

Erik Moeller will be the primary point of contact for WMF tech
management issues starting October 12, until the new CTO is hired. I'll
support the hiring process as much as I can, and we're hoping to have a
candidate in the door by the end of the year.

-- brion vibber (brion @ wikimedia.org)
CTO, Wikimedia Foundation
San Francisco




Re: [Wikitech-l] JS2 design (was Re: Working towards branching MediaWiki 1.16)

2009-09-28 Thread Michael Dale
Tim Starling wrote:
 Michael Dale wrote:
   
 That is part of the idea of centrally hosting reusable client-side 
 components so we control the jquery version and plugin set. So a
 new version won't come along until its been tested and
 integrated.
 

 You can't host every client-side component in the world in a
 subdirectory of the MediaWiki core. Not everyone has commit access to
 it. Nobody can hope to properly test every MediaWiki extension.

 Most extension developers write an extension for a particular site,
 and distribute their code as-is for the benefit of other users. They
 have no interest in integration with the core. If they find some
 jQuery plugin on the web that defines an interface that conflicts with
 MediaWiki, say jQuery.load() but with different parameters, they're
 not going to be impressed when you tell them that to make it work with
 MediaWiki, they need to rewrite the plugin and get it tested and
 integrated.

 Different modules should have separate namespaces. This is a key
 property of large, maintainable systems of code.
   

Right... I agree the client-side code needs to be more modularly deployable.

If designing a given component as a jquery plug-in, then I think it
makes sense to put it in the jQuery namespace ... otherwise you won't be
able to reference jquery things in a predictable way. Alternatively you


 I agree that the present system of parsing top of the javascipt
 file on every script-loader generation request is un-optimized.
 (the idea is those script-loader generations calls happen rarely
 but even still it should be cached at any number of levels. (ie
 checking the filemodifcation timestamp, witting out a php or
 serialized file .. or storing it in any of the other cache levels
 we have available, memcahce, database, etc )
 

 Actually it parses the whole of the JavaScript file, not the top, and
 it does it on every request that invokes WebStart.php, not just on
 mwScriptLoader.php requests. I'm talking about
 jsAutoloadLocalClasses.php if that's not clear.
   
Ah right... previously I had it in php. I wanted to avoid listing it
twice, but obviously that's a pretty costly way to do that.
This will make more sense to put in php if we start splitting up
components into the extension folders and generating the path list
dynamically for a given feature set.

 Have you looked at the profiling? On the Wikimedia app servers,
 even the simplest MW request takes 23ms, and gen=js takes 46ms. A
 static file like wikibits.js takes around 0.5ms. And that's with
 APC. You say MW on small sites is OK, I think it's slow and
 resource-intensive.

 That's not to say I'm sold on the idea of a static file cache, it
  brings its own problems, which I listed.

   
 yea... but almost all script-loader request will be cached.  it
 does not need to check the DB or anything its just a key-file
 lookup (since script-loader request pass a request key either its
 there in cache or its not ...it should be on par with the simplest
 MW request. Which is substantially shorter then around trip time
 for getting each script individually, not to mention gziping which
 can't otherwise be easily enabled for 3rd party installations.
 

 I don't think that that comparison can be made so lightly. For the
 server operator, CPU time is much more expensive than time spent
 waiting for the network. And I'm not proposing that the client fetches
 each script individually, I'm proposing that scripts be concatentated
 and stored in a cache file which is then referenced directly in the HTML.
   

I understand. We could even check gzip support at page output time
and point to the gzipped cached versions (analogous to making direct
links to the /script-cache folder of the present script-loader
setup).

My main question is how this will work for dynamic groups of scripts set
post page load that are dictated by user interaction or client state.

It's not as easy to set up static combined output files to point to when
you don't know what set of scripts you will be requesting ahead of time.

 $wgSquidMaxage is set to 31 days (2678400 seconds) for all wikis
 except wikimediafoundation.org. It's necessary to have a very long
 expiry time in order to fill the caches and achieve a high hit rate,
 because Wikimedia's access pattern is very broad, with the long tail
 dominating the request rate.
   
Okay... so to preserve a high cache level, you could then have a single
static file that lists versions of js with a low expiry, and the rest
with a high expiry? Or maybe it's so cheap to serve static files that it
doesn't matter, and just leave everything with a low expiry?

--michael




Re: [Wikitech-l] JS2 design (was Re: Working towards branching MediaWiki 1.16)

2009-09-28 Thread Brion Vibber
On 9/27/09 4:15 AM, Aryeh Gregor wrote:
 On Fri, Sep 25, 2009 at 9:55 PM, Michael Dalemd...@wikimedia.org  wrote:
 ...right... we would want to avoid lots of live hacks. But I think we want
 to avoid lots of live hacks anyway.  A serious javascript bug would only
 affect the pages that were generated in those hours that the bug was
 present, not the 30 days that you're characterizing as the lag time of page
 generation.

 Do you have stats on that?... It's surprising to me that pages are
 re-generated that rarely... How do central notice campaigns work?

 They insert the notice client-side using JavaScript.  The HTML served
 is thus always the same.

Yeah, it's kind of tricky to do right; but if you can keep the loader 
consistent and compatible, and have predictable expirations on the JS, 
such things can work pretty reliably.

-- brion



Re: [Wikitech-l] JS2 design (Read this Not Previous)

2009-09-28 Thread Michael Dale
~ d'oh ~ Disregard previous; a bad keystroke sent it rather than saving to draft.

Tim Starling wrote:
 Michael Dale wrote:
   
 That is part of the idea of centrally hosting reusable client-side 
 components so we control the jquery version and plugin set. So a
 new version won't come along until its been tested and
 integrated.
 

 You can't host every client-side component in the world in a
 subdirectory of the MediaWiki core. Not everyone has commit access to
 it. Nobody can hope to properly test every MediaWiki extension.

 Most extension developers write an extension for a particular site,
 and distribute their code as-is for the benefit of other users. They
 have no interest in integration with the core. If they find some
 jQuery plugin on the web that defines an interface that conflicts with
 MediaWiki, say jQuery.load() but with different parameters, they're
 not going to be impressed when you tell them that to make it work with
 MediaWiki, they need to rewrite the plugin and get it tested and
 integrated.

 Different modules should have separate namespaces. This is a key
 property of large, maintainable systems of code.
   

Right... I agree the client-side code needs to be more modularly deployable.
It's just tricky to manage all those relationships in php, but it appears
it will be necessary to do so...

If designing a given component as a jQuery plug-in, then I think it
makes sense to put it in the jQuery namespace ... otherwise you won't be
able to reference jQuery things locally and in a noConflict-compatible way.
Unless we create a mw wrapper of some sort, but I don't know how
necessary that is atm... I guess it would be slightly cleaner.


 I agree that the present system of parsing top of the javascipt
 file on every script-loader generation request is un-optimized.
 (the idea is those script-loader generations calls happen rarely
 but even still it should be cached at any number of levels. (ie
 checking the filemodifcation timestamp, witting out a php or
 serialized file .. or storing it in any of the other cache levels
 we have available, memcahce, database, etc )
 

 Actually it parses the whole of the JavaScript file, not the top, and
 it does it on every request that invokes WebStart.php, not just on
 mwScriptLoader.php requests. I'm talking about
 jsAutoloadLocalClasses.php if that's not clear.
   
Ah right... previously I had it in php. I wanted to avoid listing it
twice, but obviously that's a pretty costly way to do that.
This will make more sense to put in php if we start splitting up
components into the extension folders and generating the path list
dynamically for a given feature set.

 Have you looked at the profiling? On the Wikimedia app servers,
 even the simplest MW request takes 23ms, and gen=js takes 46ms. A
 static file like wikibits.js takes around 0.5ms. And that's with
 APC. You say MW on small sites is OK, I think it's slow and
 resource-intensive.

 That's not to say I'm sold on the idea of a static file cache, it
  brings its own problems, which I listed.

   
 yea... but almost all script-loader request will be cached.  it
 does not need to check the DB or anything its just a key-file
 lookup (since script-loader request pass a request key either its
 there in cache or its not ...it should be on par with the simplest
 MW request. Which is substantially shorter then around trip time
 for getting each script individually, not to mention gziping which
 can't otherwise be easily enabled for 3rd party installations.
 

 I don't think that that comparison can be made so lightly. For the
 server operator, CPU time is much more expensive than time spent
 waiting for the network. And I'm not proposing that the client fetches
 each script individually, I'm proposing that scripts be concatentated
 and stored in a cache file which is then referenced directly in the HTML.
   

I understand. (It's analogous to making direct links to the /script-cache
folder instead of requesting the files through the script-loader entry
point.)

My main question is how this will work for dynamic groups of scripts set
post page load that are dictated by user interaction or client state.

Do we just ignore this possibility and grab any necessary module components
based on pre-defined module sets in php that get passed down to javascript?

It's not as easy to set up static combined output files to point to when
you don't know what set of scripts you will be requesting...

Hmm... if we had a predictable key format, we could do a request for the
static file. If we get a 404, then we make a dynamic request to
generate the static file?... Subsequent interactions would hit that
static file? That seems ugly though.

 $wgSquidMaxage is set to 31 days (2678400 seconds) for all wikis
 except wikimediafoundation.org. It's necessary to have a very long
 expiry time in order to fill the caches and achieve a high hit rate,
 because Wikimedia's access pattern is very broad, with the long tail
 

Re: [Wikitech-l] [Foundation-l] Announce: Brion moving to StatusNet

2009-09-28 Thread Aude
On Mon, Sep 28, 2009 at 2:32 PM, Brion Vibber br...@wikimedia.org wrote:

 I'd like to share some exciting news with you all... After four awesome
 years working for the Wikimedia Foundation full-time, next month I'm
 going to be starting a new position at StatusNet, leading development on
 the open-source microblogging system which powers identi.ca and other
 sites.


* *Oppose* - It won't necessarily be so easy to find someone to fill your
shoes and manage things as well as you have with improvements in MediaWiki,
as well as site operations. You will be missed. 

Seriously, I'm disappointed to see you go, though wish you the best with
your new position.

-Aude



 -- brion vibber (brion @ wikimedia.org)
 CTO, Wikimedia Foundation
 San Francisco


 ___
 foundation-l mailing list
 foundatio...@lists.wikimedia.org
 Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l



Re: [Wikitech-l] Proposal for editing template calls within pages

2009-09-28 Thread Daniel Friesen
David Gerard wrote:
 2009/9/24 Aryeh Gregor simetrical+wikil...@gmail.com:
   
 On Thu, Sep 24, 2009 at 10:48 AM, dgerard dger...@gmail.com wrote:
 

   
 WYSIWYG editing is
 getting there bit by bit - FCKeditor would be fine on a fresh wiki
 without the unspeakable atrocities inventive geeks have perpetrated
 upon wikitext on en:wp and should continue to get better at dealing
 with more obtusities.
   

   
 Funnily enough, I just talked to a user in #mediawiki asking why
 FCKeditor didn't work right on his wiki when copy-pasting from MS
 Word.
 


 *facepalm* And then there's the obvious things users will do that I
 hadn't thought of, yes.


 - d.
   
I had a user who copied an article from the html of Wikipedia (no edit
button) into Wikia's RTE.

The really disturbing part is that, unlike when people copy Wikipedia's html
into the edit box, where it results in tell-tale ?'s (anime-related
articles; 99.99% of articles copied in my case will be using the nihongo
template) and various other patterns, something copied from html to the RTE
looks almost perfectly fine, up till you hit the edit button and see the
mangled source.

T_T RTE makes it easier for inexperienced people to do stupid things.

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]



-- 
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]



Re: [Wikitech-l] JS2 design (was Re: Working towards branching MediaWiki 1.16)

2009-09-28 Thread Daniel Friesen
Side note: multiple versions of jQuery can live happily on the same page.
jQuery handles isolation and noConflict so well that it can work on the
same page as incompatible versions of itself (which isn't the case for
basically any other js library, 90% of which prototype stuff in).

I like to use a variable like `jQuery13`, basically saving the jQuery
variable in an alternate variable identified by major version number.
Should an upgrade come along, it's a little easier to migrate code.
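A sketch of that aliasing pattern, with plain stub objects standing in for two jQuery builds (the mechanics, not the library, are the point here). With the real library the pinning step is `var jQuery13 = jQuery.noConflict( true );`, which restores the previous global and hands back the current build:

```javascript
// Stubs standing in for two incompatible jQuery builds loaded in sequence.
var jQuery = { fn: { jquery: '1.2.6' } };   // older build already on the page
var previousJQuery = jQuery;                // remember it before the new one loads
var jQuery = { fn: { jquery: '1.3.2' } };   // newer build loads and clobbers the global

// Daniel's pattern: pin the new build under a version-suffixed alias,
// then give the bare global back to the code written against the old build.
var jQuery13 = jQuery;
jQuery = previousJQuery;
```

Code written against the new API then uses `jQuery13` explicitly, so upgrading later means grepping for one name rather than auditing every `$`.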

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]

Tim Starling wrote:
 Michael Dale wrote:
   
 That is part of the idea of centrally hosting reusable client-side 
 components so we control the jquery version and plugin set. So a
 new version won't come along until its been tested and
 integrated.
 

 You can't host every client-side component in the world in a
 subdirectory of the MediaWiki core. Not everyone has commit access to
 it. Nobody can hope to properly test every MediaWiki extension.

 Most extension developers write an extension for a particular site,
 and distribute their code as-is for the benefit of other users. They
 have no interest in integration with the core. If they find some
 jQuery plugin on the web that defines an interface that conflicts with
 MediaWiki, say jQuery.load() but with different parameters, they're
 not going to be impressed when you tell them that to make it work with
 MediaWiki, they need to rewrite the plugin and get it tested and
 integrated.

 Different modules should have separate namespaces. This is a key
 property of large, maintainable systems of code.
 ...
 -- Tim Starling
   


-- 
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]



Re: [Wikitech-l] JS2 design (was Re: Working towards branching MediaWiki 1.16)

2009-09-28 Thread Daniel Friesen
I got another, not from the thread of course. I'd like addOnloadHook to
be replaced by jQuery's ready which does a much better job of handling
load events.
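The difference is easy to model without a browser: the old addOnloadHook queue is drained exactly once at onload, so anything registered afterwards never runs, while ready()-style registration fires late callbacks immediately. A plain-JS sketch of that semantics (names are mine):

```javascript
// Minimal model of ready()-style registration: callbacks registered
// before the event are queued; callbacks registered after it run at once.
function makeLoader(){
	var fired = false, queue = [];
	return {
		// simulate the page's onload firing: drain the queue exactly once
		fireOnload: function(){
			fired = true;
			while ( queue.length ){
				queue.shift()();
			}
		},
		// jQuery-ready-style registration: late registrations still run
		ready: function( fn ){
			if ( fired ){
				fn();
			} else {
				queue.push( fn );
			}
		}
	};
}
```

This is what makes ready() safe for scripts loaded dynamically after the page is up, which is exactly the case a script loader creates.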

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]

Tim Starling wrote:
 Here's what I'm taking out of this thread:

 * Platonides mentions the case of power-users with tens of scripts loaded via
 gadgets or user JS with importScript().
 * Tisza asks that core onload hooks and other functions be overridable by 
 user JS.
 * Trevor and Michael both mention i18n as an important consideration which I
 have not discussed.
 * Michael wants certain components in the js2 directory to be usable as
 standalone client-side libraries, which operate without MediaWiki or any other
 server-side application.

 -- Tim Starling
   




Re: [Wikitech-l] Proposal for editing template calls within pages

2009-09-28 Thread Daniel Friesen
Brion Vibber wrote:
 ...
 Having this infrastructure in place further means we're in a better 
 position to someday make a major markup transition (say to a different 
 markup system or not exposing markup at all in a pure-WYSIWYG 
 environment)... something we're now very far from... but doesn't commit 
 us to any markup changes in the near or medium term.
 ...
 -- brion
   
On that note, what happened to Creole?
I looked at the Creole wiki some time ago, and I remember a good deal of
notes on how MediaWiki could handle Creole implementation.

I haven't heard anything about creole on the wikitech list though.

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]



Re: [Wikitech-l] Proposal for editing template calls within pages

2009-09-28 Thread Daniel Friesen
Like YAML?

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]

Aryeh Gregor wrote:
 On Fri, Sep 25, 2009 at 6:48 PM, Brion Vibber br...@wikimedia.org wrote:
   
 The field metadata can be fairly straightforwardly displayed and edited
 through a nice web interface. XML as such is simply a conveniently
 well-defined structured data tree format which can be used both for
 storing the field metadata in the DB and exposing it to the template
 invocation editor interface (whether client-side JS or server-side PHP
 or a custom bot-based tool speaking to our API).
 

 Depending on what features are added, exactly, I'd imagine a
 lighter-weight markup language would be more than sufficient.
 Something where you can describe the syntax in less than a page and
 write a correct implementation in half an hour or less would probably
 be good enough.  Like maybe just newline-delimited key=value pairs, or
 something like JSON at worst.
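To show how little machinery the cheapest option Aryeh mentions would need, here is a sketch of a parser for newline-delimited key=value pairs (a toy format of my own for illustration, not anything MediaWiki-defined):

```javascript
// Parse newline-delimited key=value pairs into an object.
// Blank lines and lines without '=' are skipped; whitespace is trimmed.
function parsePairs( text ){
	var result = {};
	var lines = text.split( '\n' );
	for ( var i = 0; i < lines.length; i++ ){
		var pos = lines[ i ].indexOf( '=' );
		if ( pos > 0 ){
			result[ lines[ i ].slice( 0, pos ).trim() ] =
				lines[ i ].slice( pos + 1 ).trim();
		}
	}
	return result;
}
```

A correct implementation really is a dozen lines, which is the argument for it over a full XML schema.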




Re: [Wikitech-l] Template editing

2009-09-28 Thread Daniel Friesen
Hmmm... reminds me of
http://www.wikicreole.org/wiki/Placeholder

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]

Magnus Manske wrote:
 ...
 * It will replace all other templates with strings like
 ##TEMPLATEnumber:name##, e.g., ##TEMPLATE1:Infobox VG##
 ...

 Cheers,
 Magnus




Re: [Wikitech-l] [Foundation-l] Announce: Brion moving to StatusNet

2009-09-28 Thread Brion Vibber
On 9/28/09 12:55 PM, Aude wrote:
 On Mon, Sep 28, 2009 at 2:32 PM, Brion Vibberbr...@wikimedia.org  wrote:

 I'd like to share some exciting news with you all... After four awesome
 years working for the Wikimedia Foundation full-time, next month I'm
 going to be starting a new position at StatusNet, leading development on
 the open-source microblogging system which powers identi.ca and other
 sites.


 * *Oppose* - It won't necessarily be so easy to find someone to fill your
 shoes and manage things as well as you have with improvements in MediaWiki,
 as well as site operations. You will be missed. 

:D

 Seriously, I'm disappointed to see you go, though wish you the best with
 your new position.

Thanks! And I'll still be here poking my fingers in wiki stuff, but on a 
more limited scope...

-- brion



Re: [Wikitech-l] Proposal for editing template calls within pages

2009-09-28 Thread Steve Bennett
On Tue, Sep 29, 2009 at 5:55 AM, Daniel Friesen
li...@nadir-seen-fire.com wrote:
 I had a user who copied an article from the html of Wikipedia (no edit
 button) into Wikia's RTE.

Theoretically that use case could be supported, right? If there were
enough id's in the HTML source, then we could map back onto the
underlying wikitext and paste that in instead. Difficult though...

OTOH, it might be easier to detect when this is happening and warn the user.

Steve



Re: [Wikitech-l] JS2 design (was Re: Working towards branching MediaWiki 1.16)

2009-09-28 Thread Michael Dale
We have js2AddOnloadHook, which gives you jQuery in noConflict mode as the $j
variable. The idea behind using a different name is to separate jquery-based
code from the older non-jquery-based code... but if taking a more
iterative approach, we could replace the addOnloadHook function.

--michael

Daniel Friesen wrote:
 I got another, not from the thread of course. I'd like addOnloadHook to
 be replaced by jQuery's ready which does a much better job of handling
 load events.

 ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]

 Tim Starling wrote:
   
 Here's what I'm taking out of this thread:

 * Platonides mentions the case of power-users with tens of scripts loaded via
 gadgets or user JS with importScript().
 * Tisza asks that core onload hooks and other functions be overridable by 
 user JS.
 * Trevor and Michael both mention i18n as an important consideration which I
 have not discussed.
 * Michael wants certain components in the js2 directory to be usable as
 standalone client-side libraries, which operate without MediaWiki or any 
 other
 server-side application.

 -- Tim Starling
   
 






Re: [Wikitech-l] Proposal for editing template calls within pages

2009-09-28 Thread Yaron Koren
That would certainly simplify the format; on the other hand, it would lead
to a lot of redundancy between the different documentation tags, which
could lead to conflicting data structures; so it's probably not a workable
solution.
-Yaron


There could be <documentation lang="en">, <documentation lang="fr">,
<documentation lang="de">...