[Wikitech-l] Request for test wiki admin access

2010-09-18 Thread Cacycle
I would like to get admin access on the test wiki in order to
debug and test the MediaWiki editor wikEd (I am an admin on the
English Wikipedia). Right now I have an issue where I need to test
the code for the next version as a gadget installation and, for
obvious reasons, I do not want to do this on Wikipedia itself.
Thanks, Cacycle




Re: [Wikitech-l] Request for test wiki admin access

2010-09-18 Thread K. Peachey
See here: http://test.wikipedia.org/wiki/Wikipedia:Requests



[Wikitech-l] Request for test wiki admin access

2010-09-18 Thread Cacycle
Thanks a lot; my request has been added here:
http://test.wikipedia.org/wiki/Wikipedia:Requests/Permissions/Cacycle




Re: [Wikitech-l] list of things to do for image dumps

2010-09-18 Thread emijrp
Thanks! : )

2010/9/17 Lars Aronsson <l...@aronsson.se>

 On September 10, emijrp wrote:
  Hi Lars, are you going to upload more logs to Internet Archive?

 No, I can't. I have not downloaded more recent logs. I only uploaded
 what was on my disk, because I needed to free some space.

  Domas'
  website only shows the last 3 (?) months. I think that there are many of
  these files on the Toolserver, but we must preserve this raw data in
  another place that is secure (for posterity).

 Must? Says who? That sounds like a naive opinion. If you have an
 interest, you can do the job. Otherwise they will get lost. In the
 future, maybe this should be a task for the paid staff, but so far
 it has not been.


 --
Lars Aronsson (l...@aronsson.se)
   Aronsson Datateknik - http://aronsson.se





Re: [Wikitech-l] using parserTests code for selenium test framework

2010-09-18 Thread Platonides
Dan Nessett wrote:
 On Sat, 18 Sep 2010 00:53:04 +0200, Platonides wrote:
 

 + Switch the wiki to use this db and return a cookie or some other
 state information that identifies this test run configuration.

 I think you mean for remote petitions, not just for internal queries.
 Where do you expect to store that data?
 
 Not sure what you mean by remote petitions.

What you are calling "going through the web portal", as opposed to
parser tests and most phpunit tests, which are done in one run.

 Selenium requests always come through the web portal from the selenium 
 server. So, no internal queries are involved.

Note that, although Selenium tests are the easier case, other types of
tests should also be confined.


 Where to store the data is an open question, one that requires 
 consultation with others. However, here are some thoughts:
 
 + The data must be persistent. If the wiki crashes for some reason, there 
 may be cloned dbs and test-specific copies of images and images/math 
 hanging around. (Depending on how we handle the cache information, there 
 may also be fossil cache data.) This requires cleanup after a wiki crash.
 
 + It would be possible to store the data in a file or in a master db 
 table. Which is best (or if something else is better) is a subject for 
 discussion.

What about memcached?
(that would be a key based on the original db name)
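
For concreteness, a minimal sketch of that idea using MediaWiki's BagOStuff
cache layer; the key layout, value fields, and expiry below are illustrative
assumptions, not anything agreed in this thread:

<?php
// Sketch only: stash the test-run configuration in memcached via the
// wiki's cache wrapper, keyed on the original database name as suggested
// above. All concrete names and values here are hypothetical.
$cache = wfGetCache( CACHE_MEMCACHED );

$originalDb = 'wikidb';                          // the wiki's normal db name
$testRunConfig = array(
    'clonedDb'  => 'wikidb_selenium_run42',      // the cloned test db
    'imagesDir' => '/tmp/selenium-images-run42', // test-specific images copy
);

// Key based on the original db name, so each wiki gets its own entry.
$key = "selenium-test-run:$originalDb";
$cache->set( $key, $testRunConfig, 3600 );       // expire after one hour

// A later request carrying the test-run cookie can look the config up again.
$config = $cache->get( $key );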




Re: [Wikitech-l] Accidental 302 hijack by interwiki index: Google v Wikimedia bug

2010-09-18 Thread MZMcBride
Aryeh Gregor wrote:
 Right now third-party software can do stuff like
 <a href="http://en.wikipedia.org/wiki/$1"> and replace $1 by user input,
 and it will work basically like [[$1]] typed on Wikipedia, and that's
 good.

http://en.wikipedia.org/wiki/mw:Not_really...

I still don't understand why users (third-party or not) are forced to use
links http://en.wikipedia.org/wiki/Special:Search?search=mw:like_this in
order to redirect properly.
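
(For readers outside the thread: a minimal PHP sketch of the linking
convention Aryeh describes; the sample title and the encoding choices are
illustrative assumptions, not part of the original mail.)

<?php
// Sketch only: how a third-party site might build the kind of link
// described above, substituting user input for $1. The sample title is
// hypothetical and deliberately echoes the interwiki-style example.
$userInput = 'mw:Not really...';

// Spaces become underscores; percent-encode the rest, but keep ':' readable.
$title = str_replace( ' ', '_', $userInput );
$title = str_replace( '%3A', ':', rawurlencode( $title ) );

echo '<a href="http://en.wikipedia.org/wiki/' . $title . '">'
    . htmlspecialchars( $userInput ) . '</a>';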

MZMcBride





Re: [Wikitech-l] cite/ref error

2010-09-18 Thread MZMcBride
Brent Palmer wrote:
 We are creating an off-line version of Wikipedia, but we continue to
 have problems getting some citation items to render correctly.

If you're trying to emulate Wikipedia's configuration, I hope you know about
Wikimedia's NOC.[1]

 When using the cite template to cite a paper or journal, and the URL
 parameter is missing, you get an error in the citation list at the bottom:
 "Missing operand for ".
 
 We haven't been able to find exactly where the error occurs since there
 are several layers of templates. Does anybody know if this was just a
 brief template problem or if there is a way to track it down? We are
 using the 6/22/2010 dump. I did look at the change history of several
 templates, but I haven't found any that were fixed between that dump date
 and the current online version.

Error messages like this are pretty easy to grep for, either in the PHP
source code or in Special:AllMessages.[2] In this case, it's displaying the
pfunc_expr_missing_operand MediaWiki message,[3] which means it's likely
that there's something wrong with your ParserFunctions extension
installation. That's at least a start in the right direction. Make sure
you're using a compatible version of the extension with your version of
MediaWiki. MediaWiki's ExtensionDistributor tool[4] can help with this.
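
For anyone tracing a message like this, here is a minimal sketch of looking
the key up through the allmessages API mentioned in [2]; the script and its
output handling assume a standard api.php endpoint and are illustrative, not
part of the original reply:

<?php
// Sketch only: fetch the pfunc_expr_missing_operand message through the
// allmessages API to confirm which MediaWiki message is being shown.
$url = 'http://en.wikipedia.org/w/api.php?action=query&meta=allmessages'
    . '&ammessages=pfunc_expr_missing_operand&format=json';

$data = json_decode( file_get_contents( $url ), true );

foreach ( $data['query']['allmessages'] as $msg ) {
    // The message text is returned under the '*' key in this API format,
    // e.g. something like "Missing operand for $1".
    echo $msg['name'] . ': ' . $msg['*'] . "\n";
}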

wikitech-l isn't really an appropriate venue for these types of questions.
You should try #mediawiki on irc.freenode.net or mediawiki-l.[5][6]

Hope that helps.

MZMcBride

[1] http://noc.wikimedia.org/
[2] http://en.wikipedia.org/w/api.php?action=query&meta=allmessages
[3] http://en.wikipedia.org/wiki/MediaWiki:Pfunc_expr_missing_operand
[4] http://www.mediawiki.org/wiki/Special:ExtensionDistributor
[5] http://www.mediawiki.org/wiki/MediaWiki_on_IRC
[6] http://www.mediawiki.org/wiki/Mailing_lists





Re: [Wikitech-l] using parserTests code for selenium test framework

2010-09-18 Thread Dan Nessett
On Sun, 19 Sep 2010 00:28:42 +0200, Platonides wrote:

 Where to store the data is an open question, one that requires
 consultation with others. However, here are some thoughts:
 
 + The data must be persistent. If the wiki crashes for some reason,
 there may be cloned dbs and test-specific copies of images and
 images/math hanging around. (Depending on how we handle the cache
 information, there may also be fossil cache data.) This requires
 cleanup after a wiki crash.
 
 + It would be possible to store the data in a file or in a master db
 table. Which is best (or if something else is better) is a subject for
 discussion.
 
 What about memcached?
 (that would be a key based on the original db name)

The storage has to be persistent to accommodate wiki crashes (e.g., httpd
crash, server OS crash, power outage). It might be possible to use
memcachedb, but as far as I am aware that requires installing Berkeley
DB, which would complicate deployment.

Why not employ the already installed DB software used by the wiki? That 
provides persistent storage and requires no additional software.


-- 
-- Dan Nessett




Re: [Wikitech-l] using parserTests code for selenium test framework

2010-09-18 Thread Platonides
Dan Nessett wrote:
 What about memcached?
 (that would be a key based on the original db name)
 
 The storage has to be persistent to accommodate wiki crashes (e.g., httpd
 crash, server OS crash, power outage). It might be possible to use
 memcachedb, but as far as I am aware that requires installing Berkeley
 DB, which would complicate deployment.
 
 Why not employ the already installed DB software used by the wiki? That 
 provides persistent storage and requires no additional software.

My original idea was to use whatever ObjectCache the wiki uses, but it
could be forced to use the db as a backend (that's the objectcache table).
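
A minimal sketch of that variant, reusing the same hypothetical key and value
as the earlier memcached example; CACHE_DB selects SqlBagOStuff, which keeps
the data in the objectcache table and so survives a memcached restart:

<?php
// Sketch only: force the database backend for the test-run state so it is
// persistent. Key and value are the same hypothetical ones as before.
$store = wfGetCache( CACHE_DB );   // SqlBagOStuff over the objectcache table

$key = 'selenium-test-run:wikidb';
$store->set( $key, array( 'clonedDb' => 'wikidb_selenium_run42' ), 0 );  // 0 = no expiry

$config = $store->get( $key );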




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l