Re: [Wikitech-l] getThumbVirtualUrl() behavior

2009-08-06 Thread Gerard Meijssen
Hoi,
When files can be stored elsewhere based on criteria, you may want to consider
saving .tiff files separately as well. They are always big, and they serve
mainly as an archival record of the work done in restorations. As it is, they
are not usable because they do not get rendered.
Thanks.
 Gerard

2009/8/5 Brion Vibber br...@wikimedia.org

 On 8/4/09 11:22 AM, dan nessett wrote:
  1) testGetThumbVirtualUrl(LocalFileTest) Failed asserting that two
  strings are equal. Expected string <mwrepo://test/public/thumb/Test%21>,
  got string <mwrepo://test/thumb/Test%21>.
 [snip]
  Obviously, the function is coded to return something different than
  the test expects. The problem is, I don't know if this is a bug in
  the test or in the code.

 Looks like an old test that needs to be updated. IIRC Tim recently
 changed the system to use a separate virtual path for thumbs so we can
 easily configure the system to store thumbs in a different path (and
 hence different file server) from the primary media files.

 -- brion
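
(For illustration only: the separate thumb path is configured per file repo. A
minimal LocalSettings.php sketch, assuming the FSRepo 'thumbDir'/'thumbUrl'
options of this era; the paths and hostname below are made up:)

$wgLocalFileRepo = array(
    'class'      => 'LocalRepo',
    'name'       => 'local',
    'directory'  => "$IP/images",
    'url'        => "$wgScriptPath/images",
    'hashLevels' => 2,
    // Keep thumbnails under a separate path (and hence, if desired, on a
    // different file server) from the primary media files:
    'thumbDir'   => '/mnt/thumbs',
    'thumbUrl'   => 'http://thumbs.example.org/thumb',
);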



Re: [Wikitech-l] ParseTree generated by the API and tag extension

2009-08-06 Thread Magnus Manske
On Wed, Aug 5, 2009 at 9:20 AM, Alex Bernier alex.bern...@free.fr wrote:
 Hello,

 I have what I think is strange behaviour with the parser called by the API
 and tag extensions.

snip/

 A little bit of debugging shows that in includes/parser/Parser.php, the
 extensionSubstitution function is only called once when I access my
 page via the API and three times when I access it via the browser. Is the
 text parsed differently in the two cases? Is something wrong in my API
 call?
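
(For reference, a tag extension of the kind being discussed is registered
roughly as follows; the hook and setHook() call are standard MediaWiki, while
the tag and function names are illustrative rather than Alex's actual code:)

$wgHooks['ParserFirstCallInit'][] = 'wfSampleTagInit';

function wfSampleTagInit( $parser ) {
    // Register <sample>...</sample>; when a page is parsed, each occurrence
    // is handled via Parser::extensionSubstitution(), which invokes the
    // callback below.
    $parser->setHook( 'sample', 'wfSampleTagRender' );
    return true;
}

function wfSampleTagRender( $input, $args, $parser ) {
    return htmlspecialchars( $input );
}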

If you want XML parsing of wiki text, the closest you can come to that
is, AFAIK, still my wiki2xml:
http://toolserver.org/~magnus/wiki2xml/w2x.php

Or as an extension:
http://www.mediawiki.org/wiki/Extension:Wiki2xml

It kinda works for most simple wikitext. Feel free to fix/improve.

Cheers,
Magnus



[Wikitech-l] site requests for Hungarian Wikipedia

2009-08-06 Thread Tisza Gergő
Since I have run out of ideas of where to nag devs with shell access
(bugzilla is completely ignored most of the time, and mail/irc didn't
have much effect either), I'll try here.
The following are all one-line configuration changes:

Bug 14716 – Grant noratelimit right to the editor group in the
Hungarian Wikipedia (open for over a year)
Bug 19109 – Enable AbuseFilter in Hungarian Wikipedia (open for two months)
Bug 19315 – Set $wgCategoryPrefixedDefaultSortkey=false on Hungarian
Wikipedia (open for one and a half months)
Bug 19885 – Restore autoreview for confirmed usergroup on huwiki
(severity:major, open for 10 days)
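
(For illustration, on a stock MediaWiki install the first and third of these
amount to single LocalSettings.php lines like the following; the WMF cluster
applies such settings through its own configuration files, so the exact form
there differs:)

$wgGroupPermissions['editor']['noratelimit'] = true;   // bug 14716
$wgCategoryPrefixedDefaultSortkey = false;             // bug 19315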

It would be really nice if someone could finally take a look at these,
especially the last one, which completely sabotages the use of FlaggedRevs
on huwiki (and which resulted from careless sysadmin action in the
first place).

More generally, I think the procedure for responding to site requests
(if there is a procedure at all) needs fixing. My impression is that
some requests are resolved quickly, while the rest fall outside the range of
whatever method shell people use to monitor new requests and are
never picked up again (much like recent changes patrolling on the
wiki). Some sort of backlog of open site requests might help the
situation. (Again, these are one-liners that need neither much thought
nor much effort, nor are they terribly frequent - there were 3 site
requests altogether this year for huwiki, which is a top-20 Wikipedia
- so I'm sure it's an attention problem, not a resource problem.)

convenience links:
https://bugzilla.wikimedia.org/show_bug.cgi?id=14716
https://bugzilla.wikimedia.org/show_bug.cgi?id=19109
https://bugzilla.wikimedia.org/show_bug.cgi?id=19315
https://bugzilla.wikimedia.org/show_bug.cgi?id=19885



Re: [Wikitech-l] getThumbVirtualUrl() behavior

2009-08-06 Thread Brion Vibber
On 8/6/09 3:02 AM, Gerard Meijssen wrote:
 When files can be stored elsewhere based on criteria, you may want to consider
 saving .tiff files separately as well. They are always big, and they serve
 mainly as an archival record of the work done in restorations. As it is,
 they are not usable because they do not get rendered.

I'm pretty sure you asked for the ability to upload them specifically, 
even as a first step without inline rendering. :)

-- brion



[Wikitech-l] PHPUnit tests now fixed

2009-08-06 Thread dan nessett
I have fixed 5 bugs in /tests/ and added one feature to run-tests.php (a
--runall option so testers can run the PHPUnit tests without using make,
although 'make test' still works). With these changes, all of the tests in
/tests/ now pass. A unified diff patch is attached to bug ticket 20077.
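
(Presumably the new option is then invoked along these lines from the tests
directory; the exact invocation may differ:)

  php run-tests.php --runall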

I had to make an architectural decision when enhancing run-tests.php with the
--runall option. The bug ticket describes this decision and suggests two other
ways to achieve the same objective. I chose the approach implemented by the
patch because it required no changes to the directory structure of /tests/.
However, I actually prefer the second possibility. So, if senior developers
could look at the bug ticket description and give me some feedback (especially
if they also think the second option is better), that would be great.

I would also appreciate some feedback on the following question. One of the
tests referenced the global variables $wgDBadminname and $wgDBadminuser. When I
ran the configuration script during MediaWiki installation on my machine, the
LocalSettings.php file it created defined the globals $wgDBname and $wgDBuser. So,
I changed the test to use these variables rather than the 'admin' versions.
However, I don't remember whether the script gave me a choice to use the 'admin'
versions or not. Also, if the configuration script has changed, then some
installations may use the 'admin' versions and some may not. In that case, I
would have to modify the bug fix to accept both types of global variable. If
someone would fill me in, I can make any required changes to the bug fix.
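
(A minimal sketch of the fallback described above, using only the globals
named in this message; the actual patch attached to bug 20077 may of course
handle this differently:)

$dbName = isset( $wgDBadminname ) ? $wgDBadminname : $wgDBname;
$dbUser = isset( $wgDBadminuser ) ? $wgDBadminuser : $wgDBuser;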


  



[Wikitech-l] Site Requests for hu.wiki

2009-08-06 Thread Rob Halsell
I had to kick myself onto this list; somehow I was not on it.

So I cannot reply directly to the site request email, thus this new thread.

I will look at the listed bugs and either get to them today or tomorrow
if I can.

I do understand the frustration inherent in Bugzilla tickets
getting older and not being completed.  Rest assured that it is not on
purpose; we just have a limited number of folks who can do this sort
of thing. Primarily I take care of the normal site requests, but they
are far from my only responsibility.

I do apologize that some of them have waited this long; I will do my
best to get to them.  It was mentioned that this must be an attention
issue, not a resource issue.  Unfortunately, I feel it is both.  We are
working on improving the process for site requests, but it is not all
procedure; it is a matter of resources as well.

However, I will get to these, and other bugs, as quickly as I can.  I am
sorry if the delay hurts folks; it is not intentional!

Best regards,

-- 
Rob Halsell
IT Manager and Systems Administrator
Wikimedia Foundation, Inc.
E-Mail: rhals...@wikimedia.org
Office: 415.839.6885 x620
Fax: 415.882.0495



Re: [Wikitech-l] PHPUnit tests now fixed

2009-08-06 Thread Happy-melon
dan nessett dness...@yahoo.com wrote in message 
news:630381.19130...@web32503.mail.mud.yahoo.com...
 I would also appreciate some feedback on the following question. One of
 the tests referenced the global variables $wgDBadminname and
 $wgDBadminuser. When I ran the configuration script during MediaWiki
 installation on my machine, the LocalSettings.php file it created defined the
 globals $wgDBname and $wgDBuser. So, I changed the test to use these
 variables rather than the 'admin' versions. However, I don't remember whether
 the script gave me a choice to use the 'admin' versions or not. Also, if
 the configuration script has changed, then some installations may use the
 'admin' versions and some may not. In that case, I would have to modify
 the bug fix to accept both types of global variable. If someone would fill
 me in, I can make any required changes to the bug fix.

I think these are there to allow you to define a separate MySQL user (in
AdminSettings.php) that can be given higher privileges, as a security
measure.  IIRC this is now deprecated (cf. bug 18768).

--HM 





Re: [Wikitech-l] PHPUnit tests now fixed

2009-08-06 Thread Chad
On Thu, Aug 6, 2009 at 12:56 PM, Happy-melon happy-me...@live.com wrote:
 dan nessett dness...@yahoo.com wrote in message
 news:630381.19130...@web32503.mail.mud.yahoo.com...
 [snip]

 I think these are there to allow you to define a separate MySQL user (in
 AdminSettings.php) that can be given higher privileges, as a security
 measure.  IIRC this is now deprecated (cf. bug 18768).

 --HM





Yes, it's been deprecated. AdminSettings.php will still be loaded in maintenance
environments (commandLine.inc and Maintenance.php) if it still exists,
but it is no longer _required_ for anything. These variables can safely
be set in LocalSettings.php.

HM is right on what these users are for. Some (not all) maintenance
scripts require higher permissions than your normal $wgDBuser, so
$wgDBadminuser is supposed to have those privileges.

In practice, I've found that the vast majority of maintenance scripts
don't actually need that level of access. I'm trying to clean that
up over time, so we're not using a user with higher permissions when
it's not needed (see the Maintenance constants and the getDbType()
function).

-Chad
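
(A rough sketch of that mechanism, assuming the Maintenance class constants of
this era; the class name is made up and the usual maintenance-runner
boilerplate is omitted:)

class RebuildSomethingIndex extends Maintenance {
    public function getDbType() {
        // Ask for the elevated $wgDBadminuser credentials instead of the
        // ordinary $wgDBuser; most scripts return DB_STD or DB_NONE.
        return Maintenance::DB_ADMIN;
    }

    public function execute() {
        $dbw = wfGetDB( DB_MASTER );
        // ... schema-touching work (e.g. dropping and re-adding an index) ...
    }
}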


Re: [Wikitech-l] site requests for Hungarian Wikipedia

2009-08-06 Thread Aryeh Gregor
On Thu, Aug 6, 2009 at 9:39 AM, Tisza Gergő gti...@gmail.com wrote:
 More generally, I think the procedure for responding to site requests
 (if there is a procedure at all) needs fixing. My impression is that
 some requests are resolved quickly, while the rest fall outside the range of
 whatever method shell people use to monitor new requests and are
 never picked up again (much like recent changes patrolling on the
 wiki). Some sort of backlog of open site requests might help the
 situation. (Again, these are one-liners that need neither much thought
 nor much effort, nor are they terribly frequent - there were 3 site
 requests altogether this year for huwiki, which is a top-20 Wikipedia
 - so I'm sure it's an attention problem, not a resource problem.)

Isn't Rob supposed to be doing these?  This should be the full list:

https://bugzilla.wikimedia.org/buglist.cgi?keywords=shell&bug_status=NEW&bug_status=ASSIGNED&bug_status=REOPENED

There are a lot of things on there that aren't simple fixes, though.
Maybe the one-line changes are lost in all the clutter.  Things like
"Enable .odt upload" or "Install PdfHandler extension" probably need
consideration and testing, and things like "Configuration files should
have versioning system" or "Support OpenID extension on all wikimedia
projects" would require nontrivial development effort.

Perhaps we should make a new component, "Configuration requests",
mandate that it only be used for simple one-line changes, and ensure
that there's a good default assignee who will clear the backlog every
workday?  Either by making the change, or by marking LATER/WONTFIX if
it's against policy or doesn't have demonstrated consensus, or by
changing the component if it's not a simple fix.  Something like that.
It used to be that we didn't really have the manpower to keep on top of
shell bugs, but that should no longer be true.


Re: [Wikitech-l] PHPUnit tests now fixed

2009-08-06 Thread Aryeh Gregor
On Thu, Aug 6, 2009 at 1:05 PM, Chad innocentkil...@gmail.com wrote:
 HM is right on what these users are for. Some (not all) maintenance
 scripts require higher permissions than your normal $wgDBuser, so
 $wgDBadminuser is supposed to have those privileges.

$wgDBuser needs to have DELETE rights on pretty much all tables, so
what's the security gain of bothering with a different user for ALTER
TABLE/CREATE TABLE/etc.?  $wgDBadminuser doesn't need to be able to
create new databases or reconfigure replication or anything, right?



Re: [Wikitech-l] site requests for Hungarian Wikipedia

2009-08-06 Thread Gerard Meijssen
Hoi,
Does the creation of new projects qualify as a configuration request?
Thanks,
  Gerard

PS: at the moment 5 are in front of the Board of Trustees. It would be cool
to have them go live by Wikimania.


2009/8/6 Aryeh Gregor simetrical+wikil...@gmail.com


 On Thu, Aug 6, 2009 at 9:39 AM, Tisza Gergő gti...@gmail.com wrote:
  [snip]

 Isn't Rob supposed to be doing these?  This should be the full list:


 https://bugzilla.wikimedia.org/buglist.cgi?keywords=shell&bug_status=NEW&bug_status=ASSIGNED&bug_status=REOPENED

 There are a lot of things on there that aren't simple fixes, though.
 Maybe the one-line changes are lost in all the clutter.  Things like
 Enable .odt upload or Install PdfHandler extension probably need
 consideration and testing, and things like Configuration files should
 have versioning system or Support OpenID extension on all wikimedia
 projects would require nontrivial development effort.

 Perhaps we should make a new component Configuration requests,
 mandate that it only be used for simple one-line changes, and ensure
 that there's a good default assignee who will clear the backlog every
 workday?  Either by making the change, or marking LATER/WONTFIX if
 it's against policy or doesn't have demonstrated consensus, or
 changing the component if it's not a simple fix.  Something like that.
  It used to be we didn't really have the manpower to keep on top of
 shell bugs, but that should no longer be true.


Re: [Wikitech-l] PHPUnit tests now fixed

2009-08-06 Thread Chad
On Thu, Aug 6, 2009 at 1:20 PM, Aryeh
Gregor simetrical+wikil...@gmail.com wrote:
 On Thu, Aug 6, 2009 at 1:05 PM, Chadinnocentkil...@gmail.com wrote:
 HM is right on what these users are for. Some (not all) maintenance
 scripts require higher permissions than your normal $wgDBuser, so
 $wgDBadminuser is supposed to have those privileges.

 $wgDBuser needs to have DELETE rights on pretty much all tables, so
 what's the security gain of bothering with a different user for ALTER
 TABLE/CREATE TABLE/etc.?  $wgDBadminuser doesn't need to be able to
 create new databases or reconfigure replication or anything, right?



Depends on which maintenance script you're talking about. Update.php
certainly does, as does renameDbPrefix (just to grab one off the top of
my head). The vast majority of scripts can function just fine with normal
DB access. Some (mcc and digit2html, to name a few) don't need any
DB access at all.

-Chad


Re: [Wikitech-l] getThumbVirtualUrl() behavior

2009-08-06 Thread Gerard Meijssen
Hoi,
We are extremely grateful that we CAN upload them. No question there. I can
imagine that you have all kinds of storage; my suggestion is to keep them
where they are not part of the most active files. Again, being able to
upload is extremely valuable. When I talk to a museum or archive, I always
inform them that we can and do save our .tiff files, and this has been an
important factor in establishing our bona fides on several occasions. :)
Thanks,
   GerardM

2009/8/6 Brion Vibber br...@wikimedia.org

 On 8/6/09 3:02 AM, Gerard Meijssen wrote:
  When files can be stored elsewhere based on criteria, you may want to
  consider saving .tiff files separately as well. They are always big, and
  they serve mainly as an archival record of the work done in restorations.
  As it is, they are not usable because they do not get rendered.

 I'm pretty sure you asked for the ability to upload them specifically,
 even as a first step without inline rendering. :)

 -- brion



Re: [Wikitech-l] site requests for Hungarian Wikipedia

2009-08-06 Thread Rob Halsell
The creation of new projects does indeed fall under site requests.  Just make
sure to link to consensus that the project is needed/required/approved.



Gerard Meijssen wrote:
 Hoi,
 Does the creation of new projects qualify as a configuration request?
 Thanks,
   Gerard
 
 PS: at the moment 5 are in front of the Board of Trustees. It would be cool
 to have them go live by Wikimania.
 
 
 [snip]


-- 
Rob Halsell
IT Manager and Systems Administrator
Wikimedia Foundation, Inc.
E-Mail: rhals...@wikimedia.org
Office: 415.839.6885 x620
Fax: 415.882.0495


Re: [Wikitech-l] site requests for Hungarian Wikipedia

2009-08-06 Thread Casey Brown
On Thu, Aug 6, 2009 at 1:48 PM, Rob Halsell rhals...@wikimedia.org wrote:
 The creation of new projects does indeed fall under site requests.


...but there's also a specific tracking bug to make everyone's lives
easier. :-) https://bugzilla.wikimedia.org/show_bug.cgi?id=16976

-- 
Casey Brown
Cbrown1023



Re: [Wikitech-l] site requests for Hungarian Wikipedia

2009-08-06 Thread Chad
On Thu, Aug 6, 2009 at 2:12 PM, Ilmari Karonen nos...@vyznev.net wrote:
 Aryeh Gregor wrote:

 Perhaps we should make a new component Configuration requests,
 mandate that it only be used for simple one-line changes, and ensure
 that there's a good default assignee who will clear the backlog every
 workday?  Either by making the change, or marking LATER/WONTFIX if

 Or perhaps just a new keyword for these one-line config changes?  I'm
 thinking either "oneliner" or "trivial" -- the latter may be usable for
 two-line changes as well ;)

 Obviously, anyone tagging nontrivial requests with these keywords should
 have a suitable clue-imparting implement applied to them.

 (Or we could use the existing "easy" keyword, if we're not afraid of
 confusing the noob developers with occasional configuration requests
 in the easy bug list.  I just checked, and there currently seem to be
 two open bugs tagged as both easy and shell: 19310 and 19437.)

 --
 Ilmari Karonen



Typically, "easy" is reserved for newbie dev stuff. If the shell folks (i.e. Rob)
don't mind, I'm sure it would be pretty easy to use it for both. Then Rob (or
whoever) can have a simple query set up for 'shell, easy' for those one-line
config changes that can easily (through nobody's individual fault) slip through
the cracks.

-Chad


Re: [Wikitech-l] site requests for Hungarian Wikipedia

2009-08-06 Thread Brion Vibber
On 8/6/09 10:08 AM, Aryeh Gregor wrote:
 Perhaps we should make a new component Configuration requests,
 mandate that it only be used for simple one-line changes, and ensure
 that there's a good default assignee who will clear the backlog every
 workday?

For reference, here's the rough current workflow:

* items get filed in Bugzilla under the site requests category
* generally a couple folks will peek at it, add the 'shell' keyword or 
recategorize as appropriate, and add some clarifying comments helping to 
flesh out the request w/ implementation details or making sure there's 
consensus
* once or twice a week I try to do a quick pass through all the 
still-open reqs; the ones that are ready to go I'll stick in Rob's 
queue. Ones that aren't, I may put additional comments on them to ask 
for clarification.
* Rob goes through his assigned queue intermittently between other things

If there are questions about how something works, or requests for more detail
on consensus, those responses have usually already been added to the bug by
the time we reach it.

Some reqs have slipped through the cracks, unfortunately, especially 
older ones which haven't come up on scans of recent bugs. :(

-- brion



Re: [Wikitech-l] site requests for Hungarian Wikipedia

2009-08-06 Thread Rob Halsell
All outstanding requests in the original email for this thread have been
addressed and processed to completion.

Now, I am nearly always in IRC, even when I am not at the keyboard.
While it is not a permanent solution, everyone should feel free to PM me
with any requests that seem to have been outstanding and ready for
processing a bit too long.  I may not be able to get to them right that
second, but I am trying my best!  =]

-- 
Rob Halsell
IT Manager and Systems Administrator
Wikimedia Foundation, Inc.
E-Mail: rhals...@wikimedia.org
Office: 415.839.6885 x620
Fax: 415.882.0495



Re: [Wikitech-l] PHPUnit tests now fixed

2009-08-06 Thread Brion Vibber
On 8/6/09 10:30 AM, Chad wrote:
 Depends on which maintenance script you're talking about. Update.php
 certainly does, as does renameDbPrefix (just to grab one off the top of
 my head). The vast majority of scripts can function just fine with normal
 DB access. Some (mcc and digit2html, to name a few) don't need any
 DB access at all.

Generally, only updaters need create/alter/etc. privileges; a few scripts that
do mass rebuilds of indexes will also want to do things like dropping
and re-adding indexes -- for example, rebuildTextIndex drops the fulltext
index on the searchindex table, then re-adds it after refilling the
table's contents.

And a few extreme maintenance ops like MySQL replication master 
switches of course will need all kinds of fun privs. ;)

-- brion
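
(A rough sketch of the drop/refill/re-add pattern described above -- not the
actual rebuildTextIndex code; the index names si_title and si_text follow
MediaWiki's MySQL searchindex schema:)

$dbw = wfGetDB( DB_MASTER );
$searchindex = $dbw->tableName( 'searchindex' );

// Drop the fulltext indexes so the bulk refill isn't slowed by index updates.
$dbw->query( "ALTER TABLE $searchindex DROP INDEX si_title, DROP INDEX si_text" );

// ... repopulate the searchindex table from the page/revision text here ...

// Re-create the fulltext indexes once the table has been refilled.
$dbw->query( "ALTER TABLE $searchindex
    ADD FULLTEXT si_title (si_title), ADD FULLTEXT si_text (si_text)" );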



Re: [Wikitech-l] getThumbVirtualUrl() behavior

2009-08-06 Thread Brion Vibber
On 8/6/09 10:31 AM, Gerard Meijssen wrote:
 Hoi,
 We are extremely grateful that we CAN upload them. No question there. I can
 imagine that you have all kinds of storage; my suggestion is to keep them
 where they are not part of the most active files. Again, being able to
 upload is extremely valuable. When I talk to a museum or archive, I always
 inform them that we can and do save our .tiff files, and this has been an
 important factor in establishing our bona fides on several occasions. :)

;)

-- brion



Re: [Wikitech-l] PHPUnit tests now fixed

2009-08-06 Thread Chad
On Thu, Aug 6, 2009 at 3:04 PM, Brion Vibber br...@wikimedia.org wrote:
 On 8/6/09 10:30 AM, Chad wrote:
 Depends on which maintenance script you're talking about. Update.php
 certainly does, as does renameDbPrefix (just to grab one off the top of
 my head). The vast majority of scripts can function just fine with normal
 DB access. Some (mcc and digit2html, to name a few) don't need any
 DB access at all.

 Generally, only updaters need create/alter/etc. privileges; a few scripts that
 do mass rebuilds of indexes will also want to do things like dropping
 and re-adding indexes -- for example rebuildTextIndex drops the fulltext
 index on the searchindex table, then re-adds it after refilling the
 table's contents.

 And a few extreme maintenance ops like MySQL replication master
 switches of course will need all kinds of fun privs. ;)

 -- brion



Right, which is what my idea behind getDbType() was (it still needs
actual implementation; it's more an idea than practice at the moment).
If we don't need root DB access, we shouldn't be using it! If we don't need
DB access at all, don't bother connecting.

-Chad



Re: [Wikitech-l] PHPUnit tests now fixed

2009-08-06 Thread Brion Vibber
On 8/6/09 12:10 PM, Chad wrote:
 Right, which is what my idea behind getDbType() was (which still needs
 actual implementation, it's more an idea than practice at the moment).
 If we don't need root DB access, we shouldn't be using it! If we don't need
 DB access at all, don't bother connecting.

nom nom nom...

Might be good to be able to request increased privileges on the command line
as well, if we can't do it with the default user.

-- brion



Re: [Wikitech-l] How do I get $wgLocalisationCacheConf in scope for getLocalisationCache()?

2009-08-06 Thread dan nessett
Never mind. I found it. It is in DefaultSettings.php
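
(For anyone hitting the same thing: the default lives in
includes/DefaultSettings.php, so a standalone test just needs that file loaded
and the global pulled into scope before a Language object is constructed. A
minimal illustration, assuming the usual $IP bootstrap; the real test harness
may differ:)

require_once( "$IP/includes/DefaultSettings.php" ); // defines $wgLocalisationCacheConf
global $wgLocalisationCacheConf;                    // needed if this runs inside a function
$lang = Language::factory( 'en' );                  // getLocalisationCache() can now read it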

--- On Thu, 8/6/09, dan nessett dness...@yahoo.com wrote:

 From: dan nessett dness...@yahoo.com
 Subject: [Wikitech-l] How do I get $wgLocalisationCacheConf in scope for 
 getLocalisationCache()?
 To: wikitech-l@lists.wikimedia.org
 Date: Thursday, August 6, 2009, 2:46 PM
 I am working on the tests in /t/.
 One, Revision.t, attempts to create a Language object and
 croaks in getLocalisationCache(), which is called by the
 Language class constructor. The problem is
 $wgLocalisationCacheConf is undefined, but referenced. When
 I Googled $wgLocalisationCacheConf I got the page:
 
 http://www.mediawiki.org/wiki/Manual:$wgLocalisationCacheConf
 
 It states that this global was introduced in 1.16.0. It
 isn't in my LocalSettings.php file (understandable, since I
 ran the install script on a version of MW prior to r52503,
 in which the variable was introduced). I'm not sure how the
 localization cache works, so would someone let me know what
 I have to do to get this variable in the scope of
 getLocalisationCache()?
 
 Thanks.
 
 
       
 
 


  



[Wikitech-l] Video Quality for Derivatives (was Re:w...@home Extension)

2009-08-06 Thread Michael Dale
So I committed ~basic~ derivative code support for oggHandler in r54550
(more solid support is on the way).

Based on input from the w...@home thread, here are the updated target
qualities, expressed via the Firefogg API to ffmpeg2theora.

Also j^ was kind enough to run these settings on some sample input files:
http://firefogg.org/j/encoding_samples/ so you can check them out there.

We want to target 400 wide for the web stream to be consistent with
archive.org, which encodes mostly to 400x300 (although their 16:9 stuff
can be up to 530 wide).

Updated the MediaWiki Firefogg integration and the stand-alone encoder app
with these default transcode settings in r54552 and r54554 (should be pushed
out to http://firefogg.org/make shortly, or can be run @home with a
trunk checkout at
/js2/mwEmbed/example_usage/Firefogg_Make_Advanced.html).
anyway on to the settings:

$wgDerivativeSettings[ WikiAtHome::ENC_SAVE_BANDWITH ] =
    array(
        'maxSize'      => '200',
        'videoBitrate' => '164',
        'audioBitrate' => '32',
        'samplerate'   => '22050',
        'framerate'    => '15',
        'channels'     => '1',
        'noUpscaling'  => 'true'
    );
$wgDerivativeSettings[ WikiAtHome::ENC_WEB_STREAM ] =
    array(
        'maxSize'      => '400',
        'videoBitrate' => '544',
        'audioBitrate' => '96',
        'noUpscaling'  => 'true'
    );
$wgDerivativeSettings[ WikiAtHome::ENC_HQ_STREAM ] =
    array(
        'maxSize'      => '1080',
        'videoQuality' => 6,
        'audioQuality' => 3,
        'noUpscaling'  => 'true'
    );

--michael


Brion Vibber wrote:
 On 8/3/09 9:56 PM, Gregory Maxwell wrote:
 [snip]
   
 Based on 'what other people do' I'd say the low should be in the
 200kbit-300kbit/sec range.  Perhaps taking the high up to a megabit?

 There are also a lot of very short videos on Wikipedia where the whole
 thing could reasonably be buffered prior to playback.


 Something I don't have an answer for is what resolutions to use. The
 low should fit on mobile device screens.
 

 At the moment the defaults we're using for Firefogg uploads are 400px 
 width (e.g., 400x300 or 400x225 for the most common aspect ratios)
 targeting a 400kbps bitrate. IMO at 400kbps at this size things don't 
 look particularly good; I'd prefer a smaller size/bitrate for 'low' and 
 higher size/bitrate for medium qual.


  From sources I'm googling up, looks like YouTube is using 320x240 for 
 low-res, 480x360 h.264 @ 512kbps+128kbps audio for higher-qual, with 
 720p h.264 @ 1024Kbps+232kbps audio available for some HD videos.

 http://www.squidoo.com/youtubehd

 These seem like pretty reasonable numbers to target; offhand I'm not 
 sure the bitrates used for the low-res version but I think that's with 
 older Flash codecs anyway so not as directly comparable.

 Also, might we want different standard sizes for 4:3 vs 16:9 material?

 Perhaps we should wrangle up some source material and run some test 
 compressions to get a better idea what this'll look like in practice...

   
 Normally I'd suggest setting
 the size based on the content: Low motion detail oriented video should
 get higher resolutions than high motion scenes without important
 details. Doubling the number of derivatives in order to have a large
 and small setting on a per article basis is probably not acceptable.
 :(
 

 Yeah, that's way tougher to deal with... Potentially we could allow some 
 per-file tweaks of bitrates or something, but that might be a world of 
 pain. :)

   
 As an aside— downsampled video needs some makeup sharpening like
 downsampled stills will. I'll work on getting something in
 ffmpeg2theora to do this.
 

 Woohoo!

   
 There is also the option of decimating the frame-rate. Going from
 30fps to 15fps can make a decent improvement for bitrate vs visual
 quality but it can make some kinds of video look jerky. (Dropping the
 frame rate would also be helpful for any CPU starved devices)
 

 15fps looks like crap IMO, but yeah for low-bitrate it can help a lot. 
 We may wish to consider that source material may have varying frame 
 rates, most likely to be:

 15fps - crappy low-res stuff found on internet :)
 24fps / 23.98 fps - film-sourced
 25fps - PAL non-interlaced
 30fps / 29.97 fps - NTSC non-interlaced or many computer-generated vids
 50fps - PAL interlaced or PAL-compat HD native
 60fps / 59.94fps - NTSC interlaced or HD native

 And of course those 50 and 60fps items might be encoded with or without 
 interlacing. :)

 Do we want to normalize everything to a standard rate, or maybe just cut 
 50/60 to 25/30?

 (This also loses motion data, but not as badly as decimation to 15fps!)

   
 This brings me to an interesting point about instant gratification:
 Ogg was intended from day one to be a streaming format. This has
 pluses and minuses, but one thing we should take 

Re: [Wikitech-l] Video Quality for Derivatives (was Re:w...@home Extension)

2009-08-06 Thread Gregory Maxwell
On Thu, Aug 6, 2009 at 8:00 PM, Michael Dale md...@wikimedia.org wrote:
 So I committed ~basic~ derivative code support for oggHandler in r54550
 (more solid support is on the way).

 Based on input from the w...@home thread, here are the updated target
 qualities, expressed via the Firefogg API to ffmpeg2theora.

Not using two-pass on the rate-controlled versions?

It's a pretty consistent performance improvement[1], and it eliminates
the blurry-first-frame issue that sometimes comes up for talking
heads. (Note that by default two-pass cranks the keyframe interval up to
256 and makes the buf-delay infinite, so you'll need to set those to
sane values for streaming.)


[1] For example:
http://people.xiph.org/~maikmerten/plots/bbb-68s/managed/psnr.png


Re: [Wikitech-l] Video Quality for Derivatives (was Re:w...@home Extension)

2009-08-06 Thread Gregory Maxwell
On Thu, Aug 6, 2009 at 8:17 PM, Gregory Maxwell gmaxw...@gmail.com wrote:
 On Thu, Aug 6, 2009 at 8:00 PM, Michael Dale md...@wikimedia.org wrote:
 So I committed ~basic~ derivative code support for oggHandler in r54550
 (more solid support is on the way).

 Based on input from the w...@home thread, here are the updated target
 qualities, expressed via the Firefogg API to ffmpeg2theora.

 Not using two-pass on the rate-controlled versions?

 It's a pretty consistent performance improvement[1], and it eliminates
 the blurry-first-frame issue that sometimes comes up for talking
 heads. (Note that by default two-pass cranks the keyframe interval up to
 256 and makes the buf-delay infinite, so you'll need to set those to
 sane values for streaming.)

I see r54562 switching to two-pass, but as-is this will produce files
which are not really streamable (because the streams can and will
burst to 10 Mbit even though the overall rate is 500 kbit or whatever
is requested).

We're going to want to do something like -k 64 --buf-delay=256.

I'm not sure what keyframe interval we should be using. Longer
intervals lead to clearly better compression, with diminishing returns
over 512 or so depending on the content, but lower seeking
granularity during long spans without keyframes.  The ffmpeg2theora
defaults are 64 in one-pass mode and 256 in two-pass mode.

Buf-delay indicates the amount of buffering the stream is targeting,
i.e. for a 30fps stream at 100kbit/sec, a buf-delay of 60 means that
the encoder expects that the decoder will have buffered at least
200kbit (25kbyte) of video data before playback starts.

If the buffer runs dry, playback stalls, which is pretty crappy for the
user's experience.  So bigger buf-delays mean either a longer
buffering time before playback or more risk of stalling.

In the above (30,60,100) example the client would require 2 seconds to
fill the buffer if they were transferring at 100kbit/sec, 1 second if
they are transferring at 200kbit/sec. etc.
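
(The arithmetic in that example, spelled out: buf-delay is measured in frames,
so the buffered amount is bitrate * buf-delay / fps:)

$fps       = 30;                   // frames per second
$bitrate   = 100;                  // kbit/sec target rate
$bufDelay  = 60;                   // frames of buffering the encoder assumes
$seconds   = $bufDelay / $fps;     // 2 seconds of video covered by the buffer
$kbits     = $bitrate * $seconds;  // 200 kbit, i.e. 25 kbyte
$fillAt100 = $kbits / 100;         // 2 seconds to fill the buffer at 100 kbit/sec
$fillAt200 = $kbits / 200;         // 1 second at 200 kbit/sec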

The default is the same as the keyframe interval (64) in one pass
mode, and infinite in two-pass mode.  Generally you don't want the
buf-delay to be less than the keyframe interval, as quality tanks
pretty badly at that setting.

Sadly the video tag doesn't currently provide any direct way to
request a minimum buffering. Firefox just takes a guess and every time
it stalls it guesses more. Currently the guesses are pretty bad in my
experience, though this is something we'll hopefully get addressed in
future versions.
