Re: [Wikitech-l] Vector skin not working on BlackBerry?

2010-05-13 Thread Russell Blau
"David Gerard"  wrote in message 
news:aanlktin9qhcgvataegsdrwjuc-jegj-iwdg6f9oyt...@mail.gmail.com...
> There are a few comments on the Wikimedia blog saying they can't access
> en:wp any more using their BlackBerry. Though we tried it here on an
> 8900 and it works. Any other reports?

Works fine using Opera Mini.

Trying to load any page in the native BlackBerry Browser with JavaScript 
disabled results in a browser crash with the message "Uncaught exception: 
java.lang.ClassCastException" (after the page has fully loaded).  Trying 
with JavaScript enabled (while not logged in) results in "HTTP Error 413: 
Request Entity Too Large", even on a page known to be very small.

This is for a BlackBerry 8800, OS v4.5.0.110.






Re: [Wikitech-l] Problem with Upload API

2009-10-21 Thread Russell Blau
[adding mediawiki-api since this seems to be more relevant to that list]

"Bryan Tong Minh"  wrote:
>On Wed, Oct 21, 2009 at 4:29 PM, Russell Blau  wrote:
>>> --abc
>>> Content-Disposition: form-data; name="%s"; filename="%s"
>>> Content-Type: application/octet-stream
>>>
>> What is the second "%s" in the above line? Is this instead of having a
>> separate form-data element with name="filename", or is it a duplicate of
>> that element, or is it something entirely different?
>>
>name is the name of the form element (wpUploadFile) and filename is the
>name of the file (picture.jpg)
>
>> If RFC 2388 says that the sending application MAY supply a file name,
>> why is the API treating this as a REQUIRED parameter?
>It is not, as far as I can see. It is requiring the form element
>"filename" and ignoring the filename attribute from the file
>altogether.

So, basically, the API for action=upload is (a) not compliant with RFC 2388, 
and (b) failing with a misleading error message when the client fails to 
supply a parameter that isn't used at all?

Russ






Re: [Wikitech-l] Problem with Upload API

2009-10-21 Thread Russell Blau

"Bryan Tong Minh"  wrote in message 
news:fd5886130910210324l409d2cc6m31831366ae4bb...@mail.gmail.com...
> On Wed, Oct 21, 2009 at 12:15 PM, Jan Luca  wrote:
> [...]
>> Content-Type: multipart/form-data
>> Content-Length: ".strlen($file)."
>> Content-Disposition: form-data; name=\"".$filename."\";
>> filename=\"".$filename."\"
>>
>> ".$file."
>> \r\n\r\n";
>
> You do set your content-type to multipart/form-data, but your content
> is not actually multipart/form-data encoded. A multipart/form-data
> encoded request looks something like this:
>
> POST / HTTP/1.1
> Content-Type: multipart/form-data; boundary=abc
> Content-Length: 1234
>
> --abc
> Content-Disposition: form-data; name="%s"; filename="%s"
> Content-Type: application/octet-stream
>
> [file contents]
> --abc

What is the second "%s" in the above line?  Is this instead of having a 
separate form-data element with name="filename", or is it a duplicate of 
that element, or is it something entirely different?

I note that RFC 2388 says: "The original local file name may be supplied as 
well, either as a "filename" parameter either of the "content-disposition: 
form-data" header or, in the case of multiple files, in a 
"content-disposition: file" header of the subpart. The sending application 
MAY supply a file name; if the file name of the sender's operating system is 
not in US-ASCII, the file name might be approximated, or encoded using the 
method of RFC 2231."

If RFC 2388 says that the sending application MAY supply a file name, why is 
the API treating this as a REQUIRED parameter?
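
To make the encoding concrete, here is a minimal, untested sketch in PHP of 
a request that follows Bryan's layout, including the separate "filename" 
form element the API currently demands.  The URL is a placeholder, and a 
real request would also need an edit token and session cookies:

    <?php
    // Sketch only: a multipart/form-data upload to api.php?action=upload.
    $boundary = 'abc' . md5(uniqid());  // must not occur inside the payload
    $file     = file_get_contents('picture.jpg');

    $body  = "--$boundary\r\n"
           . "Content-Disposition: form-data; name=\"filename\"\r\n\r\n"
           . "picture.jpg\r\n"          // destination name on the wiki
           . "--$boundary\r\n"
           . "Content-Disposition: form-data; name=\"wpUploadFile\"; "
           . "filename=\"picture.jpg\"\r\n"
           . "Content-Type: application/octet-stream\r\n\r\n"
           . $file . "\r\n"
           . "--$boundary--\r\n";       // closing boundary ends with "--"

    $ch = curl_init('http://example.org/w/api.php?action=upload&format=xml');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $body);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array(
        "Content-Type: multipart/form-data; boundary=$boundary",
        'Content-Length: ' . strlen($body),
    ));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    echo curl_exec($ch);
    curl_close($ch);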

Russ







Re: [Wikitech-l] Dump throughput

2009-05-07 Thread Russell Blau

"Tomasz Finc"  wrote in message 
news:4a032be3.60...@wikimedia.org...
>
> Commons finished just fine along with every single one of the other
> small & mid-size wikis waiting to be picked up. Now we're just left with
> the big wikis to finish.

This is probably a stupid question (because it depends on umpteen different 
variables), but would the remaining "big wikis" finish any faster if you 
stopped the dump processes for the smaller wikis that have already had a 
dump complete within the past week and are now starting on their second 
rounds?

Russ






Re: [Wikitech-l] Dump throughput

2009-05-05 Thread Russell Blau
"Erik Zachte"  wrote in message 
news:002d01c9cd8d$3355beb0$9a013c...@com...
> Tomasz, the amount of dump power that you managed to activate is 
> impressive.
> 136 dumps yesterday, today already 110 :-) Out of 760 total.
> Of course there are small and large dumps, but this is very encouraging.
>

Yes, thank you Tomasz for your attention to this.  The commonswiki process 
looks like it *might* be dead, by the way.

Russ






Re: [Wikitech-l] Dump process does not work

2009-05-01 Thread Russell Blau
"Tomasz Finc"  wrote in message 
news:49fb3ca6.90...@wikimedia.org...
> Brion Vibber wrote:
>> El 5/1/09 5:51 PM, Andreas Meier wrote:
>>> Since today, the dump process has not been working correctly.  It is
>>> running, but without any success.
>>
>> Tomasz is on it... we've upgraded the machine they run on and it needs
>> some more tweaking. :)
>>
> Indeed. The backup job was missing the php normalize library. Putting
> that into place now. Then I'll see if there is any db weirdness.

But, on the bright side, every database in the system now has a dump that 
was completed within the last nine hours (roughly).  When's the last time 
you could say *that*?  :-)






Re: [Wikitech-l] Dump status update

2009-02-24 Thread Russell Blau
"Brion Vibber"  wrote in message 
news:49a42729.4070...@wikimedia.org...
> Quick update on dump status:
>
> * Dumps are back up and running on srv31, the old dump batch host.
>
> Please note that unlike the wiki sites themselves, dump activity is
> *not* considered time-critical -- there is no emergency requirement to
> get them running as soon as possible.
...
> * Dump runner redesign is in progress.
...
> * Dump format changes are in progress.

Brion -- thanks very much for the status report.  I think most of us 
understand that you need to prioritize your resources, particularly when 
server alarms are going off left and right; just knowing that someone is 
aware of an issue and has it somewhere on their to-do list is very positive.

Russ






Re: [Wikitech-l] Dump processes seem to be dead

2009-02-23 Thread Russell Blau
"Russell Blau"  wrote in message 
news:gnuacf$hf...@ger.gmane.org...
>
> I have to second this.  I tried to report this outage several times last 
> week - on IRC, on this mailing list, and on Bugzilla.  All reports -- NOT 
> COMPLAINTS, JUST REPORTS -- were met with absolute silence.

Two updates on this.

1)  Brion did respond to the Bugzilla report (albeit two+ days after it was 
posted), which I overlooked when posting earlier.  He said "The box they 
were running on (srv31) is dead. We'll reassign them over the weekend if we 
can't bring the box back up."

2)  Within the last hour, the server log at 
http://wikitech.wikimedia.org/wiki/Server_admin_log indicates that Rob found 
and fixed the cause of srv31 (and srv32-34) being down -- a circuit breaker 
was tripped in the data center.

Russ







Re: [Wikitech-l] Dump processes seem to be dead

2009-02-23 Thread Russell Blau
"Lars Aronsson"  wrote in message 
news:pine.lnx.4.64.0902231202140.1...@localhost.localdomain...
>
> However, quite independent of your development work, the current
> system for dumps seems to have stopped on February 12. That's the
> impression I get from looking at
> http://download.wikimedia.org/backup-index.html
>
> Despite all its shortcomings (3-4 weeks between dumps, no history
> dumps for en.wikipedia), the current dump system is very useful.
> What's not useful is that it was out of service from July to
> October 2008 and now again appears to be broken since February 12.
>
...
> Still today, February 23, no explanation has been posted on that
> dump website or on these mailing lists. That's the real surprise.

I have to second this.  I tried to report this outage several times last 
week - on IRC, on this mailing list, and on Bugzilla.  All reports -- NOT 
COMPLAINTS, JUST REPORTS -- were met with absolute silence.  I fully 
understand that time and resources are limited, and not everything can be 
fixed immediately, but at least some acknowledgement of the reports would be 
appreciated.  It is extremely disheartening for members of the user 
community of what is supposed to be a collaborative project when their 
attempts to contribute by reporting a service outage are ignored.

Russ






Re: [Wikitech-l] Double redirects

2009-02-19 Thread Russell Blau
"Chad"  wrote in message 
news:5924f50a0902020609q542047cfme84b9237eec38...@mail.gmail.com...
> On Mon, Feb 2, 2009 at 8:43 AM, Russell Blau  wrote:
...
>> 3)  How can the users (not sysadmins) of a given wiki determine what the
>> value of $wgMaxRedirects is for their wiki?  In particular, what value is
>> being used currently on enwiki and other WMF projects?
>
> The defaults for every setting are available for viewing on MediaWiki.org.
> For this setting, the default appears to be 1, meaning that we keep
> long-standing behavior for the default. As for seeing your current
> settings, you can check out the configuration files at [1].
...
> [1] - http://noc.wikimedia.org/conf/ - CommonSettings and
> InitialiseSettings are the two places to look.

Chad: AFAICT, the default value of $wgMaxRedirects is 1 and is not changed 
in any of the WMF settings files you pointed me to.  Still, the software is 
bypassing double-redirects on enwiki as of this morning; check out 
http://en.wikipedia.org/wiki/Dubya for an example.  Perhaps a bug?

Russ






Re: [Wikitech-l] Dump processes seem to be dead

2009-02-17 Thread Russell Blau
"Andreas Meier"  wrote in message 
news:4997d645.8050...@gmx.de...
> Hello,
>
> the current dump building seems to be dead and perhaps should be killed
> by hand.
>

Reported: https://bugzilla.wikimedia.org/show_bug.cgi?id=17535






Re: [Wikitech-l] Special pages

2009-02-17 Thread Russell Blau
"Rotem Liss"  wrote in message 
news:499a7134.5090...@gmail.com...
> Brion Vibber wrote:
>> Last week some of the background batch jobs were reorganized to run
>> under a less privileged user account, a good security practice.
>>
>> Unfortunately the log files the batch jobs logged to didn't have their
>> ownership updated, so the batch jobs were unable to actually run, due to
>> being unable to open their output files. :P
>>
>> I've fixed the file ownership; they should start updating in the next
>> day or two per regular schedule.
>>
>> -- brion
>
> I've checked enwiki and hewiki, and the special pages are still not 
> updated.

Reported: https://bugzilla.wikimedia.org/show_bug.cgi?id=17534 






Re: [Wikitech-l] Special pages

2009-02-13 Thread Russell Blau
"Brion Vibber"  wrote in message 
news:49949007.4090...@wikimedia.org...
> On 2/12/09 2:02 AM, Huib Laurens wrote:
>> Hello,
>>
>> Can somebody give a status update about when the special pages will be
>> updated again (like broken redirects or double redirects)?  They stopped
>> updating almost a week ago.
>>
>> And what is the reason that they are not working?
>
> Last week some of the background batch jobs were reorganized to run
> under a less privileged user account, a good security practice.
>
> Unfortunately the log files the batch jobs logged to didn't have their
> ownership updated, so the batch jobs were unable to actually run, due to
> being unable to open their output files. :P
>
> I've fixed the file ownership; they should start updating in the next
> day or two per regular schedule.

Brion, you might want to double-check that.  The "regular schedule" would 
have had the special pages on enwiki updated today (13 Feb), early morning 
US time, but so far it has not happened.






Re: [Wikitech-l] Double redirects

2009-02-02 Thread Russell Blau
"Chad"  wrote in message 
news:5924f50a0902020609q542047cfme84b9237eec38...@mail.gmail.com...
>> 3)  How can the users (not sysadmins) of a given wiki determine what the
>> value of $wgMaxRedirects is for their wiki?  In particular, what value is
>> being used currently on enwiki and other WMF projects?
>>
>
> The defaults for every setting are available for viewing on MediaWiki.org.
> For this setting, the default appears to be 1, meaning that we keep
> long-standing behavior for the default. As for seeing your current
> settings, you can check out the configuration files at [1].
>
Hmm.  Enwiki currently seems to be following double redirects 
notwithstanding this setting.  I am aware that the initial implementation of 
this fix contained some bugs that (we hope) will be fixed in the next 
software update, so it is probably best to wait until that update is done 
before worrying further.

Russ






[Wikitech-l] Double redirects

2009-02-02 Thread Russell Blau
It appears that since r45973, there is an enhancement to MediaWiki that 
allows the software to follow double redirects, and conceivably triple- and 
higher-level redirects, up to a configurable limit ($wgMaxRedirects).  This 
raises a few questions:

1)  Should [[Special:DoubleRedirects]] be changed to show only those 
redirect chains that exceed $wgMaxRedirects in length, since those are the 
only ones that really need fixing under the current software?

2)  If not, should there be a new Special: page to list such "excessive" 
redirect chains?

3)  How can the users (not sysadmins) of a given wiki determine what the 
value of $wgMaxRedirects is for their wiki?  (A sketch of the setting 
itself follows this list.)  In particular, what value is being used 
currently on enwiki and other WMF projects?

4)  Shouldn't this configuration change have been announced to the user 
community of each project so that they could consider how it affects their 
internal policies on what types of redirects to allow?
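
For reference (see item 3 above), the knob itself is a one-line setting in 
LocalSettings.php; a minimal sketch, assuming the default of 1 that later 
replies in this thread report for WMF sites:

    // LocalSettings.php
    $wgMaxRedirects = 1;  // max redirect hops the software will follow;
                          // e.g. 2 would resolve double redirects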

Russ






Re: [Wikitech-l] Enwiki dump crawling since 10/15/2008

2009-01-28 Thread Russell Blau
"Brion Vibber"  wrote in message 
news:497f9c35.9050...@wikimedia.org...
> On 1/27/09 2:55 PM, Robert Rohde wrote:
>> On Tue, Jan 27, 2009 at 2:42 PM, Brion Vibber 
>> wrote:
>>> On 1/27/09 2:35 PM, Thomas Dalton wrote:
>>>> The way I see it, what we need is to get a really powerful server
>>> Nope, it's a software architecture issue. We'll restart it with the new
>>> arch when it's ready to go.
>> The simplest solution is just to kill the current dump job if you have
>> faith that a new architecture can be put in place in less than a year.
>
> We'll probably do that.
>
> -- brion

FWIW, I'll add my vote for aborting the current dump *now* if we don't 
expect it ever to actually be finished, so we can at least get a fresh dump 
of the current pages.

Russ






Re: [Wikitech-l] Enwiki Dump Crawling since 10/15/2008

2009-01-05 Thread Russell Blau
 wrote in message 
news:1c624fe40901040620g1c69d070q9f830da33e84f...@mail.gmail.com...
> The current enwiki database dump
> (http://download.wikimedia.org/enwiki/20081008/) has been crawling
> along since 10/15/2008.
...
> Is this purposeful?  And is there anything I (or other community
> members) can do about it?  I personally just need the pages-articles
> part.  Would it be possible to dump up to that part on a different
> thread?

That portion of the dump is already done, and available at 
http://download.wikimedia.org/enwiki/20081008/enwiki-20081008-pages-articles.xml.bz2

Russ



