[Wikitech-l] Collection extension

2009-04-10 Thread Bence Damokos
Hi, Is there a planned deployment date for the Collection (PediaPress) extension on the Wikimedia wikis that currently don't have it? The blog post about it [1] mentioned March if everything went well. Best regards, Bence [1]

[Wikitech-l] Dealing with Large Files when attempting a wikipedia database download.

2009-04-10 Thread Jameson Scanlon
Does anyone on the wikitech mailing list happen to know whether it would be possible for some of the larger wikipedia database downloads (which are, say, 16GB or so in size) to be split into parts so that they can be downloaded. For whatever reason, whenever I have attempted to download the ~14GB
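(Not part of the original thread.) The splitting the poster asks for can be done client- or server-side with standard GNU coreutils. A minimal local sketch, using a tiny stand-in file in place of a real 16 GB dump; all filenames here are illustrative:

```shell
# Stand-in for a large dump file (a real dump would be ~16 GB).
printf 'example dump data\n' > enwiki-dump.xml

# Split into numbered parts of at most 1 GB each
# (our stand-in fits in a single part, enwiki-dump.xml.part-00).
split -b 1G -d enwiki-dump.xml enwiki-dump.xml.part-

# Reassemble the parts in order and verify byte-for-byte equality.
cat enwiki-dump.xml.part-* > enwiki-dump.reassembled.xml
cmp enwiki-dump.xml enwiki-dump.reassembled.xml && echo "parts match"
```

The same `cat`-to-reassemble step works regardless of how many parts `split` produced, since the `-d` numeric suffixes sort correctly.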

Re: [Wikitech-l] Dealing with Large Files when attempting a wikipedia database download.

2009-04-10 Thread David Gerard
2009/4/10 Jameson Scanlon jameson.scan...@googlemail.com: Does anyone on the wikitech mailing list happen to know whether it would be possible for some of the larger wikipedia database downloads (which are, say, 16GB or so in size) to be split into parts so that they can be downloaded.  For

Re: [Wikitech-l] Dealing with Large Files when attempting a wikipedia database download.

2009-04-10 Thread Daniel Kinzler
David Gerard schrieb: 2009/4/10 Jameson Scanlon jameson.scan...@googlemail.com: Does anyone on the wikitech mailing list happen to know whether it would be possible for some of the larger wikipedia database downloads (which are, say, 16GB or so in size) to be split into parts so that they

Re: [Wikitech-l] Dealing with Large Files when attempting a wikipedia database download.

2009-04-10 Thread Finne Boonen
http://en.wikipedia.org/wiki/Wikipedia_database has some information on how to deal with the large files henna On Fri, Apr 10, 2009 at 21:43, Daniel Kinzler dan...@brightbyte.de wrote: David Gerard schrieb: 2009/4/10 Jameson Scanlon jameson.scan...@googlemail.com: Does anyone on the wikitech

Re: [Wikitech-l] Dealing with Large Files when attempting a wikipedia database download.

2009-04-10 Thread Bilal Abdul Kader
I have downloaded the history dump file (~150 GB) using Firefox on XP and using wget on Ubuntu, and it works fine. I have downloaded it using a download manager on Vista and it is fine also. A more probable reason is file system limitations. bilal On Fri, Apr 10, 2009 at 3:49 PM, Finne
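(Not part of the original thread.) The file-system explanation is plausible: FAT32, for instance, caps individual files below 4 GiB, so a ~14 GB download would fail partway through no matter which client is used. A quick local probe, assuming GNU coreutils, that checks whether the current filesystem accepts a file larger than 4 GiB by creating a sparse test file (which consumes essentially no real disk space):

```shell
# Try to create a sparse 5 GiB file; FAT32 and similar
# filesystems will reject this with "File too large".
if truncate -s 5G fs-limit-probe 2>/dev/null; then
    echo "filesystem supports >4GiB files"
else
    echo "filesystem caps file size below 5GiB"
fi
rm -f fs-limit-probe
```

If the probe fails, the fix is to download to an ext4/NTFS/etc. partition rather than to change download tools.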

Re: [Wikitech-l] Dealing with Large Files when attempting a wikipedia database download.

2009-04-10 Thread Brian
I'm pretty sure it's impossible to encourage people to include relevant information in their OPs. I don't suppose you could have at least told us your operating system, and whether you are running 32 or 64 bit? Are you on Linux with no large file support? On Fri, Apr 10, 2009 at 12:21 PM, Jameson