On Sun, Dec 11, 2011 at 10:47 AM, Platonides platoni...@gmail.com wrote:
You seem to think that piping the output from bzip2 will hold the XML
dump uncompressed in memory until your script processes it. That's wrong.
bzip2 will begin uncompressing and writing to the pipe; when the pipe
fills,
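Platonides's point can be sketched in Python (a minimal stand-in with hypothetical in-memory data, not the poster's actual script): the decompressed dump is consumed one chunk at a time, so it is never held in memory all at once, just as with a shell pipe.

```python
import bz2
import io

# Minimal sketch: bzip2 output is consumed chunk by chunk, so the
# uncompressed XML is never held in memory all at once -- the same
# back-pressure behaviour a shell pipe gives you.
sample_xml = b"<page><title>Example</title></page>\n" * 1000
compressed = bz2.compress(sample_xml)

decompressed_bytes = 0
with bz2.open(io.BytesIO(compressed), "rb") as stream:
    for chunk in iter(lambda: stream.read(64 * 1024), b""):
        decompressed_bytes += len(chunk)

print(decompressed_bytes == len(sample_xml))  # True
```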
Hello,
I've got a couple of questions about CouchDB:
* Do we have it installed and available?
If yes,
* Is there any documentation (I searched the wiki for "couch" - no
results)?
* Is anybody using it with PHP? Could you please share an example?
If no,
* Could it be installed, please?
Thanks
I don't think we have CouchDB installed, but I agree it would be a
great addition. It's a very nice tool, especially for crunching
large amounts of data.
-- Hay
On Mon, Dec 12, 2011 at 2:18 PM, Danny B. wikipedia.dann...@email.cz wrote:
On 12/12/11 13:59, Carl (CBM) wrote:
This is correct, but the overall memory usage depends on the XML
library and programming technique being used. For XML that is too
large to comfortably fit in memory, there are techniques to allow for
the script to process the data before the entire XML
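One such streaming technique (an illustrative sketch, not necessarily the library Carl had in mind) is ElementTree's iterparse, which handles each element as it is parsed and then discards it, so memory use stays bounded regardless of dump size. The XML here is a tiny hypothetical stand-in for a real dump.

```python
import io
import xml.etree.ElementTree as ET

# Illustrative sketch of one streaming technique: iterparse handles each
# <page> as soon as it is fully parsed, then clears it, so memory stays
# bounded no matter how large the document is.
xml_data = b"<mediawiki>" + b"<page><title>T</title></page>" * 500 + b"</mediawiki>"

pages = 0
for event, elem in ET.iterparse(io.BytesIO(xml_data), events=("end",)):
    if elem.tag == "page":
        pages += 1
        elem.clear()  # drop the element's children once processed

print(pages)  # 500
```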
In a message off-list, Platonides wrote:
I think pretty much everyone using them would want the last dump, so I
don't see a problem in keeping world-readable just the last two dumps or
so (I chose the number two in case someone started using one dump
and wanted to finish with that, and
On Mon, Dec 12, 2011 at 11:04 AM, Lars Aronsson l...@aronsson.se wrote:
These files now take 160 GB, which is a fraction of a 2 TB disk that cost
100 euro to purchase. We're talking disk space at the cost of a lunch.
How hard can it be to get enough disk space on the toolserver? I think
many
Consumer grade 2TB ($160):
http://www.newegg.com/Product/Product.aspx?Item=N82E16822148681
Server Grade (Raid Edition) ($320):
http://www.newegg.com/Product/Product.aspx?Item=N82E16822136579
Getting the disks into the servers is an additional cost, and space may not
be available.
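For comparison, the cost per GB implied by the two prices quoted above (taking 2 TB as 2000 GB):

```python
# Cost per GB implied by the two Newegg prices quoted above
# (2 TB taken as 2000 GB for simplicity).
consumer_per_gb = 160 / 2000  # consumer-grade 2 TB at $160
server_per_gb = 320 / 2000    # RAID-edition 2 TB at $320
print(consumer_per_gb, server_per_gb)  # 0.08 0.16
```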
On Mon, Dec 12,
Lars Aronsson wrote:
How hard can it be to get enough disk space on the toolserver? I think
many chapters contribute money to its operation. Is it not enough?
Probably not terribly difficult, but first you'd need to be able to answer
questions such as:
* How much disk space is needed right now?
*
On Tue, Dec 13, 2011 at 3:04 AM, Lars Aronsson l...@aronsson.se wrote:
How hard can it be to get enough disk space on the toolserver? I think
many chapters contribute money to its operation. Is it not enough?
Getting the HDD space in the TS isn't as simple as just grabbing a few
server level
We could probably get away with 7200 RPM SATA drives if they are primarily
for storage and large sequential reads/writes.
-Aaron
On Mon, Dec 12, 2011 at 4:12 PM, OQ overlo...@gmail.com wrote:
Look at 10K+ RPM SAS drives and then you're in the right ballpark. It's
closer to $2/GB than $0.20/GB
On
On 12-12-2011 23:27, Aaron Halfaker wrote:
We could probably get away with 7200 RPM SATA drives if they are
primarily for storage and large sequential reads/writes.
-Aaron
Nice discussion. "We have too many books; we need to select." "No, just
build a bigger library!" Now we're about at the