I'm also curious: what is the estimated time to decompress this thing?
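
For scale, a back-of-envelope sketch in Python, with assumed numbers:
~3 TB of uncompressed output as discussed below, and a single-threaded
decompressor sustaining roughly 20 MB/s of output (the actual
compression format and rate aren't stated in this thread):

  OUTPUT_BYTES = 3 * 10**12   # ~3 TB uncompressed (from the thread below)
  RATE = 20 * 10**6           # assumed output bytes per second
  print(OUTPUT_BYTES / RATE / 86400, "days")   # -> ~1.7 days

If that assumption is anywhere near right, decompression is cheap
compared to the download times discussed below.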

On Wed, Dec 24, 2008 at 7:24 PM, Brian <[email protected]> wrote:

> But at least this would allow Erik, researchers and archivers to get the
> dump faster than they can get the compressed version. The number of people
> who want this can't be > 100, can it? It would need to be metered by an API
> I guess.
>
> Cheers,
> Brian
>
>
> On Wed, Dec 24, 2008 at 7:18 PM, Robert Rohde <[email protected]> wrote:
>
>> On Wed, Dec 24, 2008 at 6:05 PM, Brian <[email protected]> wrote:
>> > Hi Robert,
>> >
>> > I'm not sure I agree with you...
>> >
>> > 3 terabytes / (10 megabytes per second) = 3.64 days
>> >
>> > That is, on my university connection I could download the dump in
>> > just a few days. The only cost is bandwidth.
>>
>> While you might be correct, most connections are reported as megaBITS
>> per second.  For example, AT&T's highest grade of residential DSL
>> service is 6 Mbps, which would result in a 46-day download.  Comcast
>> goes up to 16 Mbps, which works out to about 17 days.
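>>
>> A quick sketch of that arithmetic in Python (assuming decimal units,
>> i.e. 1 TB = 10^12 bytes; the 3.64-day figure above is what you get
>> with binary terabytes and megabytes instead):
>>
>>   DUMP_BYTES = 3 * 10**12          # 3 TB, decimal
>>   def days(bits_per_second):
>>       return DUMP_BYTES * 8 / bits_per_second / 86400
>>   print(days(10 * 10**6 * 8))      # 10 MB/s -> ~3.5 days
>>   print(days(6 * 10**6))           # 6 Mbps  -> ~46.3 days
>>   print(days(16 * 10**6))          # 16 Mbps -> ~17.4 days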
>>
>> -Robert Rohde
>>



-- 
(Not sent from my iPhone)
