One fact about big data is that it keeps getting bigger.

On Friday, October 14, 2016 at 1:00:35 PM UTC-4, Páll Haraldsson wrote:
>
> On Thursday, October 13, 2016 at 7:49:51 PM UTC, cdm wrote:
>>
>> from CloudArray.jl:
>>
>> "If you are dealing with big data, i.e., your RAM memory is not enough 
>> to store your data, you can create a CloudArray from a file."
>>
>>    
>> https://github.com/gsd-ufal/CloudArray.jl#creating-a-cloudarray-from-a-file
>>
>
> Good to know, and it seems cool (like CatViews.jl). Indexes could need to be 
> bigger than 32-bit this way, even for 2D arrays.
>
> But has anyone worked with arrays of more than 70 terabytes, where that 
> would otherwise have been a limitation?
>
> Does anyone know the biggest (or just big, over 2 GB) one-dimensional 
> array people are working with?
>

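For the indexing question above: Julia's `Array` indices are machine-native `Int` (64-bit on 64-bit platforms), so a one-dimensional array well past the 2^31-1 element limit of 32-bit indexing is addressable without anything like CloudArray. One way to hold such data without enough RAM is the standard-library `Mmap`. A minimal sketch (the file path and the small element count here are illustrative; the same code works for much larger file-backed vectors):

```julia
# Sketch: a file-backed Vector via memory mapping, so the data need not
# fit in RAM. Indices are Int64 on 64-bit systems, so lengths beyond
# 2^31-1 elements are valid. `n` is kept small here for the demo.
using Mmap

path = tempname()            # illustrative temporary file
n = 1_000_000
io = open(path, "w+")
A = Mmap.mmap(io, Vector{Float64}, n)  # grows the file to n elements

A[1]   = 1.0                 # write through the mapping
A[end] = 2.0

first_val = A[1]
last_val  = A[end]

close(io)
rm(path)                     # clean up the demo file
```

The same `Mmap.mmap` call with a multi-terabyte `n` gives an array backed by disk rather than RAM, with the OS paging data in on access; CloudArray's distinction is that it chunks the data across remote workers instead of a local file.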