RocksDB keeps a hot set in memory, so you can store more data than fits in 
RAM. How fast you can query the data will depend on your use case and on how 
much of the working set RocksDB can cache.

What kind of queries do you want to run against your log files? Extracting a 
time range?
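For illustration, a time-range extraction in AQL could look like the sketch 
below. The collection name `logs` and the attribute `timestamp` are 
assumptions; with a (skiplist) index on `timestamp`, such a query does not 
need the whole collection in memory:

```
FOR doc IN logs
  FILTER doc.timestamp >= @start AND doc.timestamp < @end
  RETURN doc
```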


On Friday, April 14, 2017 at 5:30:58 PM UTC+2, Ahmad Ibrahim wrote:
>
> Wow what a fast response!
>
> So this means that if I have a 1 TB collection, I don't need 1 TB of memory?
>
> On Friday, April 14, 2017 at 5:27:42 PM UTC+2, Frank Celler wrote:
>>
>> We are currently preparing the 3.2 release. An alpha version will be 
>> available next week that uses a storage engine based on Facebook's RocksDB. 
>> This will give you better performance with large datasets.
>>
>>
>> On Friday, April 14, 2017 at 5:22:37 PM UTC+2, Ahmad Ibrahim wrote:
>>>
>>>
>>>
>>> Hi all, ArangoDB is a very interesting product and I would love to use 
>>> it in my next project, but I have just one question: is it suitable for 
>>> large data collections like logs, or do I still have to use another data 
>>> store like Elastic? As far as I understand, ArangoDB is almost an 
>>> in-memory database, so loading such a large collection into memory will 
>>> be a problem... am I wrong?
>>>
>>

-- 
You received this message because you are subscribed to the Google Groups 
"ArangoDB" group.
