Hi Donald,

Interesting.  One possibility would be to have an open CellStore cache.
Frequently accessed CellStores would remain open, while seldom-used ones
would be closed.  The effectiveness of this solution would depend on the
workload.  Do you think this might work for your use case?
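
Here is a rough sketch of the idea in C++, purely for illustration; the
names (OpenCellStoreCache, CellStoreHandle, open_store, close_store) are
hypothetical, not Hypertable's actual API.  It keeps an LRU list of open
handles, opens a CellStore on demand, and closes the least recently used
handle when the cache is full:

    #include <cstddef>
    #include <cstdio>
    #include <list>
    #include <string>
    #include <unordered_map>
    #include <utility>

    // Stand-in for an open CellStore file handle (hypothetical).
    struct CellStoreHandle {
      std::string path;
    };

    class OpenCellStoreCache {
    public:
      explicit OpenCellStoreCache(std::size_t capacity)
        : m_capacity(capacity) {}

      // Returns an open handle, opening on demand.  Frequently used
      // CellStores stay open; when the cache is full, the least
      // recently used handle is closed first.
      CellStoreHandle *get(const std::string &path) {
        auto it = m_map.find(path);
        if (it != m_map.end()) {
          // Cache hit: move entry to the front (most recently used).
          m_lru.splice(m_lru.begin(), m_lru, it->second);
          return &it->second->second;
        }
        if (!m_lru.empty() && m_map.size() >= m_capacity)
          evict();
        m_lru.emplace_front(path, open_store(path));
        m_map[path] = m_lru.begin();
        return &m_lru.front().second;
      }

    private:
      typedef std::pair<std::string, CellStoreHandle> Entry;

      // Close the least recently used handle and drop it from the map.
      void evict() {
        Entry &victim = m_lru.back();
        close_store(victim.second);
        m_map.erase(victim.first);
        m_lru.pop_back();
      }

      // Hypothetical open/close; the real code would go through
      // HdfsBroker.
      static CellStoreHandle open_store(const std::string &path) {
        std::printf("opening %s\n", path.c_str());
        CellStoreHandle h;
        h.path = path;
        return h;
      }
      static void close_store(const CellStoreHandle &h) {
        std::printf("closing %s\n", h.path.c_str());
      }

      std::size_t m_capacity;
      std::list<Entry> m_lru;
      std::unordered_map<std::string,
                         std::list<Entry>::iterator> m_map;
    };

    int main() {
      OpenCellStoreCache cache(2);  // tiny capacity to show eviction
      cache.get("/hypertable/tables/T/cs0");  // opens cs0
      cache.get("/hypertable/tables/T/cs1");  // opens cs1
      cache.get("/hypertable/tables/T/cs0");  // hit; stays open
      cache.get("/hypertable/tables/T/cs2");  // closes cs1 (LRU), opens cs2
      return 0;
    }

With a capacity well below 6,000 this would bound HdfsBroker's memory,
while hot CellStores still avoid the per-scan open overhead Donald
mentions below.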

- Doug

On Thu, Feb 19, 2009 at 7:09 PM, donald <[email protected]> wrote:

>
> Hi all,
>
> I recently ran into a problem where HdfsBroker throws an out-of-memory
> exception because too many CellStore files in HDFS are kept open.  I
> have over 600 ranges per range server, with a maximum of 10 cell
> stores per range; that's 6,000 open files at the same time, which
> makes HdfsBroker take gigabytes of memory.
>
> If we open the CellStore file on demand, i.e. when a scanner is
> created on it, this problem goes away.  However, random-read
> performance may drop due to the overhead of opening a file in HDFS.
> Any better solution?
>
> Donald
>
