Hello!
I'm implementing a persistent hash trie (like the ones in
Clojure/Scala).
Every 'persisting' insertion involves allocating a fixed number
(6) of nodes; each node is a fixed-width chunk, with widths
ranging from 1 to ~33 words.
Basically, this data structure always allocates a whole branch at
a time, but then nodes are deallocated individually.
Is there a way to tell an allocator to allocate n chunks at a
time?
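
To make the question concrete, here is the effect I'm after,
done by hand with plain malloc (toy code: the Node type and its
size are made up, since my real nodes vary in width):

import core.stdc.stdlib : malloc;

struct Node { size_t[4] words; }  // placeholder; real widths vary

// One allocation for the whole branch; the six node pointers
// are carved out of the block at fixed offsets.
Node*[6] allocBranch()
{
    void* block = malloc(Node.sizeof * 6);
    assert(block !is null);
    Node*[6] nodes;
    foreach (i; 0 .. 6)
        nodes[i] = cast(Node*)(cast(ubyte*)block + Node.sizeof * i);
    return nodes;
}

The catch, of course, is that malloc can only free that block as
a whole, not node by node, which leads to the second question.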
Or alternatively, is there a way to allocate all the memory
needed at once, and then free individual chunks of it later (see
the sketch below)? It seems like this would provide at least
some speed improvement.
And it could also be useful for batch operations on other
node-based data structures (such as adding a whole range of
nodes to a graph at once).
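
What I have in mind is roughly this: prefix the block with a
live-node count, treat each per-node free as a decrement, and
release the whole block when the count reaches zero. Again a toy
sketch, with the same made-up Node as above:

import core.stdc.stdlib : malloc, free;

struct Node { size_t[4] words; }  // placeholder, as before

struct BranchHeader { size_t liveNodes; }

// Block layout: [header][node 0][node 1]...[node 5]
void* allocBranch()
{
    void* block = malloc(BranchHeader.sizeof + Node.sizeof * 6);
    assert(block !is null);
    (cast(BranchHeader*)block).liveNodes = 6;
    return block;
}

Node* nodeAt(void* block, size_t i)
{
    return cast(Node*)(cast(ubyte*)block
        + BranchHeader.sizeof + Node.sizeof * i);
}

// "Free" one node: decrement the count, release the block when
// the last node dies. (In the real structure each node would
// need a back-pointer to its header, since callers only hold
// node pointers.)
void releaseNode(void* block)
{
    auto hdr = cast(BranchHeader*)block;
    if (--hdr.liveNodes == 0)
        free(block);
}

The obvious cost is that the block stays resident until its last
node is released, so this trades memory for allocator traffic.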
Thanks!