On 15.08.2011 20:50, Mattias Gaertner wrote:
> On Mon, 15 Aug 2011 11:29:27 +0100
> Martin <[email protected]> wrote:
>
>> [...]
>> It's a dream of mine to optimize that, e.g. to have synedit allocate a
>> bigger chunk and use its knowledge about the lifetime of the data
>> to organize it better. But there is just too much other important work...
>
> Indeed.
> But small chunks have a high chance of reuse. That's why
> most mem managers have special optimizations for them, and that's why
> many small chunks can actually be a good thing.
> I did some experiments with codetools allocating bigger chunks. In
> artificial tests it helped, but in a normal context it decreased
> performance. The memory fragmentation cost more than the cache
> locality gained.
I also think it is better to leave pooling to the heap manager. Allocating bigger chunks and splitting them up is only worthwhile if it requires no additional bookkeeping about which chunks are used and which are free.
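
For illustration only, here is a minimal sketch of what "no additional bookkeeping" can look like: a fixed-size pool that grabs one big block from the heap manager and threads its free list through the free chunks themselves, so no separate used/free table is needed. All names (TFixedPool etc.) are hypothetical and not part of SynEdit or CodeTools; a real allocator would also fall back to GetMem when the pool runs dry.

  program PoolSketch;
  {$mode objfpc}{$H+}

  type
    PFreeChunk = ^TFreeChunk;
    TFreeChunk = record
      Next: PFreeChunk;  // link stored inside the free chunk itself
    end;

    { One big GetMem, split into equal-sized chunks. The free list lives
      in the free chunks, so there is no extra bookkeeping memory. }
    TFixedPool = class
    private
      FBlock: Pointer;
      FFreeList: PFreeChunk;
    public
      constructor Create(AChunkSize, ACount: PtrUInt);
      destructor Destroy; override;
      function AllocChunk: Pointer;
      procedure FreeChunk(P: Pointer);
    end;

  constructor TFixedPool.Create(AChunkSize, ACount: PtrUInt);
  var
    i: PtrUInt;
    Cur: PByte;
  begin
    inherited Create;
    if AChunkSize < SizeOf(TFreeChunk) then
      AChunkSize := SizeOf(TFreeChunk);
    GetMem(FBlock, AChunkSize * ACount);
    // Thread every chunk onto the free list.
    FFreeList := nil;
    Cur := PByte(FBlock);
    for i := 1 to ACount do
    begin
      PFreeChunk(Cur)^.Next := FFreeList;
      FFreeList := PFreeChunk(Cur);
      Inc(Cur, AChunkSize);
    end;
  end;

  destructor TFixedPool.Destroy;
  begin
    FreeMem(FBlock);
    inherited Destroy;
  end;

  function TFixedPool.AllocChunk: Pointer;
  begin
    Result := FFreeList;
    if Result <> nil then
      FFreeList := PFreeChunk(Result)^.Next;
    // nil means the pool is exhausted; a real allocator would
    // fall back to the heap manager (GetMem) here.
  end;

  procedure TFixedPool.FreeChunk(P: Pointer);
  begin
    PFreeChunk(P)^.Next := FFreeList;
    FFreeList := PFreeChunk(P);
  end;

  var
    Pool: TFixedPool;
    A, B: Pointer;
  begin
    Pool := TFixedPool.Create(64, 1024);  // 1024 chunks of 64 bytes each
    A := Pool.AllocChunk;
    B := Pool.AllocChunk;
    Pool.FreeChunk(A);
    Pool.FreeChunk(B);
    Pool.Free;
  end.

Note that even this "free" bookkeeping is not free in practice: the whole block stays allocated while any chunk is live, which is exactly the fragmentation effect described above.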
