Andres,

* Andres Freund (and...@anarazel.de) wrote:
> For that I added new functions/defines which allocate all the needed
> memory in one hunk:
> list_immutable_make$n(),
> List *list_new_immutable_n(NodeTag t, size_t n);
> List *list_make_n(NodeTag t, size_t n, ...);

A while back, I posted a patch to try and address this same issue.  The
approach that I took was to always pre-allocate a certain (#defined)
amount (I think it was 5 or 10 elements).  There were a number of places
that caused problems with that approach because they hacked on the list
element structures directly (instead of using the macros/functions), so
you'll want to watch out for those areas in any work on lists.

That patch is here:
http://archives.postgresql.org/pgsql-hackers/2011-05/msg01213.php

The thread on it might also be informative.

I do like your approach of being able to pass the ultimate size of the
list in.  Perhaps the two approaches could be merged?  I was able to
make everything work with my approach, provided all the callers used the
list API (I did that by making sure the links, etc., actually pointed to
the right places in the pre-allocated array).  One downside was that the
size ended up being larger than it might have been in some cases.
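
For what it's worth, here's a very rough sketch of what I mean by
"merged": the caller passes the final size, and the cells are carved
out of the same allocation as the header, with the next links simply
pointing into that array.  The names and struct layout are only
illustrative (not the actual pg_list.h definitions):

/* Sketch only -- illustrative names/layout, not the real pg_list.h. */
#include <stdarg.h>
#include <stdlib.h>

typedef struct ListCell
{
    void            *ptr_value;
    struct ListCell *next;
} ListCell;

typedef struct List
{
    int       length;
    ListCell *head;
    ListCell *tail;
} List;

/*
 * Allocate the List header and all 'n' cells in one hunk, chaining the
 * cells through the pre-allocated array so code that just walks the
 * next links keeps working unchanged.  A real version would of course
 * palloc this in the appropriate memory context rather than malloc it.
 */
static List *
list_make_n(size_t n, ...)
{
    List     *list = malloc(sizeof(List) + n * sizeof(ListCell));
    ListCell *cells = (ListCell *) (list + 1);
    va_list   args;

    list->length = (int) n;
    list->head = (n > 0) ? &cells[0] : NULL;
    list->tail = (n > 0) ? &cells[n - 1] : NULL;

    va_start(args, n);
    for (size_t i = 0; i < n; i++)
    {
        cells[i].ptr_value = va_arg(args, void *);
        cells[i].next = (i + 1 < n) ? &cells[i + 1] : NULL;
    }
    va_end(args);

    return list;
}

Anything that only follows the next links wouldn't notice the
difference; it's the places that hack on the cell structures directly
that would still need watching, same as with my earlier patch.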

        Thanks,

                Stephen
