Hi,

Recently I wrote a really complex SELECT statement that joins about 20 relations with NATURAL JOIN, and every relation has 50 columns. It looks like:

    PREPARE ugly_stmt AS
      SELECT * FROM t1 NATURAL JOIN t2 NATURAL JOIN t3 ... NATURAL JOIN t20
      WHERE id = $1;
All the tables share only one common column, "id", which is also defined as the primary key. I set join_collapse_limit to 1 and wrote a prepared statement so I can execute it many times. It turns out Postgres spends a lot of time in copyObject(). So, can I just allocate a new memory context under TopMemoryContext before doing QueryRewrite() and pg_plan_queries(), and save the results in a hash table without copying query_list and plan_list again (I think they already live in the context I created)? I know this means a long-term memory leak until I deallocate the prepared statement, but it saves a lot of time in my situation, and I can live with that. A rough sketch of what I mean is below.

Thanks in advance
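For reference, here is a minimal sketch of the idea, assuming ~9.x-era internals (AllocSetContextCreate with explicit size arguments, and pg_plan_queries(querytrees, cursorOptions, boundParams)); the MyPlanEntry struct and plan_without_copy() function are just illustrative names, not anything that exists in the tree:

    #include "postgres.h"
    #include "rewrite/rewriteHandler.h"
    #include "tcop/tcopprot.h"
    #include "utils/memutils.h"
    #include "utils/params.h"

    /* hypothetical payload stored in my hash table */
    typedef struct MyPlanEntry
    {
        MemoryContext   context;     /* long-lived child of TopMemoryContext */
        List           *query_list;  /* rewritten Query trees */
        List           *plan_list;   /* planned statements */
    } MyPlanEntry;

    static void
    plan_without_copy(Query *parsetree, ParamListInfo boundParams,
                      MyPlanEntry *entry)
    {
        MemoryContext oldcxt;

        /* Everything palloc'd from here on lives until the context is deleted. */
        entry->context = AllocSetContextCreate(TopMemoryContext,
                                               "MyPreparedStmtContext",
                                               ALLOCSET_DEFAULT_MINSIZE,
                                               ALLOCSET_DEFAULT_INITSIZE,
                                               ALLOCSET_DEFAULT_MAXSIZE);
        oldcxt = MemoryContextSwitchTo(entry->context);

        /*
         * Rewrite and plan directly in the long-lived context, skipping the
         * copyObject() that the normal prepared-statement path would do.
         */
        entry->query_list = QueryRewrite(parsetree);
        entry->plan_list = pg_plan_queries(entry->query_list, 0, boundParams);

        MemoryContextSwitchTo(oldcxt);
        /* On DEALLOCATE, MemoryContextDelete(entry->context) frees it all. */
    }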