On 03/21/2018 02:18 PM, Jaime Soler wrote:
> Hi,
>
> We still get out of memory error during pg_dump execution
> ...
> pg_dump: reading row security enabled for table "public.lo_table"
> pg_dump: reading policies for table "public.lo_table"
> pg_dump: reading publications
> pg_dump: reading publication membership
> pg_dump: reading publication membership for table "public.lo_table"
> pg_dump: reading subscriptions
> pg_dump: reading large objects
> out of memory
>
Hmmmm ... that likely happens because of this for loop copying a lot of data:

https://github.com/postgres/postgres/blob/master/src/bin/pg_dump/pg_dump.c#L3258

But I'm having trouble verifying that, because the query fetching the list of objects is rather expensive with this number of large objects. How long does it take for you?

I wonder if there's a way to make the query faster.

regards

-- 
Tomas Vondra                  http://www.2ndQuadrant.com
PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services