On Sun, Aug 06, 2006 at 04:42:47PM -0700, Geoff Parker wrote:
>  I've got a database with about 155GB of binary data, however
> when I run the unix utility df, it reports only 60GB of disk space
> is being used.   I've extracted random samples of data from the
> database, and it all appears correct, so I presume it's not corrupt.
> Can anyone tell me whether there's some sort of disk compression
> happening with large objects?

Large objects are stored in pg_largeobject as chunks of bytea data,
and bytea values are subject to TOAST compression. That would explain
why df reports considerably less disk usage than the logical size of
the data you loaded.
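
You can compare the two sizes yourself. A sketch (assumes PostgreSQL
8.1 or later for pg_total_relation_size/pg_size_pretty, and that your
role can read pg_largeobject):

  -- Logical size: total bytes across all large-object chunks
  SELECT sum(octet_length(data)) AS logical_bytes
  FROM pg_largeobject;

  -- On-disk size of the catalog holding those chunks,
  -- including its TOAST table and indexes
  SELECT pg_size_pretty(pg_total_relation_size('pg_largeobject'));

If the first number is much larger than the second, compression is
doing its job; the data itself is intact.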

-- 
Michael Fuhr

