Hidey hidey hidey hi,

Have you tried piping the output through gzip and then into a file? Something 
like:

pg_dump -c --verbose database | gzip > /foo/bar.gzip
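
To restore later from a dump made that way (it's still a plain-format dump, 
just compressed), something along these lines should work; the path and 
database name here are only placeholders:

gunzip -c /foo/bar.gzip | psql database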

I also use ftpbackup to move the gzipped file to another server that has a ton 
of disk space and large-file support.  Here's a line from the script:

pg_dump CES | gzip | /usr/local/bin/ftpbackup -h bessie -u foo -p bar -b 
/lboxbak/$MONTH$DAY.CES.gz

$MONTH and $DAY are set earlier in the script.
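
Just as an illustration (not quoted from the actual script), they could be 
filled in from date, e.g.:

MONTH=`date +%m`
DAY=`date +%d`
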
Hope this helps.
~corey

-----Original Message-----
From:   Naomi Walker [SMTP:[EMAIL PROTECTED]]
Sent:   Wednesday, March 20, 2002 7:46 AM
To:     Tom Lane; Fred Moyer
Cc:     [EMAIL PROTECTED]
Subject:        Re: [ADMIN] pg_dump max file size exceeded

At 12:15 AM 3/20/02 -0500, Tom Lane wrote:
>"Fred Moyer" <[EMAIL PROTECTED]> writes:
> > ran time pg_dump -c --verbose database > datafile.psql from the command line
> > and got a file size limit exceeded.  datafile.psql stopped at 2 gigs.  any
> > ideas how to exceed that limit?
>
> > redhat 7.2, 2.4.9-31 kernel
>
>[ scratches head... ]  If you were on Solaris or HPUX I'd tell you to
>recompile with 64-bit file offset support enabled.  But I kinda thought
>this was standard equipment on recent Linux versions.  Anyone know the
>magic incantation for large-file support on Linux?

Depending on the shell being used, I'd try limit or ulimit.
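
For example, in sh/bash (with the csh/tcsh equivalents), something like the 
following shows and lifts a per-process file size cap; whether it helps here 
depends on whether the 2-gig stop came from a ulimit or from missing 
large-file support:

ulimit -a              # show current limits (sh/bash)
ulimit -f unlimited    # lift the file size limit (sh/bash)
limit filesize         # show the limit (csh/tcsh)
unlimit filesize       # lift it (csh/tcsh)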

We've seen a case where large file support had to be tweaked in the Veritas 
file systems as well.

--
Naomi Walker
Chief Information Officer
Eldorado Computing, Inc.
602-604-3100  ext 242

