On Mon, Mar 29, 2010 at 02:08:23PM +, Edgardo Portal wrote:
> On 2010-03-29, Juan Backson wrote:
> >
> > Hi,
> >
> > I am using Postgres to store CDR data for VoIP switches. The data
> > size quickly grows to a few TB.
On 2010-03-29, Juan Backson wrote:
>
> Hi,
>
> I am using Postgres to store CDR data for VoIP switches. The data size
> quickly grows to a few TB.
>
> What I would like to do is to be able to regularly archive the oldest
> data so that only the most recent 6 months of data is available.
Hi,

Instead of dropping the table, I would like to archive the old table into a
format that can be read and retrieved later.

Can I pg_dump each child table?

What is the best way to do it? Export the data to CSV and tar.gz it, or back
it up in pg_dump's archive format?

thanks,
jb
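For what it's worth, both options can be sketched roughly like this. This is only an illustration, not a recommendation from the thread; the database name and the monthly child-table name (cdr_2009_09) are made-up examples:

```shell
# Option 1: dump a single child table in pg_dump's custom archive format.
# The result is compressed and can be restored later with pg_restore.
pg_dump -Fc --table=cdr_2009_09 mydb > cdr_2009_09.dump

# To bring that partition back later:
#   pg_restore -d mydb cdr_2009_09.dump

# Option 2: export the child table to CSV and compress it.
psql -d mydb -c "\copy cdr_2009_09 TO 'cdr_2009_09.csv' WITH CSV HEADER"
gzip cdr_2009_09.csv

# Once the archive has been verified, the old partition can be dropped:
#   psql -d mydb -c "DROP TABLE cdr_2009_09;"
```

The trade-off is roughly: the custom format keeps the table definition with the data and restores directly into Postgres, while CSV is larger but readable by tools other than Postgres.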
In response to Juan Backson:
> Hi,
>
> I am using Postgres to store CDR data for VoIP switches. The data size
> quickly grows to a few TB.
>
> What I would like to do is to be able to regularly archive the oldest data so
> only the most recent 6 months of data is available.
>
> All that old data will be stored in a format that can be read and retrieved.
Hi,

I am using Postgres to store CDR data for VoIP switches. The data size
quickly grows to a few TB.

What I would like to do is to be able to regularly archive the oldest data
so that only the most recent 6 months of data is available.

All that old data will be stored in a format that can be read and retrieved.
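A common way to set this kind of rolling window up, in the inheritance-based partitioning available in Postgres of this era, is one child table per month, so the oldest month can be dumped and dropped as a unit. This is only a sketch; the table and column names are hypothetical:

```sql
-- Parent table; it holds no rows itself.
CREATE TABLE cdr (
    call_id     bigint,
    started_at  timestamptz NOT NULL,
    duration    integer,
    src         text,
    dst         text
);

-- One child table per month. The CHECK constraint lets the planner's
-- constraint_exclusion skip partitions outside the queried range.
CREATE TABLE cdr_2010_03 (
    CHECK (started_at >= '2010-03-01' AND started_at < '2010-04-01')
) INHERITS (cdr);

-- Archiving the oldest month then means dumping that one child table
-- (e.g. with pg_dump --table) and dropping it, which is instantaneous
-- compared with DELETEing rows from one large table.
```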