On Mon, 23 Aug 2004 12:27:28 +0800, Miguel A Paraz <[EMAIL PROTECTED]> wrote:

> Is it practical to get a 'diff' between mysqldump output from
> day-to-day, and back that up for an incremental backup?

that depends on how large the database is.  i used to do this with a
database i worked with (postgres, but the principle is the same).
after a while, the database got big enough that just dumping the data
took a significant amount of time, and then doing the diff took even
longer.
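
the scheme being discussed looks roughly like this.  a minimal sketch,
using plain files in place of real mysqldump/pg_dump output (all file
names here are made up for illustration): keep yesterday's full dump,
diff today's full dump against it, and store only the diff as the
incremental.  restoring means applying the diff back onto the older
full dump with patch.

```shell
#!/bin/sh
# sketch of a diff-based incremental backup.  plain text files stand
# in for real dump output; in practice you'd generate them with
# mysqldump (or pg_dump) instead of printf.
set -e
tmp=$(mktemp -d)

# "yesterday's dump" and "today's dump"
printf 'row1\nrow2\n'       > "$tmp/dump-mon.sql"
printf 'row1\nrow2\nrow3\n' > "$tmp/dump-tue.sql"

# the incremental: a unified diff between the two full dumps.
# diff exits nonzero when the files differ, so tolerate that.
diff -u "$tmp/dump-mon.sql" "$tmp/dump-tue.sql" > "$tmp/incr.diff" || true

# to restore, apply the diff onto a copy of the older full dump
cp "$tmp/dump-mon.sql" "$tmp/restored.sql"
patch -s "$tmp/restored.sql" "$tmp/incr.diff"

# the restored file should now match today's dump
if cmp -s "$tmp/restored.sql" "$tmp/dump-tue.sql"; then
    echo "restore matches today's dump"
fi
```

note that the diff step needs both full dumps on disk (and diff holds
a lot in memory), which is exactly where this approach starts to hurt
as the database grows.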

eventually i gave up: the diff would fail because the box was running
out of RAM and swap (we were at about a million rows by then).  adding
RAM helps; adding swap maybe helps, though i think it mostly just
slows you down.  when you can't add any more RAM, it's probably time
to give up and find some other solution.

it's easy to test.  if the whole run (the pg_dump plus the diff
against yesterday's pg_dump) finishes in less than 30 minutes or so,
you're probably fine and can use that solution for at least six months
to a year.  at some point, you're going to hit a wall and will either
have to throw more hardware at the problem or find some other
solution.

good luck.

tiger

-- 
Gerald Timothy Quimpo http://bopolissimus.sni.ph
[EMAIL PROTECTED] [EMAIL PROTECTED] [EMAIL PROTECTED]
Public Key: "gpg --keyserver pgp.mit.edu --recv-keys 672F4C78"
                         Mene sakhet ur-seveh
--
Philippine Linux Users' Group (PLUG) Mailing List
[EMAIL PROTECTED] (#PLUG @ irc.free.net.ph)
Official Website: http://plug.linux.org.ph
Searchable Archives: http://marc.free.net.ph