Russell Howe wrote:
Well, the time is nigh. Our SQLite database has hit 1.3GB and creates a
900MB dump file, which is big enough to cause our bacula Xen virtual
machine to run out of space when running a catalog backup.
We have a Postgres (7.4) server with plenty of space, so I thought I'd
try importing the catalog into that. Things didn't go so smoothly, so I
ended up knocking up a script to do the import.
I haven't actually finished it yet, and I know there are some
optimisations which could be made, but I hope this is useful to someone:
What could be better:
* Automatic reorganisation of CREATE statements so that they execute in
the correct order
* Command-line parameters
* Handling of arbitrary conversions (Postgres -> MySQL, MySQL ->
Postgres, SQLite -> MySQL, MySQL -> SQLite, Postgres -> SQLite)
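The script itself isn't attached here, but to give a flavour of the kind of rewriting such a conversion involves, here is a minimal sketch. The rules below (dropping sqlite_sequence rows, mapping AUTOINCREMENT to SERIAL) are illustrative assumptions about common SQLite-to-Postgres incompatibilities, not the contents of the actual script:

```python
import re

# Hypothetical sketch of the sort of rewriting a SQLite -> Postgres dump
# converter has to do. The rules below are illustrative assumptions, not
# the actual script from this thread.

def convert_line(line):
    """Translate one line of a SQLite dump into Postgres-friendly SQL.

    Returns None for lines that should be dropped entirely.
    """
    # SQLite's internal bookkeeping table has no Postgres counterpart.
    if 'sqlite_sequence' in line:
        return None
    # SQLite autoincrement columns map naturally onto Postgres SERIAL.
    line = re.sub(r'INTEGER PRIMARY KEY AUTOINCREMENT',
                  'SERIAL PRIMARY KEY', line, flags=re.IGNORECASE)
    return line

def convert(src, dst):
    """Filter a whole dump, dropping lines convert_line rejects."""
    for line in src:
        out = convert_line(line)
        if out is not None:
            dst.write(out)
```

Fed the output of `sqlite3 bacula.db .dump`, something along these lines could in principle be piped straight into psql, though a real converter also has to deal with quoting and type differences.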
Note that it would be a good idea to turn off fsync on Postgres while
you do the import. I haven't benchmarked it, but it should make things
go a good bit quicker.
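For reference, the knob in question lives in postgresql.conf; a sketch (check the exact spelling against the documentation for your Postgres version, and remember to turn it back on once the import is done):

```
# postgresql.conf -- during the bulk import only; re-enable afterwards!
fsync = off
```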
I haven't actually tested this yet, so it's probably broken and riddled
with bugs. Still, enjoy :)
Wouldn't it be nice to have scripts for most database migrations?
(Though there's probably no need for anything-to-SQLite conversions.)
Personally, I would just switch databases and wait for the new one to be
populated with fresh entries. There's no real need to import the old
data, since you can restore without having it in the database.
Florian
_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users