Hmm, as you can read in the other thread, I can do some things with shell
scripts, but don't count on me for manipulating Java classes (if you want
a good job done)...

For information, here is how I work with the dump file produced by a backend backup:

1) grep -v 'INSERT' <path_to_backup> > /tmp/dump.sql
2) edit this file to cut out the section where it disables the constraints
and keys
3) grep -A 3 -B 3 'INSERT' <path_to_backup> >> /tmp/dump.sql
4) paste the section cut at step 2 at the end of /tmp/dump.sql
5) in the restore command, use /tmp/dump.sql instead of <path_to_backup>
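
For clarity, here is the whole recipe as one shell sketch. The tiny stand-in dump and the DISABLE/ENABLE KEYS markers used to cut the section in step 2 are assumptions; adjust the paths and markers to match your real dump:

```shell
#!/bin/sh
# Sketch of the five steps above. Paths and section markers are
# assumptions -- adapt them to your real backup dump.
BACKUP=/tmp/backup.sql

# A tiny stand-in dump so the sketch runs end to end:
cat > "$BACKUP" <<'EOF'
CREATE TABLE t (a INT);
/*!40000 ALTER TABLE `t` DISABLE KEYS */;
INSERT INTO t VALUES (1);
INSERT INTO t VALUES (2);
/*!40000 ALTER TABLE `t` ENABLE KEYS */;
EOF

# 1) keep everything except the INSERT statements (schema only)
grep -v 'INSERT' "$BACKUP" > /tmp/dump.sql

# 2) cut out the section that disables the constraints and keys
#    (the DISABLE/ENABLE KEYS markers are hypothetical)
sed -n '/DISABLE KEYS/,/ENABLE KEYS/p' /tmp/dump.sql > /tmp/keys.sql
sed '/DISABLE KEYS/,/ENABLE KEYS/d' /tmp/dump.sql > /tmp/dump.tmp \
  && mv /tmp/dump.tmp /tmp/dump.sql

# 3) append the INSERTs (with a little context) from the original backup
grep -A 3 -B 3 'INSERT' "$BACKUP" >> /tmp/dump.sql

# 4) paste the section cut at step 2 at the end
cat /tmp/keys.sql >> /tmp/dump.sql

# 5) restore from /tmp/dump.sql instead of the original backup:
#    mysql mydb < /tmp/dump.sql
```

Note that step 3's `-A 3 -B 3` context lines may duplicate a few schema lines around each INSERT, exactly as the original grep command does.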

My restore time went from 7 hours down to 3h40, and it probably used less
disk space too (I re-used the same MySQL ibdata file, so I don't know the
real size of my data).

On a multi-core architecture it should be possible to gain even more by
splitting the INSERT section into chunks and loading them with multiple
parallel mysql client processes (as many as there are CPU cores, in fact).
I believe that is what Maatkit does.

Cheers,

-- 
Damien Hardy

>
> Hi Damien,
>
> Note that you can build your own backuper easily using the
> ScriptBackuper and invoke your own scripts. Feel free also to modify the
> existing MySQL Backuper to fit your needs.
> If you have modifications to contribute back, I'd be happy to commit them
> for you.
>
> Thanks again for your feedback,
> manu
>
>> Another way to do it:
>> - "mysqldump -d" ... to get only the database structure.
>> - add the "alter table" statements
>> - then append the output of "mysqldump -t" to get the data without the
>> create table statements.
>>
>> Cheers,
>>
>>
>
>
> --
> Emmanuel Cecchet
> FTO @ Frog Thinker
> Open Source Development & Consulting
> --
> Web: http://www.frogthinker.org
> email: m...@frogthinker.org
> Skype: emmanuel_cecchet
>
> _______________________________________________
> Sequoia mailing list
> Sequoia@lists.forge.continuent.org
> http://forge.continuent.org/mailman/listinfo/sequoia
>
>
