Mario Ruiz wrote:
Not really,
a little bit of hard disk is all you need to have a dual boot machine
and have your own sandbox. Not that I would do it, but it is a way of
jailing the environment and minimizing your risk.
Of course, you copy the data over once from your backup source, and at
the end of the test you wipe it out; you do not leave the data behind
or take it back to your real network. The data must be disposable to
minimize risk.
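The throwaway workflow described above can be sketched roughly as
follows (the paths and file names are hypothetical, and a plain rm is
only a sketch of "wiping" -- on real media you would want a secure
erase tool):

```shell
#!/bin/sh
# Sketch of a disposable-data test cycle: stage a copy of the backup
# data in a throwaway directory, test against it, then remove it.
SANDBOX=$(mktemp -d)                        # throwaway working area

# Stand-in for restoring a copy of the backup data into the sandbox.
echo "sample record" > "$SANDBOX/data.txt"

# ... run your tests against the files under $SANDBOX here ...

# Dispose of the data at the end of the test so none of it travels
# back to the real network.
rm -rf "$SANDBOX"
```

A simple rm does not overwrite the underlying blocks, so for genuinely
sensitive data a dedicated wiping tool (or destroying the disposable
partition) is the safer end to the cycle.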
Mario
I really like the idea of Google looking after my backups. GFS may be
the optimal file system for reliability of data access.
Virtualisation is going to add a whole new dimension to the practice of
performing backups and restores (not to mention uptimes). However,
having data snapshots on "permanent" media still sounds like a good
idea. Burning data to DVD remains my mid-term solution.
David
P.S. I note even the Herald is getting its head around virtualisation
(http://www.smh.com.au/news/biztech/are-you-being-served/2007/03/26/1174761375455.html?page=fullpage#)
although they haven't yet cottoned on to the advantages, both
developmental and technical, in the open source implementations such as
Xen.
_______________________________________________
Gpcg_talk mailing list
[email protected]
http://ozdocit.org/cgi-bin/mailman/listinfo/gpcg_talk