On 07/29/2012 04:58 PM, Z F wrote:
Hello Again
I need to back up data from a scientific instrument. The problem is that the
instrument has limited space, so data has to be moved from the instrument onto
another drive. From time to time, the old data needs to be copied back to the
instrument. rdiff-backup is handy because a regular "cp" command will restore
the latest version of a data file. Thus users can be given read access to the
backup and they can restore the data they need by themselves.
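(For concreteness, a minimal sketch of that workflow with hypothetical paths;
/mnt/instrument/data stands for wherever the instrument's storage is reachable:

  rdiff-backup /mnt/instrument/data /srv/backup/data   # mirror plus reverse increments on the big drive
  cp /srv/backup/data/run42.dat /mnt/instrument/data/  # the latest version comes back with a plain cp

rdiff-backup keeps the newest version of every file as an ordinary mirror, so
read access to /srv/backup/data is enough for users to copy files back
themselves.)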
The above confuses me. You say the old data needs to be copied back, but
then you say rdiff-backup is handy because it allows the latest version
to be copied back. Not sure I am following the logic :)
Sorry for not being clear about what I would like to do.
Basically, I need easy access to files which have been deleted from the system.
Files which were present on the system at backup time and were deleted before a
subsequent backup can easily be found (using find) and restored with a cp
command.
Files which were deleted from the source directory before the last backup are
also deleted from the "mirror" but are present in the rdiff-backup history.
I do not see an easy way to browse or search the deleted files by creation date
or file name.
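(One way to hunt for them, assuming a standard rdiff-backup layout and
hypothetical file names: a file that was deleted between backups is kept as a
dated snapshot increment under rdiff-backup-data, so find can search it by name:

  find rdiff_mirror_directory/rdiff-backup-data/increments -name 'run42*snapshot*'

The timestamp embedded in the increment's name shows the last backup at which
the file was still present.)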
I see

  rdiff-backup --list-at-time ??D out-dir/subdir

which will work if I know the date when the file was created. Is this my only
option?
http://www.nongnu.org/rdiff-backup/rdiff-backup.1.html
See the section on time formats.
Using --list-changed-since you can go back an interval or a number of backups
(where 0B is the current backup and 1B would be the next oldest).
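For example (directory names as in your message):

  rdiff-backup --list-changed-since 3B out-dir/subdir          # files changed since three backups ago
  rdiff-backup --list-changed-since 10D out-dir/subdir         # files changed in the last 10 days
  rdiff-backup --list-at-time 2012-07-01 out-dir/subdir        # listing as of a calendar date

A file deleted since the given time should show up in the --list-changed-since
output, so you do not need to know its exact creation date.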
Still not sure I am following correctly. Even so, why not move the data
from the instrument to a directory on the hard drive and then run
rdiff-backup from that directory to another directory?
Ideally, old data do not change; only new data are created. But users sometimes
do stuff to old data, and I have to keep track in the backup of what they do. It
is possible that they changed old data by mistake or on purpose. Thus, moving
data to a different drive does not always work. If old data gets modified, I
need to create a "revision". It is rare, but it might happen.
I thought rdiff-backup would keep the revision history for me.
It does.
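(A sketch of pulling back an earlier revision with --restore-as-of/-r,
hypothetical paths:

  rdiff-backup -r 10D rdiff_mirror_directory/subdir/run42.dat /tmp/run42.dat.10-days-ago

which restores the file as it existed ten days ago, before any later
modification.)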
Should I consider the CVS revision control system as a backup tool? It works
well on text files; I am not sure if it is good for binary data. In this
scenario, new data would simply be added to the repository. The old data gets
revisions (if that happens) and I back up the CVS repository. The CVS repository
can be searched for file names, so the old deleted files can be found...
The CVS route does not sound correct to me, though.
Yea, not sure how this helps. To get revisions you need to do commits.
You still need to know that a file was changed to schedule a commit.
Do you have any thoughts or suggestions? Am I talking nonsense again?
I still think the below will work:
instrument --> copy_directory --> rdiff_mirror_directory
Per another post, you could use rsync from the instrument to copy_directory. If
you run rsync without the --del switch it will keep the deleted files in
copy_directory. Changed files will naturally be changed, but the revisions will
be kept in the rdiff_mirror_directory.
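A rough sketch of that pipeline, with hypothetical paths and assuming the
instrument is reachable over ssh or mounted locally:

  rsync -a instrument:/data/ /srv/copy_directory/                # no --del, so deleted files stay in the copy
  rdiff-backup /srv/copy_directory /srv/rdiff_mirror_directory   # mirror of the copy plus reverse increments

Run in that order on a schedule: copy_directory always holds at least the latest
version of everything, and rdiff_mirror_directory holds the change history.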
Thanks
ZF
--
Adrian Klaver
adrian.kla...@gmail.com
_______________________________________________
rdiff-backup-users mailing list at rdiff-backup-users@nongnu.org
https://lists.nongnu.org/mailman/listinfo/rdiff-backup-users
Wiki URL: http://rdiff-backup.solutionsfirst.com.au/index.php/RdiffBackupWiki