Thanks. I'm taking a look at it. May take a while to see if it fits my
needs. It sounds promising.
Joe
On 01/31/2013 04:36 PM, Henri Shustak wrote:
You may be interested in having a look at LBackup http://www.lbackup.org,
an open source (released under the GNU GPL) backup system.
Here's my rsync based backup system.
http://wikisend.com/download/377440/rsync_backup-0.26.tar.gz
It's an rsync based backup system utilizing hard links
to reduce storage requirements. It supports both push
and pull. It uses public keys with ssh for the transport.
It works and I've used it
You may be interested in having a look at LBackup http://www.lbackup.org, an
open source (released under the GNU GPL) backup system.
Essentially, LBackup is a wrapper for rsync. If you are working on your own
script, feel free to look at how LBackup works (primarily written in bash at
present).
Kevin Korb wrote:
On 01/22/13 18:12, Kevin Korb wrote:
That is the old way that pre-dates --link-dest. Instead of cp -al
daily.02 daily.01 you can do a mkdir daily.01 then an rsync ...
--link-dest=../daily.02 daily.01
Rsync then doesn't need any --delete and you don't bother making
On 01/23/2013 02:15:06 AM, Voelker, Bernhard wrote:
Kevin Korb wrote:
On 01/22/13 18:12, Kevin Korb wrote:
That is the old way that pre-dates --link-dest. Instead of cp -al
daily.02 daily.01 you can do a mkdir daily.01 then an rsync ...
--link-dest=../daily.02 daily.01
I'm
I handle this by actually backing up to backupname.incomplete. Once
the backup is complete I then rename it to
backupname.YYYY-mm-dd.HH-MM-SS. That way all of the backups with
date+time stamps in the name are completed backups and if a backup
fails
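Spelled out as a runnable sketch (the paths and names here are made up, and cp -a stands in for the rsync copy step so the sketch runs without rsync installed):

```shell
set -e
cd "$(mktemp -d)"            # scratch area for the demonstration
mkdir -p src backups
echo "hello" > src/file.txt

work=backups/backup.incomplete
cp -a src "$work"            # the copy step; a real run would rsync here

# Rename only after the copy succeeds, so every timestamped directory
# is a finished backup and a failed run leaves only *.incomplete behind.
mv "$work" "backups/backup.$(date +%Y-%m-%d.%H-%M-%S)"
```

A monitoring script then only has to look for stale *.incomplete directories to spot failed runs.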
Oh, absolutely my script has flaws in it. I will never claim anything I
create or use to not have any flaws. :)
The machines it runs on are strictly servers that rarely get shut down
though. The only time that happens is if there's a power failure that
lasts longer than 30 minutes (which has
Here is one I wrote up for a LUG presentation that is specifically
about doing it yourself:
http://sanitarium.net/golug/rsync_backups_2010.html
On 01/22/13 02:31, Joe wrote:
There have been a lot of posts on the list lately about issues with
hard
Thanks for the reply. I know what hard and soft links are and have some
idea of how they relate to backup.
What I need is a tutorial on how all of that works with rsync. I can
see that there are a lot of considerations as to which options to use
for different situations and maybe some general
Hi Joe,
If you want to understand hard-links, just take a look at Wikipedia :
http://en.wikipedia.org/wiki/Hard_link#Example
I think it's pretty easy to understand.
To understand how hard-links (and rsync) can help you make strong incremental
backups, head over
Thank you! I will read it and see where to go from there.
Joe
On 01/22/2013 12:44 PM, Kevin Korb wrote:
Here is one I wrote up for a LUG presentation that is specifically
about doing it yourself:
http://sanitarium.net/golug/rsync_backups_2010.html
On 01/22/13 02:31, Joe wrote:
There
Thanks. Will read.
Joe
On 01/22/2013 05:31 PM, François wrote:
Hi Joe,
If you want to understand hard-links, just take a look at Wikipedia :
http://en.wikipedia.org/wiki/Hard_link#Example
I think it's pretty easy to understand.
To understand how hard-links (and rsync) can help you make
Joe, this is specific to having a backup with rsync. The way I use links
for rsync is by not using the link (ln) command at all, but instead using
cp's built-in -l (link) option. It looks something like this:
1) delete the oldest backup (simple 'rm' command)
2) shift the rest (with 'mv') by 1,
That is the old way that pre-dates --link-dest. Instead of cp -al
daily.02 daily.01 you can do a mkdir daily.01 then an rsync ...
--link-dest=../daily.02 daily.01
Rsync then doesn't need any --delete and you don't bother making any
hard links that
Also, if you put dates and times in the file names instead of .01,
.02, etc you don't have to do any mv's, you can easily tell when each
backup was run, and ls can tell you which the newest and oldest are.
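For example, with timestamped names the shell's default lexicographic sort is also chronological order (the directory names below are invented for the demonstration):

```shell
set -e
cd "$(mktemp -d)"
# A real run would create these names with date(1) at backup time.
mkdir backup.2013-01-20.02-00-00 backup.2013-01-21.02-00-00 backup.2013-01-22.02-00-00
ls | head -n 1   # oldest backup
ls | tail -n 1   # newest backup
```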
On 01/22/13 18:12, Kevin Korb wrote:
That
Yeah I know, there's more I need to do to optimize. That script is
probably nearing ten years old. It's been running without a single problem,
so I never bothered to revisit it; even after updating machines, I just
copy it over and keep on trucking.
On Tue, Jan 22, 2013 at 4:14 PM, Kevin Korb
There have been a lot of posts on the list lately about issues with hard
links. It has been very interesting, but I don't understand it very
thoroughly. I haven't used hard links for anything yet. I've used
symlinks - not for backups, of course - and have seen them get broken or
deleted in
This may help: (man ln)
A hard link to a file is
indistinguishable from the original directory entry; any changes to a
file are effectively independent of the name used to reference the file.
Hard links may not normally refer to directories and may not span file
systems.
Assuming you do many
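The man page excerpt above is easy to confirm at a shell prompt (file names are made up):

```shell
set -e
cd "$(mktemp -d)"
echo "one" > original.txt
ln original.txt hardlink.txt   # second directory entry for the same inode
echo "two" > hardlink.txt      # writing through either name updates the data
cat original.txt               # both names now read "two"
rm original.txt                # removing one name leaves the data reachable
cat hardlink.txt
```

This is exactly why hard-link snapshots are space-efficient: each extra snapshot of an unchanged file is just another directory entry, and the data survives until the last name is removed.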