On 15.02.2021 at 09:13, Spadajspadaj wrote:
On 15.02.2021 08:04, 'Frank Kirschner | Celebrate Records GmbH' via
bareos-users wrote:
Finally, may I discuss the following example:
I have to archive audio, video, and print files from 3 departments as
"cold data", stored on tape:
First, I will copy all audio files to a local hard disk on the same host
where the tape drive is directly connected, because copying files over
the network from a different host is slower than writing to tape.
Second, I create a job with "Enabled = no" so it can be started
manually via the GUI:
Set the client to the local fd.
Set a fileset that points to the local directory where the data from
step #1 is stored.
Set the storage to the tape drive.
Define a pool where I have predefined some empty tapes.
Now run the job and archive the audio files.
When the job finishes successfully, would it be a good idea to delete
the collected audio files from the hard disk, then copy the video files
and start the job again? Or would it be better to create a separate job
with its own pool for each type of data (audio, video, print), with
tapes named accordingly: audio1, audio2 / video1, video2 ...?
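A per-type job along these lines might look roughly like the following
bareos-dir configuration sketch. All names here (AudioArchive,
TapeStorage, /archive/audio, the fd name) are made-up assumptions, not
taken from any real setup; the directives themselves (Enabled, Label
Format, etc.) are standard Bareos director directives:

```
# Hypothetical bareos-dir resources for a manual audio archive job.
# Resource names and paths are assumptions.
Job {
  Name = "AudioArchive"
  Type = Backup
  Level = Full
  Client = tapehost-fd             # the local fd on the tape host
  FileSet = "AudioArchiveFS"
  Storage = TapeStorage
  Pool = AudioPool
  Enabled = no                     # start manually via the GUI/console
}

FileSet {
  Name = "AudioArchiveFS"
  Include {
    Options { signature = MD5 }
    File = /archive/audio          # data collected in step #1
  }
}

Pool {
  Name = AudioPool
  Pool Type = Backup
  Label Format = "audio"           # tapes labeled audio0001, audio0002, ...
}
```

The video and print variants would only differ in the fileset path, the
pool, and the label format.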
Sorry for asking but you guys have more experience with such tape
scenario than a
green horn like me :-)
My first thought is that archiving is much more than just using a
backup solution to copy files. Archiving is a whole process which should
be designed with proper data security in mind (i.e. appropriate copy
redundancy and data verifiability).
Secondly, regarding "First, I will copy all audio files to a local hard
disk on the same host where the tape drive is directly connected,
because copying files over the network from a different host is slower
than writing to tape": not necessarily. That's what you use spooling
for.
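For reference, spooling is enabled per job; a minimal sketch (Spool Data
and Spool Size are real Job directives, the values are assumptions):

```
# Hypothetical spooling settings for a tape-backed job (values are assumptions).
Job {
  Name = "ExampleSpooledJob"
  # ... usual Job directives ...
  Spool Data = yes            # spool to local disk first, then despool to tape
  Spool Size = 50000000000    # per-job spool limit in bytes (~50 GB)
}
```

The spool directory and its overall size limit are set on the storage
daemon's Device resource.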
Spooling does not work for this scenario, because I have to back up
multiple clients; the manual says: "Each Job will reference only a
single client."
So I use a "run before" script which collects the data from the 3
clients. On each client, the files are placed in an "archiving" folder
manually by the operator.
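Such a "run before" script could be sketched roughly like this. The host
names, paths, and the choice of rsync over ssh are all assumptions; the
actual transfer line is left commented out:

```shell
#!/bin/sh
# Hypothetical RunBeforeJob script: pull the operator-staged "archiving"
# folders from the three department clients into the local staging directory.
# Hosts, paths, and the use of rsync/ssh are assumptions.

DEST=/archive/audio
CLIENTS="audio1.example.com audio2.example.com audio3.example.com"

# Build the rsync command for one client; echoed so it can be inspected
# (or logged) before anything is actually transferred.
build_cmd() {
    echo "rsync -a --remove-source-files ${1}:/srv/archiving/ ${DEST}/${1}/"
}

for host in $CLIENTS; do
    cmd=$(build_cmd "$host")
    echo "would run: $cmd"
    # $cmd            # uncomment to actually transfer
done
```

Each client's data lands in its own subdirectory of the staging area, so
a failed transfer from one client cannot silently mix with another's.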
Thirdly - I used a "copy and delete" scenario a few years ago, but I
had a slightly different setup, so my solution is not directly
applicable to you. I'd suggest you look into:
1) Dynamically creating a list of files to back up (this might involve
checking client files for ctime or querying the Bareos database to
verify whether a file has already been backed up)
2) Creating a post-job script which removes files that have already been
backed up properly (i.e. included in a given number of backup jobs, if
you want to have several copies) - this definitely involves querying the
director's database.
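Querying the director's catalog for the number of good copies might look
roughly like this. This is a sketch only: a PostgreSQL catalog is
assumed, the path and file name are placeholders, and the schema varies
between Bareos versions (recent versions keep the file name in
File.Name, older ones use a separate Filename table), so check your
schema first:

```sql
-- Hypothetical catalog query: count successful backup jobs ('T' = terminated OK)
-- that include a given file, so a cleanup script can decide whether it is
-- safe to delete the staged copy. Schema details vary between versions.
SELECT COUNT(DISTINCT j.JobId) AS copies
FROM File f
JOIN Path p ON p.PathId = f.PathId
JOIN Job  j ON j.JobId  = f.JobId
WHERE p.Path = '/archive/audio/'
  AND f.Name = 'track01.wav'
  AND j.Type = 'B'
  AND j.JobStatus = 'T';
```

A post-job cleanup script would only delete the staged file once
"copies" reaches the desired redundancy count.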
That's a good idea for my scenario. Thanks for this good hint,
Best regards,
MK
--
You received this message because you are subscribed to the Google Groups
"bareos-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email
to [email protected].
To view this discussion on the web visit
https://groups.google.com/d/msgid/bareos-users/fea94b50-c16d-11a3-f7e2-0d82d03d4aed%40celebrate.de.