On 1/22/22 08:08, Guillermo Martin via Bacula-users wrote:
> Hello everyone,
>
> I've set up Bacula server 11.0.5.7 Community Edition with libs3 and
> PostgreSQL on CentOS 7 to upload the backups to AWS S3.
>
> Everything is working fine, but I have a question about Archive Device, as
> we only want to save the backups in AWS S3 (one location), not on local disk
> and in AWS. If we set a local server path, it saves data on the local server
> and in AWS S3; if I mount the S3 bucket as in this example, it duplicates
> the backups. If I remove the "Archive Device" line, it doesn't work... I
> couldn't find how to do this in the Bacula documentation.
>
> I don't know how to set up the Bacula server to save the backups only in AWS.
>
> Could anyone please explain how I can do it?


Hello Guille,

The S3 Cloud plugin (well, actually all of the SD cloud plugins) was designed
with a couple of things in mind:

- The cloud companies are usually happy to take in your data for free, but they 
charge egress fees when you download it.

- Keeping this in mind, the local cache was designed so that you can have
fast, local (i.e., free) restores for some of your data - data that has not
yet been pruned or purged from the local cache.

- Then, for other data that must be retrieved from the cloud, the cloud volume
'parts' were designed so that you have control over the size of each part
(MaximumPartSize). This way, Bacula downloads the smallest number of cloud
parts needed when you restore some files from a job (again, keeping your
download costs down). See the Device resource sketch just after this list.
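
For reference, a minimal cloud Device resource in bacula-sd.conf might look
like the sketch below. The resource names, cache path, and the 10 MB part
size are placeholders for illustration, not recommendations. Note that
'Archive Device' points at the local cache directory, never at a mounted
S3 bucket:

   Device {
     Name = S3-CloudDevice                     # placeholder name
     Device Type = Cloud
     Cloud = AWS-S3-Cloud                      # the Cloud resource (sketched further below)
     Archive Device = /opt/bacula/cloudcache   # the LOCAL CACHE dir, not the bucket itself
     Maximum Part Size = 10 MB                 # smaller parts = fewer bytes per restore download
     Media Type = CloudType
     Label Media = yes
     Random Access = yes
     Automatic Mount = yes
     Removable Media = no
     Always Open = no
   }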


There are a couple of settings that you have control over to help keep the
required amount of local cache storage to a minimum:

- TruncateCache = AfterUpload   (Options: No, AfterUpload, AtEndOfJob - default
is No)
- Upload = EachPart   (default is No)

Using these two Cloud resource settings, Bacula will upload each part as soon
as it has finished writing to the local cache, and then, once the part has
been successfully uploaded to the Cloud, it will be truncated (i.e., deleted)
from the cache. A sketch of a Cloud resource using these settings follows.
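
As a rough sketch only (the bucket, credentials, and names below are
placeholders you would replace with your own), the Cloud resource in
bacula-sd.conf could look something like this:

   Cloud {
     Name = AWS-S3-Cloud              # referenced by the Device resource above
     Driver = "S3"
     HostName = "s3.amazonaws.com"
     BucketName = "my-bacula-bucket"  # placeholder bucket name
     AccessKey = "XXXXXXXXXXXXXXXX"   # placeholder credentials
     SecretKey = "XXXXXXXXXXXXXXXX"
     Protocol = HTTPS
     UriStyle = VirtualHost
     TruncateCache = AfterUpload      # delete each local cache part once uploaded
     Upload = EachPart                # upload each part as soon as it is written
   }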

You can also speed things along with these settings:

- MaximumConcurrentUploads: The default is 3
- MaximumConcurrentDownloads: The default is 3
- MaximumUploadBandwidth: The default is unlimited
- MaximumDownloadBandwidth: The default is unlimited
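
These also belong in the Cloud resource. For example (the values here are
arbitrary, for illustration only):

   Cloud {
     ...
     MaximumConcurrentUploads = 6       # raised from the default of 3
     MaximumConcurrentDownloads = 6
     MaximumUploadBandwidth = 50MB/s    # cap this if backups saturate your uplink
     MaximumDownloadBandwidth = 50MB/s
   }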


And finally, consider that while the above TruncateCache and Upload settings
will keep the required space in the local cache to a minimum during backups,
if you need to restore an entire job, the FULL space the job used will be
required in the cache. This is because Bacula will download all of the
required parts to the local cache first, and only once all parts have been
downloaded will the data be sent to the FD.

There is a feature request for the SD to stream the restore directly to the
FD, but I am not sure whether it has made it onto the roadmap.


Hope this helps,
Bill

--
Bill Arlofski
w...@protonmail.com


