Re: [Bacula-users] bacula archive/off-site-backup to S3 storage, what's the best way?

2020-10-07 Thread Erik Geiger
Hi Kern,

Thanks for the update on that. Looks like I missed it (your update _and_
the documentation, though admittedly I hadn't read the release notes).

Kind regards,

Erik


On Thu, Feb 20, 2020 at 3:46 PM Kern Sibbald  wrote:

> Hello,
>
> Concerning lack of documentation as you noted below:  yes, we can always
> improve documentation.  In this case, the need for special libraries is
> documented.  For example, in the ReleaseNotes file it is documented
> under three releases.  The most recent one states:
>
> =
> Release 9.4.3
>
> This is a bug fix release for version 9.4.2.  It includes a number of bug
> fixes and patches.
>
> [snip] ...
>
> S3 driver: If you are trying to build the S3 drivers, please remember to
> use the
> community supplied (from Bacula Enterprise) version of libs3.so found at:
>
> https://www.bacula.org/downloads/libs3-20181010.tar.gz
> ==
>
>
> On 2/17/20 3:27 PM, Erik Geiger wrote:
>
>
> On Mon, Feb 17, 2020 at 1:41 PM Radosław Korzeniewski <
> rados...@korzeniewski.net> wrote:
>
>> Hello,
>>
>> On Tue, Feb 11, 2020 at 6:41 PM Erik Geiger  wrote:
>>
>>> You are referring to this documentation?
>>> https://www.bacula.org/9.4.x-manuals/en/main/New_Features_in_9_4_0.html#SECTION00300100
>>>
>>
>> Yes.
>>
>>
>>> I wasn't able to build bacula with cloud support and I can't use the rpm
>>> packages from bacula as they aren't supported by the puppet module I'm
>>> using. So I was looking for something like an "after" job syncing to S3
>>> with aws cli or the like.
>>>
>>
>> Sorry for that. You should create a ticket at bugs.bacula.org to show
>> that something is wrong about it.
>>
>
> Hi Radoslaw,
>
> It turned out that I was able to build when using the libs3 provided by
> bacula [https://www.bacula.org/downloads/libs3-20181010.tar.gz].
> Sadly, that wasn't really documented.
>
>
>>
>>> But if the Cloud Storage functionality is the way to go I'll figure out
>>> how to compile with S3 support.
>>> So if I get the documentation right, the backup is first stored on the
>>> local disk and afterwards moved to the cloud, while I could still do a
>>> restore from local disk as long as I configure the "Cache Retention",
>>> right?
>>>
>>
>> The default local disk backup for cloud storage works as a cache only
>> with fully configurable behavior and retention. You can use it as the
>> single archive storage, so during backup all your data will be saved on
>> local disks and then synced into S3 as configured. You can use it as a DR
>> storage using Copy Jobs where your local disks will be your main storage
>> which will be copied into a storage cloud after a local backup. All
>> possible configurations depend on your requirements.
>> I hope it helps.
>>
>
> I do have the cloud backup running now. All works even better than
> expected regarding the S3 upload. I also realised that I can use "Cache
> Retention" so the local disk won't run out of disk space while still
> allowing fast restores within the "Cache Retention" period.
>
> Thanks again,
>
> Erik
>
>> best regards
>> --
>> Radosław Korzeniewski
>> rados...@korzeniewski.net
>>
>
>
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] bacula archive/off-site-backup to S3 storage, what's the best way?

2020-02-20 Thread Kern Sibbald

  
  
Hello,

Concerning lack of documentation as you noted below: yes, we can always
improve documentation.  In this case, the need for special libraries is
documented.  For example, in the ReleaseNotes file it is documented
under three releases.  The most recent one states:

=
Release 9.4.3

This is a bug fix release for version 9.4.2.  It includes a number of bug
fixes and patches.

[snip] ...

S3 driver: If you are trying to build the S3 drivers, please remember to
use the
community supplied (from Bacula Enterprise) version of libs3.so found at:

https://www.bacula.org/downloads/libs3-20181010.tar.gz
==

On 2/17/20 3:27 PM, Erik Geiger wrote:

> On Mon, Feb 17, 2020 at 1:41 PM Radosław Korzeniewski <
> rados...@korzeniewski.net> wrote:
>
>> Hello,
>>
>> On Tue, Feb 11, 2020 at 6:41 PM Erik Geiger  wrote:
>>
>>> You are referring to this documentation?
>>> https://www.bacula.org/9.4.x-manuals/en/main/New_Features_in_9_4_0.html#SECTION00300100
>>>
>>
>> Yes.
>>
>>> I wasn't able to build bacula with cloud support and I can't use the rpm
>>> packages from bacula as they aren't supported by the puppet module I'm
>>> using. So I was looking for something like an "after" job syncing to S3
>>> with aws cli or the like.
>>>
>>
>> Sorry for that. You should create a ticket at bugs.bacula.org to show
>> that something is wrong about it.
>>
>
> Hi Radoslaw,
>
> It turned out that I was able to build when using the libs3 provided by
> bacula [https://www.bacula.org/downloads/libs3-20181010.tar.gz].
> Sadly, that wasn't really documented.
>
>>> But if the Cloud Storage functionality is the way to go I'll figure out
>>> how to compile with S3 support.
>>> So if I get the documentation right, the backup is first stored on the
>>> local disk and afterwards moved to the cloud, while I could still do a
>>> restore from local disk as long as I configure the "Cache Retention",
>>> right?
>>>
>>
>> The default local disk backup for cloud storage works as a cache only
>> with fully configurable behavior and retention. You can use it as the
>> single archive storage, so during backup all your data will be saved on
>> local disks and then synced into S3 as configured. You can use it as a DR
>> storage using Copy Jobs where your local disks will be your main storage
>> which will be copied into a storage cloud after a local backup. All
>> possible configurations depend on your requirements.
>> I hope it helps.
>>
>
> I do have the cloud backup running now. All works even better than
> expected regarding the S3 upload. I also realised that I can use "Cache
> Retention" so the local disk won't run out of disk space while still
> allowing fast restores within the "Cache Retention" period.
>
> Thanks again,
>
> Erik
>
>> best regards
>> --
>> Radosław Korzeniewski
>> rados...@korzeniewski.net

___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] bacula archive/off-site-backup to S3 storage, what's the best way?

2020-02-17 Thread Radosław Korzeniewski
Hello,

On Mon, Feb 17, 2020 at 3:27 PM Erik Geiger  wrote:

>
> On Mon, Feb 17, 2020 at 1:41 PM Radosław Korzeniewski <
> rados...@korzeniewski.net> wrote:
>
>> Hello,
>>
>> On Tue, Feb 11, 2020 at 6:41 PM Erik Geiger  wrote:
>>
>>> You are referring to this documentation?
>>> https://www.bacula.org/9.4.x-manuals/en/main/New_Features_in_9_4_0.html#SECTION00300100
>>>
>>
>> Yes.
>>
>>
>>> I wasn't able to build bacula with cloud support and I can't use the rpm
>>> packages from bacula as they aren't supported by the puppet module I'm
>>> using. So I was looking for something like an "after" job syncing to S3
>>> with aws cli or the like.
>>>
>>
>> Sorry for that. You should create a ticket at bugs.bacula.org to show
>> that something is wrong about it.
>>
>
> Hi Radoslaw,
>
> It turned out that I was able to build when using the libs3 provided by
> bacula [https://www.bacula.org/downloads/libs3-20181010.tar.gz].
> Sadly, that wasn't really documented.
>

Sorry about that. You can still file an issue ticket about it.


>
>
>>
>>> But if the Cloud Storage functionality is the way to go I'll figure out
>>> how to compile with S3 support.
>>> So if I get the documentation right, the backup is first stored on the
>>> local disk and afterwards moved to the cloud, while I could still do a
>>> restore from local disk as long as I configure the "Cache Retention",
>>> right?
>>>
>>
>> The default local disk backup for cloud storage works as a cache only
>> with fully configurable behavior and retention. You can use it as the
>> single archive storage, so during backup all your data will be saved on
>> local disks and then synced into S3 as configured. You can use it as a DR
>> storage using Copy Jobs where your local disks will be your main storage
>> which will be copied into a storage cloud after a local backup. All
>> possible configurations depend on your requirements.
>> I hope it helps.
>>
>
> I do have the cloud backup running now. All works even better than
> expected regarding the S3 upload. I also realised that I can use "Cache
> Retention" so the local disk won't run out of disk space while still
> allowing fast restores within the "Cache Retention" period.
>

Great! I'm very happy that it is working now and that I could help.

best regards
-- 
Radosław Korzeniewski
rados...@korzeniewski.net
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] bacula archive/off-site-backup to S3 storage, what's the best way?

2020-02-17 Thread Erik Geiger
On Mon, Feb 17, 2020 at 1:41 PM Radosław Korzeniewski <
rados...@korzeniewski.net> wrote:

> Hello,
>
> On Tue, Feb 11, 2020 at 6:41 PM Erik Geiger  wrote:
>
>> You are referring to this documentation?
>> https://www.bacula.org/9.4.x-manuals/en/main/New_Features_in_9_4_0.html#SECTION00300100
>>
>
> Yes.
>
>
>> I wasn't able to build bacula with cloud support and I can't use the rpm
>> packages from bacula as they aren't supported by the puppet module I'm
>> using. So I was looking for something like an "after" job syncing to S3
>> with aws cli or the like.
>>
>
> Sorry for that. You should create a ticket at bugs.bacula.org to show
> that something is wrong about it.
>

Hi Radoslaw,

It turned out that I was able to build when using the libs3 provided by
bacula [https://www.bacula.org/downloads/libs3-20181010.tar.gz].
Sadly, that wasn't really documented.
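
In case it helps others, the rough shape of the build is sketched below.
This is a minimal outline only: it assumes a source build on a box with
the usual Bacula build dependencies, and the exact configure flag for the
S3/cloud driver has varied between releases, so check ./configure --help
first.

  # fetch and install the community libs3 that Bacula expects
  wget https://www.bacula.org/downloads/libs3-20181010.tar.gz
  tar xzf libs3-20181010.tar.gz
  (cd libs3-20181010 && make && sudo make install)   # installs libs3.so

  # rebuild Bacula against it; the flag name is an assumption, verify
  # it in ./configure --help for your version
  cd bacula-9.4.4
  ./configure --with-s3=/usr/local
  make && sudo make install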


>
>> But if the Cloud Storage functionality is the way to go I'll figure out
>> how to compile with S3 support.
>> So if I get the documentation right, the backup is first stored on the
>> local disk and afterwards moved to the cloud, while I could still do a
>> restore from local disk as long as I configure the "Cache Retention",
>> right?
>>
>
> The default local disk backup for cloud storage works as a cache only with
> fully configurable behavior and retention. You can use it as the single
> archive storage, so during backup all your data will be saved on local
> disks and then synced into S3 as configured. You can use it as a DR storage
> using Copy Jobs where your local disks will be your main storage which will
> be copied into a storage cloud after a local backup. All possible
> configurations depend on your requirements.
> I hope it helps.
>

I do have the cloud backup running now. All works even better than
expected regarding the S3 upload. I also realised that I can use "Cache
Retention" so the local disk won't run out of disk space while still
allowing fast restores within the "Cache Retention" period.
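
(For reference: the cache can also be managed by hand from bconsole,
which gained a "cloud" command in 9.4. The subcommands below follow the
New Features page linked earlier; the storage and volume names are
placeholders, so check "help cloud" in your bconsole.)

  cloud list storage=S3Cloud volume=Vol0001
  cloud upload storage=S3Cloud volume=Vol0001
  cloud truncate storage=S3Cloud volume=Vol0001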

Thanks again,

Erik

> best regards
> --
> Radosław Korzeniewski
> rados...@korzeniewski.net
>
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] bacula archive/off-site-backup to S3 storage, what's the best way?

2020-02-17 Thread Radosław Korzeniewski
Hello,

On Tue, Feb 11, 2020 at 6:41 PM Erik Geiger  wrote:

> You are referring to this documentation?
> https://www.bacula.org/9.4.x-manuals/en/main/New_Features_in_9_4_0.html#SECTION00300100
>

Yes.


> I wasn't able to build bacula with cloud support and I can't use the rpm
> packages from bacula as they aren't supported by the puppet module I'm
> using. So I was looking for something like an "after" job syncing to S3
> with aws cli or the like.
>

Sorry for that. You should create a ticket at bugs.bacula.org to show that
something is wrong about it.


> But if the Cloud Storage functionality is the way to go I'll figure out
> how to compile with S3 support.
> So if I get the documentation right, the backup is first stored on the
> local disk and afterwards moved to the cloud, while I could still do a
> restore from local disk as long as I configure the "Cache Retention",
> right?
>

The default local disk backup for cloud storage works as a cache only with
fully configurable behavior and retention. You can use it as the single
archive storage, so during backup all your data will be saved on local
disks and then synced into S3 as configured. You can use it as a DR storage
using Copy Jobs where your local disks will be your main storage which will
be copied into a storage cloud after a local backup. All possible
configurations depend on your requirements.
I hope it helps.
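
To make that concrete, here is a trimmed-down configuration along the
lines of the example in the 9.4 manual linked earlier; the bucket,
credentials, paths, and retention values are placeholders to adapt:

  # bacula-sd.conf -- the cloud resource and a cloud-backed device
  Cloud {
    Name = S3Cloud
    Driver = "S3"
    HostName = "s3.amazonaws.com"
    BucketName = "my-bacula-offsite"
    AccessKey = "xxx"
    SecretKey = "yyy"
    Protocol = HTTPS
    UriStyle = VirtualHost
    Upload = EachPart            # push parts to S3 as they are written
    Truncate Cache = No          # leave local pruning to Cache Retention
  }

  Device {
    Name = S3CloudDev
    Device Type = Cloud
    Cloud = S3Cloud
    Archive Device = /backup/cloud-cache   # the local cache directory
    Maximum Part Size = 10000000
    Media Type = CloudType
    LabelMedia = yes
    Random Access = yes
    AutomaticMount = yes
    RemovableMedia = no
    AlwaysOpen = no
  }

  # bacula-dir.conf -- the "Cache Retention" mentioned above is a Pool
  # directive; cached parts older than this become eligible for truncation
  Pool {
    Name = S3CloudPool
    Pool Type = Backup
    Cache Retention = 7 days
    Volume Retention = 365 days
  }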

best regards
-- 
Radosław Korzeniewski
rados...@korzeniewski.net
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] bacula archive/off-site-backup to S3 storage, what's the best way?

2020-02-12 Thread Erik Geiger
On Tue, Feb 11, 2020 at 9:29 PM Bruno Vane  wrote:

> Hello Erik,
>
> As we have a private cloud (VMware), we're using AWS Storage Gateway.
> We create a bucket, attach this bucket in gateway (that will generate a
> NFS mount point) and mount the NFS in Bacula.
>

Thanks Bruno,

I was thinking about AWS Storage Gateway at first but decided against it,
as I thought it would just add costs when I could simply sync the volumes
to S3 after the file backup. And afterwards I forgot about it.
I'm about to rebuild bacula with S3 support to give that a try. If that
fails, I'll give the Storage Gateway a closer look.

Kind regards,

Erik
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] bacula archive/off-site-backup to S3 storage, what's the best way?

2020-02-11 Thread Bruno Vane
Hello Erik,

As we have a private cloud (VMware), we're using AWS Storage Gateway.
We create a bucket, attach it to the gateway (which exposes an NFS mount
point), and mount that NFS share on the Bacula host.
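
The mount itself is a one-liner; a sketch, where the gateway address and
export name are placeholders taken from the Storage Gateway console once
the NFS file share is created:

  sudo mkdir -p /mnt/bacula-s3
  sudo mount -t nfs -o nolock,hard 10.0.0.5:/my-bacula-bucket /mnt/bacula-s3
  # then point a plain File device's Archive Device at /mnt/bacula-s3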

On Tue, Feb 11, 2020 at 2:44 PM Erik Geiger  wrote:

>
>
> On Tue, Feb 11, 2020 at 6:10 PM Radosław Korzeniewski <
> rados...@korzeniewski.net> wrote:
>
>> Hello,
>>
>
> Hi Radoslaw,
>
> Thanks for your fast reply.
>
>>
>> On Tue, Feb 11, 2020 at 5:18 PM Erik Geiger  wrote:
>>
>>> Hi,
>>>
>>> Is someone using S3 or another object store for bacula backups?
>>>
>>
>> Yes.
>>
>>
>>>
>>> Are you using the "cloud" function of bacula or are you using a
>>> custom script or the like to sync your backups to the object store?
>>>
>>>
>> Cloud Storage functionality only. I tried third party solutions but I
>> failed.
>>
>>
>>> I'd like to see some working set-ups or ideas on how to implement that.
>>>
>>
>> What exactly do you want? All I implemented was the exact Cloud S3
>> Storage configuration as described in the manual.
>>
>
> You are referring to this documentation?
> https://www.bacula.org/9.4.x-manuals/en/main/New_Features_in_9_4_0.html#SECTION00300100
>
> I wasn't able to build bacula with cloud support and I can't use the rpm
> packages from bacula as they aren't supported by the puppet module I'm
> using. So I was looking for something like an "after" job syncing to S3
> with aws cli or the like.
>
> But if the Cloud Storage functionality is the way to go I'll figure out
> how to compile with S3 support.
> So if I get the documentation right, the backup is first stored on the
> local disk and afterwards moved to the cloud, while I could still do a
> restore from local disk as long as I configure the "Cache Retention",
> right?
>
>> best regards
>> --
>> Radosław Korzeniewski
>> rados...@korzeniewski.net
>>
>
> Thanks,
>
> Erik
> ___
> Bacula-users mailing list
> Bacula-users@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/bacula-users
>
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] bacula archive/off-site-backup to S3 storage, what's the best way?

2020-02-11 Thread Erik Geiger
On Tue, Feb 11, 2020 at 6:10 PM Radosław Korzeniewski <
rados...@korzeniewski.net> wrote:

> Hello,
>

Hi Radoslaw,

Thanks for your fast reply.

>
> On Tue, Feb 11, 2020 at 5:18 PM Erik Geiger  wrote:
>
>> Hi,
>>
>> Is someone using S3 or another object store for bacula backups?
>>
>
> Yes.
>
>
>>
>> Are you using the "cloud" function of bacula or are you using a
>> custom script or the like to sync your backups to the object store?
>>
>>
> Cloud Storage functionality only. I tried third party solutions but I
> failed.
>
>
>> I'd like to see some working set-ups or ideas on how to implement that.
>>
>
> What exactly do you want? All I implemented was the exact Cloud S3 Storage
> configuration as described in the manual.
>

You are referring to this documentation?
https://www.bacula.org/9.4.x-manuals/en/main/New_Features_in_9_4_0.html#SECTION00300100

I wasn't able to build bacula with cloud support and I can't use the rpm
packages from bacula as they aren't supported by the puppet module I'm
using. So I was looking for something like an "after" job syncing to S3
with aws cli or the like.
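
That "after" job would look roughly like the sketch below. The job name,
script path, volume directory, and bucket are hypothetical, and it
assumes the aws cli is configured on the host that stores the volumes:

  # bacula-dir.conf
  Job {
    Name = "BackupClient1"
    # ... the usual JobDefs / FileSet / Storage directives ...
    RunScript {
      RunsWhen = After
      RunsOnClient = no        # run on the Director host, not the client
      Command = "/usr/local/bin/sync-volumes-to-s3.sh"
    }
  }

  #!/bin/bash
  # /usr/local/bin/sync-volumes-to-s3.sh (hypothetical path)
  set -euo pipefail
  aws s3 sync /backup/volumes s3://my-bacula-offsite/volumes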

But if the Cloud Storage functionality is the way to go I'll figure out how
to compile with S3 support.
So if I get the documentation right, the backup is first stored on the
local disk and afterwards moved to the cloud, while I could still do a
restore from local disk as long as I configure the "Cache Retention",
right?

> best regards
> --
> Radosław Korzeniewski
> rados...@korzeniewski.net
>

Thanks,

Erik
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] bacula archive/off-site-backup to S3 storage, what's the best way?

2020-02-11 Thread Radosław Korzeniewski
Hello,

On Tue, Feb 11, 2020 at 5:18 PM Erik Geiger  wrote:

> Hi,
>
> Is someone using S3 or another object store for bacula backups?
>

Yes.


>
> Are you using the "cloud" function of bacula or are you using a
> custom script or the like to sync your backups to the object store?
>
>
Cloud Storage functionality only. I tried third party solutions but I
failed.


> I'd like to see some working set-ups or ideas on how to implement that.
>

What exactly do you want? All I implemented was the exact Cloud S3 Storage
configuration as described in the manual.

best regards
-- 
Radosław Korzeniewski
rados...@korzeniewski.net
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] bacula archive/off-site-backup to S3 storage, what's the best way?

2020-02-11 Thread Erik Geiger
Hi,

Is someone using S3 or another object store for bacula backups?

Are you using the "cloud" function of bacula or are you using a
custom script or the like to sync your backups to the object store?

I'd like to see some working set-ups or ideas on how to implement that.

Kind regards,

Erik

___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


[Bacula-users] bacula archive/off-site-backup to S3 storage, what's the best way?

2020-02-04 Thread Erik Geiger
Hi,

I tried to build bacula with S3 support but failed. This might be due to
the exact libs3 version needed [
https://sourceforge.net/p/bacula/mailman/message/36503142/].

As that didn't work, I'm now focusing on more or less manually copying
File Volumes over to S3, as briefly described here [
https://blog.bacula.org/whitepapers/ObjectStorage.pdf]. But I wasn't able
to find any details on exactly how to do so (for example, running a job
which deletes all "Full" or "Used" volumes).

I know what the SQL query would look like and will be able to write a bash
script that uploads to S3 and deletes volumes. But I would have thought
that more people are doing similar things and have proven approaches for
this already.

My plan is to back up multiple machines onto a single storage daemon with
local file storage (already working).

For security reasons I'd like to copy all the Volume files to an S3 bucket
immediately after backup.
(I think that would just be a RunScript definition or the like, with a
bash script doing an s3 sync.)

To keep the amount of disk space needed low, I'd like to delete local
volumes according to a specific scheme (like keeping a specific time
range, or keeping the last full plus its incrementals), and only if the
sync went fine. That's the tricky part, as I don't know how to select the
volumes within a schedule or a script without accessing the database
directly. And as we are talking about backups, I don't want to experiment
too much.
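
One possible shape for this, as a sketch only: sync first, and only prune
when the sync succeeded. The bucket, paths, and catalog credentials are
made up; the query reads the standard catalog Media table, and removing
volume files behind Bacula's back also needs a matching catalog cleanup
(e.g. "delete volume" in bconsole):

  #!/bin/bash
  set -euo pipefail
  VOLDIR=/backup/volumes
  BUCKET=s3://my-bacula-offsite/volumes

  # 1. sync first -- with set -e a failed sync aborts before any pruning
  aws s3 sync "$VOLDIR" "$BUCKET"

  # 2. select Full/Used file volumes from the catalog (Media table)
  psql -At -U bacula -d bacula -c \
    "SELECT VolumeName FROM Media
      WHERE VolStatus IN ('Full','Used') AND MediaType = 'File';" |
  while read -r vol; do
    # 3. delete the local copy only if it verifiably exists in S3
    if aws s3 ls "$BUCKET/$vol" >/dev/null 2>&1; then
      rm -f "$VOLDIR/$vol"
    fi
  done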

Am I missing something? I wasn't able to find a detailed description on how
to achieve that.

Ahh, I missed some details:

Bacula version: 9.4.4 (build from source)
OS: Amazon Linux 2

Kind regards,

Erik
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users