Re: [Bacula-users] Bacula 11.06 RPMs (Cloud Storage)

2022-05-07 Thread Davide F.
Hi,

Cloud storage RPMs uploaded for 11.0.6

Best regards

On Fri, 6 May 2022 at 20:00 sruckh--- via Bacula-users <
bacula-users@lists.sourceforge.net> wrote:

> Are the additional RPMs (for cloud-storage) that are in the 11.05
> directory going to be added to the 11.06 directory?
>
> Thank You.
> Scott
>
>
> ___
> Bacula-users mailing list
> Bacula-users@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/bacula-users
>
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


[Bacula-users] Bacula 11.06 RPMs (Cloud Storage)

2022-05-06 Thread sruckh--- via Bacula-users
Are the additional RPMs (for cloud-storage) that are in the 11.05 
directory going to be added to the 11.06 directory?


Thank You.
Scott


___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula in the cloud

2017-03-20 Thread Kern Sibbald
Hello Daniele,

It has been released in the Enterprise edition at the end of February.
As I reported in my last status report (see www.bacula.org), I am now
backporting the changes from the Enterprise version.  Not all of the new
Enterprise SD plugins will be available in the first community version of
the backport.  I am planning to release the Aligned Volumes plugin first,
then release the Cloud plugin a couple of months later.

I have given a rough date for the backport, but since I am "retired",
deadlines are now out the window (i.e. I avoid deadlines).  The Oct/Nov
time frame is probably reasonable ...

Best regards,

Kern
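
The community Cloud plugin, once released, is configured on the Storage Daemon side with a Cloud resource plus a Device of type Cloud, roughly as sketched below. This is recalled from the Bacula cloud-driver documentation rather than taken from this thread, so treat the directive names and all values (host, bucket, keys, cache path) as assumptions to verify against your release.

  # bacula-sd.conf -- minimal sketch only; names and values are illustrative
  Cloud {
    Name = S3Cloud
    Driver = "S3"
    HostName = "s3.amazonaws.com"
    BucketName = "my-bacula-volumes"
    AccessKey = "xxxx"
    SecretKey = "xxxx"
    Protocol = HTTPS
    UriStyle = VirtualHost
  }

  Device {
    Name = CloudDev
    Device Type = Cloud
    Cloud = S3Cloud
    Media Type = CloudType
    Archive Device = /opt/bacula/cloud-cache    # local cache for volume parts
    Maximum Part Size = 10 MB                   # volumes are uploaded in parts
    LabelMedia = yes
    Random Access = yes
    AutomaticMount = yes
    RemovableMedia = no
    AlwaysOpen = no
  }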


On 03/20/2017 03:18 AM, Daniele Palumbo wrote:
> Hi Kern,
>
> News about it?
>
> Thanks,
> Daniele
>
>> On 18 Oct 2016, at 14:13, Kern Sibbald wrote:
>>
>> Hello,
>>
>> Bacula Systems has a White Paper on Bacula Enterprise Edition in the
>> cloud, and they have given me permission to publish it. However, as it
>> is currently written for Bacula Enterprise customers it needs some
>> modification, which I will make over the next week or so then release it.
>>
>> It discusses a number of different ways that Bacula can work with the
>> cloud, so you all might find it very interesting.  Obviously one of the
>> current limitations for most people (like me) who do not have a big
>> budget for high-speed fiber optic Internet connections is the upload
>> speed.  I have spent a lot of time thinking about this, and I think
>> there are a number of very interesting solutions that will become
>> available in the near future.
>>
>> Best regards,
>> Kern
>>
>> On 10/18/2016 01:45 PM, Josh Fisher wrote:
>>> On 10/18/2016 3:42 AM, Uwe Schuerkamp wrote:
 Hello Jason,

 On Mon, Oct 17, 2016 at 09:37:12PM -0500, Jason Voorhees wrote:
> Hello guys:
>
> Based on your experience, what alternative do we have for backing up
> information to the cloud preferably using Bacula?
>
 I wrote a script a while ago that runs as a RunAfterJob element which
 encrypts (gpg) and copies a full backup of a client (or its disk
 volume rather) to an S3 bucket using the aws shell client.

 It's still very rudimentary but it does the job nicely when it comes
 to keeping a full backup safe (and secure) from a local disaster.

 I seem to recall "cloud support" (whatever that may mean in today's
 buzzword bingo) was announced for Bacula 8.
>>> I tend to think that will be targeting local cloud storage, for example
>>> ownCloud, in enterprise environments. I'm not sure something like S3 is
>>> very useful for direct backup storage over the Internet. A 1 TB backup
>>> over a 100 Mbps connection would take a minimum of 22+ hours, assuming
>>> maximum throughput and that S3 could actually sustain 12.5 MB/s.
>>>
>>> For S3, copying via a script seems the best way to go.
>>>
 All the best,

 Uwe

>>> --
>>> Check out the vibrant tech community on one of the world's most
>>> engaging tech sites, SlashDot.org! http://sdm.link/slashdot
>>> ___
>>> Bacula-users mailing list
>>> Bacula-users@lists.sourceforge.net
>>> https://lists.sourceforge.net/lists/listinfo/bacula-users
>>>
>>
>> --
>> Check out the vibrant tech community on one of the world's most
>> engaging tech sites, SlashDot.org! http://sdm.link/slashdot
>> ___
>> Bacula-users mailing list
>> Bacula-users@lists.sourceforge.net
>> https://lists.sourceforge.net/lists/listinfo/bacula-users


--
Check out the vibrant tech community on one of the world's most
engaging tech sites, Slashdot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula in the cloud

2017-03-19 Thread Daniele Palumbo
Hi Kern,

News about it?

Thanks,
Daniele

> On 18 Oct 2016, at 14:13, Kern Sibbald wrote:
> 
> Hello,
> 
> Bacula Systems has a White Paper on Bacula Enterprise Edition in the
> cloud, and they have given me permission to publish it. However, as it
> is currently written for Bacula Enterprise customers it needs some
> modification, which I will make over the next week or so then release it.
> 
> It discusses a number of different ways that Bacula can work with the
> cloud, so you all might find it very interesting.  Obviously one of the
> current limitations for most people (like me) who do not have a big
> budget for high-speed fiber optic Internet connections is the upload
> speed.  I have spent a lot of time thinking about this, and I think
> there are a number of very interesting solutions that will become
> available in the near future.
> 
> Best regards,
> Kern
> 
> On 10/18/2016 01:45 PM, Josh Fisher wrote:
>> On 10/18/2016 3:42 AM, Uwe Schuerkamp wrote:
>>> Hello Jason,
>>> 
>>> On Mon, Oct 17, 2016 at 09:37:12PM -0500, Jason Voorhees wrote:
 Hello guys:
 
 Based on your experience, what alternative do we have for backing up
 information to the cloud preferably using Bacula?
 
>>> I wrote a script a while ago that runs as a RunAfterJob element which
>>> encrypts (gpg) and copies a full backup of a client (or its disk
>>> volume rather) to an S3 bucket using the aws shell client.
>>> 
>>> It's still very rudimentary but it does the job nicely when it comes
>>> to keeping a full backup safe (and secure) from a local disaster.
>>> 
>>> I seem to recall "cloud support" (whatever that may mean in today's
>>> buzzword bingo) was announced for Bacula 8.
>> I tend to think that will be targeting local cloud storage, for example
>> ownCloud, in enterprise environments. I'm not sure something like S3 is
>> very useful for direct backup storage over the Internet. A 1 TB backup
>> over a 100 Mbps connection would take a minimum of 22+ hours, assuming
>> maximum throughput and that S3 could actually sustain 12.5 MB/s.
>> 
>> For S3, copying via a script seems the best way to go.
>> 
>>> All the best,
>>> 
>>> Uwe
>>> 
>> 
>> --
>> Check out the vibrant tech community on one of the world's most
>> engaging tech sites, SlashDot.org! http://sdm.link/slashdot
>> ___
>> Bacula-users mailing list
>> Bacula-users@lists.sourceforge.net
>> https://lists.sourceforge.net/lists/listinfo/bacula-users
>> 
> 
> 
> --
> Check out the vibrant tech community on one of the world's most
> engaging tech sites, SlashDot.org! http://sdm.link/slashdot
> ___
> Bacula-users mailing list
> Bacula-users@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/bacula-users



--
Check out the vibrant tech community on one of the world's most
engaging tech sites, Slashdot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula in the cloud

2016-10-19 Thread Kern Sibbald

  
  
On 10/19/2016 08:41 AM, Roberts, Ben wrote:
> The documentation is outdated and this limit was removed (or perhaps
> vastly increased?) somewhere around the 7 mark. I’ve had jobs running a
> lot longer since upgrading.
>
> In branch-5.2:
> http://www.bacula.org/git/cgit.cgi/bacula/tree/bacula/src/lib/bnet.c#n784
> bsock->timeout = 60 * 60 * 6 * 24;   /* 6 days timeout */
>
> In branch-7.0 this line is removed.

Well, I am not really sure that it was removed, since I would say that
it was replaced.  The current bsock timeout is 200 days.

Best regards,
Kern

> (Unfortunately I can’t see a way to get a direct link from cgit directly
> to a line at a particular commit.)
>
> Regards,
> Ben Roberts
>
> From: Clark, Patti [mailto:clar...@ornl.gov]
> Sent: 18 October 2016 22:29
> To: bacula-users@lists.sourceforge.net
> Subject: Re: [Bacula-users] Bacula in the cloud
>
> From Bacula’s main.pdf documentation:
>
> Max Run Time = <time> The time specifies the maximum allowed time that a
> job may run, counted from when the job starts (not necessarily the same
> as when the job was scheduled). By default, the watchdog thread will kill
> any Job that has run more than 6 days. The maximum watchdog timeout is
> independent of MaxRunTime and cannot be changed.
>
> Patti Clark
> Linux System Administrator
> R Systems Support Oak Ridge National Laboratory
>
> From: Josip Deanovic <djosip+n...@linuxpages.net>
> Date: Tuesday, October 18, 2016 at 4:06 PM
> To: "bacula-users@lists.sourceforge.net" <bacula-users@lists.sourceforge.net>
> Subject: Re: [Bacula-users] Bacula in the cloud
>
> On Tuesday 2016-10-18 12:34:08 Jason Voorhees wrote:
>> Thank you all for your responses.
>> I'll take a look at Bacula systems' whitepaper to see what they're
>> talking about. Meanwhile I'll explore some of the alternatives
>> discussed on this thread like copying files with scripts and making a
>> replica on SpiderOak or anything similar.
>> I hope we can have an interesting solution for this "problem" in the
>> near future.
>
> Hi Jason!
>
> You have said that "Bacula can't run jobs for so long without modifying
> source code and recompiling".
>
> What did you mean by that and can you give an example of the problem
> you have experienced?
>
> I am asking because I am not aware of the bacula's job duration related
> limitations.
>
> --
> Josip Deanovic
>
> --
> Check out the vibrant tech community on one of the world's most
> engaging tech sites, SlashDot.org! http://sdm.link/slashdot
> ___
> Bacula-users mailing list
> Bacula-users@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/bacula-users

Re: [Bacula-users] Bacula in the cloud

2016-10-19 Thread Josip Deanovic
On Wednesday 2016-10-19 06:41:53 Roberts, Ben wrote:
> The documentation is outdated and this limit was removed (or perhaps
> vastly increased?) somewhere around the 7 mark. I’ve had jobs running a
> lot longer since upgrading.
> 
> In branch-5.2:
> http://www.bacula.org/git/cgit.cgi/bacula/tree/bacula/src/lib/bnet.c#n784
> bsock->timeout = 60 * 60 * 6 * 24;   /* 6 days timeout */
> 
> In branch-7.0 this line is removed.
> 
> (Unfortunately I can’t see a way to get a direct link from cgit directly
> to a line at a particular commit.)
> 
> Regards,
> Ben Roberts


Great.
Thanks for the info.

-- 
Josip Deanovic

--
Check out the vibrant tech community on one of the world's most 
engaging tech sites, SlashDot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula in the cloud

2016-10-19 Thread Roberts, Ben
The documentation is outdated and this limit was removed (or perhaps vastly 
increased?) somewhere around the 7 mark. I’ve had jobs running a lot longer 
since upgrading.

In branch-5.2: 
http://www.bacula.org/git/cgit.cgi/bacula/tree/bacula/src/lib/bnet.c#n784
bsock->timeout = 60 * 60 * 6 * 24;   /* 6 days timeout */

In branch-7.0 this line is removed.

(Unfortunately I can’t see a way to get a direct link from cgit directly to a 
line at a particular commit.)

Regards,
Ben Roberts

From: Clark, Patti [mailto:clar...@ornl.gov]
Sent: 18 October 2016 22:29
To: bacula-users@lists.sourceforge.net
Subject: Re: [Bacula-users] Bacula in the cloud

From Bacula’s main.pdf documentation:

Max Run Time = <time> The time specifies the maximum allowed time that a job
may run, counted from when the job starts (not necessarily the same as when
the job was scheduled).
By default, the watchdog thread will kill any Job that has run more than 6
days. The maximum watchdog timeout is independent of MaxRunTime and cannot be
changed.


Patti Clark
Linux System Administrator
R Systems Support Oak Ridge National Laboratory

From: Josip Deanovic <djosip+n...@linuxpages.net>
Date: Tuesday, October 18, 2016 at 4:06 PM
To: "bacula-users@lists.sourceforge.net" <bacula-users@lists.sourceforge.net>
Subject: Re: [Bacula-users] Bacula in the cloud

On Tuesday 2016-10-18 12:34:08 Jason Voorhees wrote:
Thank you all for your responses.
I'll take a look at Bacula systems' whitepaper to see what they're
talking about. Meanwhile I'll explore some of the alternatives
discussed on this thread like copying files with scripts and making a
replica on SpiderOak or anything similar.
I hope we can have an interesting solution for this "problem" in the
near future.


Hi Jason!

You have said that "Bacula can't run jobs for so long without modifying
source code and recompiling".

What did you mean by that and can you give an example of the problem
you have experienced?

I am asking because I am not aware of the bacula's job duration related
limitations.

--
Josip Deanovic

--
Check out the vibrant tech community on one of the world's most
engaging tech sites, SlashDot.org! 
http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users

--
Check out the vibrant tech community on one of the world's most 
engaging tech sites, SlashDot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula in the cloud

2016-10-18 Thread Josip Deanovic
On Tuesday 2016-10-18 21:28:44 Clark, Patti wrote:
> From Bacula’s main.pdf documentation:
> 
> Max Run Time = <time> The time specifies the maximum allowed time that a
> job may run, counted from when the job starts (not necessarily the
> same as when the job was scheduled). By default, the watchdog
> thread will kill any Job that has run more than 6 days. The maximum
> watchdog timeout is independent of MaxRunTime and cannot be changed.

Thanks.

I have never stumbled on this one and I missed this part in the
documentation.
The longest running job I had was a little over five days.

-- 
Josip Deanovic

--
Check out the vibrant tech community on one of the world's most 
engaging tech sites, SlashDot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula in the cloud

2016-10-18 Thread Heitor Faria

> Thank you all for your responses.
> 
> I'll take a look at Bacula systems' whitepaper to see what they're
> talking about. Meanwhile I'll explore some of the alternatives
> discussed on this thread like copying files with scripts and making a
> replica on SpiderOak or anything similar.

Hello, Jason: why don't you install a 2nd remote Bacula storage daemon on a VPS 
/ Dedicated Server?
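
In configuration terms that amounts to an extra Storage resource in bacula-dir.conf pointing at the remote SD (plus a matching Device and Media Type defined on that machine); a rough sketch with made-up names and addresses:

  Storage {
    Name = OffsiteSD
    Address = sd.example-vps.net      # public address of the remote storage daemon
    SDPort = 9103
    Password = "sd-password"
    Device = OffsiteFileDev           # Device defined in the remote bacula-sd.conf
    Media Type = OffsiteFile
    Maximum Concurrent Jobs = 4
  }

Backup or Copy jobs can then target that Storage, ideally with TLS enabled on the daemon connections as discussed elsewhere in this thread.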

> I hope we can have an interesting solution for this "problem" in the
> near future.
> 
> Thanks again for your time :)

Regards,
-- 
=== 
Heitor Medrado de Faria - LPIC-III | ITIL-F | Bacula Systems Certified 
Administrator II 
• Do you need Bacula training? http://bacula.us/video-classes/ 
+55 61 8268-4220 | http://bacula.us 
===

--
Check out the vibrant tech community on one of the world's most 
engaging tech sites, SlashDot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula in the cloud

2016-10-18 Thread Clark, Patti
From Bacula’s main.pdf documentation:

Max Run Time = <time> The time specifies the maximum allowed time that a job
may run, counted from when the job starts (not necessarily the same as when
the job was scheduled).
By default, the watchdog thread will kill any Job that has run more than 6
days. The maximum watchdog timeout is independent of MaxRunTime and cannot be
changed.


Patti Clark
Linux System Administrator
R Systems Support Oak Ridge National Laboratory
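
If a job genuinely needs to run longer, the per-job limit is set in the Job resource on the Director. A minimal sketch with illustrative names and values (on older releases the six-day watchdog ceiling described above still applies regardless of this directive):

  Job {
    Name = "BigCloudBackup"
    Type = Backup
    Client = fileserver-fd
    FileSet = "Full Set"
    Storage = File1
    Pool = Default
    Messages = Standard
    # Allow up to ten days before the Director cancels the job.
    Max Run Time = 10 days
  }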

From: Josip Deanovic <djosip+n...@linuxpages.net>
Date: Tuesday, October 18, 2016 at 4:06 PM
To: "bacula-users@lists.sourceforge.net" <bacula-users@lists.sourceforge.net>
Subject: Re: [Bacula-users] Bacula in the cloud

On Tuesday 2016-10-18 12:34:08 Jason Voorhees wrote:
Thank you all for your responses.
I'll take a look at Bacula systems' whitepaper to see what they're
talking about. Meanwhile I'll explore some of the alternatives
discussed on this thread like copying files with scripts and making a
replica on SpiderOak or anything similar.
I hope we can have an interesting solution for this "problem" in the
near future.


Hi Jason!

You have said that "Bacula can't run jobs for so long without modifying
source code and recompiling".

What did you mean by that and can you give an example of the problem
you have experienced?

I am asking because I am not aware of the bacula's job duration related
limitations.

--
Josip Deanovic

--
Check out the vibrant tech community on one of the world's most
engaging tech sites, SlashDot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


--
Check out the vibrant tech community on one of the world's most 
engaging tech sites, SlashDot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula in the cloud

2016-10-18 Thread Josip Deanovic
On Tuesday 2016-10-18 12:34:08 Jason Voorhees wrote:
> Thank you all for your responses.
> 
> I'll take a look at Bacula systems' whitepaper to see what they're
> talking about. Meanwhile I'll explore some of the alternatives
> discussed on this thread like copying files with scripts and making a
> replica on SpiderOak or anything similar.
> 
> I hope we can have an interesting solution for this "problem" in the
> near future.


Hi Jason!

You have said that "Bacula can't run jobs for so long without modifying
source code and recompiling".

What did you mean by that and can you give an example of the problem
you have experienced?

I am asking because I am not aware of the bacula's job duration related
limitations.

-- 
Josip Deanovic

--
Check out the vibrant tech community on one of the world's most 
engaging tech sites, SlashDot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula in the cloud

2016-10-18 Thread Jason Voorhees
Thank you all for your responses.

I'll take a look at Bacula systems' whitepaper to see what they're
talking about. Meanwhile I'll explore some of the alternatives
discussed on this thread like copying files with scripts and making a
replica on SpiderOak or anything similar.

I hope we can have an interesting solution for this "problem" in the
near future.

Thanks again for your time :)

--
Check out the vibrant tech community on one of the world's most 
engaging tech sites, SlashDot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula in the cloud

2016-10-18 Thread C M Reinehr
On 10/17/2016 09:37 PM, Jason Voorhees wrote:
> Hello guys:
>
> Based on your experience, what alternative do we have for backing up
> information to the cloud preferably using Bacula?
>
> I've been reading some posts about similar topics. Bandwidth always
> seems to be a problem because it isn't too big (Gigs per second) or
> there's too much information (several terabytes) and Bacula can't run
> jobs for so long without modifying source code and recompiling.
>
> I've been thinking of some alternatives like these:
>
> 1. Backup to local disk and configure Copy jobs to make a copy to
> Amazon S3. Local backups always run fast, but Copy jobs might be
> delayed ... without issues?
>
> 2. Configure Amazon Storage Gateway as VTL so Bacula can backup
> directly to Amazon using virtual tape devices through iSCSI. What do
> you think about this?
>
> 3. For a single fileserver to be backed up (let's say a Samba server),
> I could create a replica in the cloud (i.e. Amazon EC2) which can be
> constantly synchronized (via rsync) and run Bacula locally in such an
> EC2 instance.
>
> What other ideas have you thought of? Maybe a combination of other open
> source tools that can be combined with Bacula? or maybe a different
> open source solution that replaces Bacula to save backups in the
> cloud?
>
> I'd appreciate some ideas, pros and/or cons to be discussed.
>
> Thanks in advance for your time bats!
>
> --
> Check out the vibrant tech community on one of the world's most
> engaging tech sites, SlashDot.org! http://sdm.link/slashdot
> ___
> Bacula-users mailing list
> Bacula-users@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/bacula-users
>

My configuration here is to do daily backups at night to a disk array.
The backup files are then mirrored to an online cloud storage service
called SpiderOak. I keep several days' worth of backups on the disk array
and then rely on the SpiderOak files for long-term storage. In this way, I
have completely eliminated tape backups.

CMR
-- 
Linux distribution Debian v8.5, "Jessie"
--
"Firearms are second only to the Constitution in importance; they are 
the peoples' liberty teeth." -- George Washington


--
Check out the vibrant tech community on one of the world's most 
engaging tech sites, SlashDot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula in the cloud

2016-10-18 Thread Kern Sibbald
Hello,

Bacula Systems has a White Paper on Bacula Enterprise Edition in the 
cloud, and they have given me permission to publish it. However, as it 
is currently written for Bacula Enterprise customers it needs some 
modification, which I will make over the next week or so then release it.

It discusses a number of different ways that Bacula can work with the 
cloud, so you all might find it very interesting.  Obviously one of the 
current limitations for most people (like me) who do not have a big 
budget for high-speed fiber optic Internet connections is the upload 
speed.  I have spent a lot of time thinking about this, and I think 
there are a number of very interesting solutions that will become 
available in the near future.

Best regards,
Kern

On 10/18/2016 01:45 PM, Josh Fisher wrote:
> On 10/18/2016 3:42 AM, Uwe Schuerkamp wrote:
>> Hello Jason,
>>
>> On Mon, Oct 17, 2016 at 09:37:12PM -0500, Jason Voorhees wrote:
>>> Hello guys:
>>>
>>> Based on your experience, what alternative do we have for backing up
>>> information to the cloud preferably using Bacula?
>>>
>> I wrote a script a while ago that runs as a RunAfterJob element which
>> encrypts (gpg) and copies a full backup of a client (or its disk
>> volume rather) to an S3 bucket using the aws shell client.
>>
>> It's still very rudimentary but it does the job nicely when it comes
>> to keeping a full backup safe (and secure) from a local disaster.
>>
>> I seem to recall "cloud support" (whatever that may mean in today's
>> buzzword bingo) was announced for Bacula 8.
> I tend to think that will be targeting local cloud storage, for example
> ownCloud, in enterprise environments. I'm not sure something like S3 is
> very useful for direct backup storage over the Internet. A 1 TB backup
> over a 100 Mbps connection would take a minimum of 22+ hours, assuming
> maximum throughput and that S3 could actually sustain 12.5 MB/s.
>
> For S3, copying via a script seems the best way to go.
>
>> All the best,
>>
>> Uwe
>>
>
> --
> Check out the vibrant tech community on one of the world's most
> engaging tech sites, SlashDot.org! http://sdm.link/slashdot
> ___
> Bacula-users mailing list
> Bacula-users@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/bacula-users
>


--
Check out the vibrant tech community on one of the world's most 
engaging tech sites, SlashDot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula in the cloud

2016-10-18 Thread Josh Fisher

On 10/18/2016 3:42 AM, Uwe Schuerkamp wrote:
> Hello Jason,
>
> On Mon, Oct 17, 2016 at 09:37:12PM -0500, Jason Voorhees wrote:
>> Hello guys:
>>
>> Based on your experience, what alternative do we have for backing up
>> information to the cloud preferably using Bacula?
>>
> I wrote a script a while ago that runs as a RunAfterJob element which
> encrypts (gpg) and copies a full backup of a client (or its disk
> volume rather) to an S3 bucket using the aws shell client.
>
> It's still very rudimentary but it does the job nicely when it comes
> to keeping a full backup safe (and secure) from a local disaster.
>
> I seem to recall "cloud support" (whatever that may mean in today's
> buzzword bingo) was announced for Bacula 8.

I tend to think that will be targeting local cloud storage, for example 
ownCloud, in enterprise environments. I'm not sure something like S3 is 
very useful for direct backup storage over the Internet. A 1 TB backup 
over a 100 Mbps connection would take a minimum of 22+ hours, assuming 
maximum throughput and that S3 could actually sustain 12.5 MB/s.

For S3, copying via a script seems the best way to go.

>
> All the best,
>
> Uwe
>


--
Check out the vibrant tech community on one of the world's most 
engaging tech sites, SlashDot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula in the cloud

2016-10-18 Thread Uwe Schuerkamp
Hello Jason,

On Mon, Oct 17, 2016 at 09:37:12PM -0500, Jason Voorhees wrote:
> Hello guys:
> 
> Based on your experience, what alternative do we have for backing up
> information to the cloud preferably using Bacula?
> 

I wrote a script a while ago that runs as a RunAfterJob element which
encrypts (gpg) and copies a full backup of a client (or its disk
volume rather) to an S3 bucket using the aws shell client.

It's still very rudimentary but it does the job nicely when it comes
to keeping a full backup safe (and secure) from a local disaster.
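
The script itself is not posted here, but the hook point is the job definition on the Director. A minimal sketch with a hypothetical script name and path; %i is the JobId substitution (check the RunScript substitution characters supported by your version):

  Job {
    Name = "ClientA-Full"
    Type = Backup
    Client = clienta-fd
    FileSet = "ClientA Set"
    Storage = File1
    Pool = Default
    Messages = Standard
    # After the job finishes, run a script on the Director host (often the
    # same machine as the SD holding the disk volumes) that gpg-encrypts the
    # finished volume and copies it to S3 with the AWS CLI.  The script name
    # and its behaviour are hypothetical, not the actual script described above.
    RunScript {
      RunsWhen = After
      RunsOnClient = No
      Command = "/usr/local/bin/encrypt-and-upload-to-s3.sh %i"
    }
  }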

I seem to recall "cloud support" (whatever that may mean in today's
buzzword bingo) was announced for Bacula 8.

All the best,

Uwe

-- 
Uwe Schürkamp | email: 








--
Check out the vibrant tech community on one of the world's most 
engaging tech sites, SlashDot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


[Bacula-users] Bacula in the cloud

2016-10-17 Thread Jason Voorhees
Hello guys:

Based on your experience, what alternative do we have for backing up
information to the cloud preferably using Bacula?

I've been reading some posts about similar topics. Bandwidth always
seems to be a problem because it isn't too big (Gigs per second) or
there's too much information (several terabytes) and Bacula can't run
jobs for so long without modifying source code and recompiling.

I've been thinking of some alternatives like these:

1. Backup to local disk and configure Copy jobs to make a copy to
Amazon S3. Local backups always run fast, but Copy jobs might be
delayed ... without issues? (a rough sketch follows this list)

2. Configure Amazon Storage Gateway as VTL so Bacula can backup
directly to Amazon using virtual tape devices through iSCSI. What do
you think about this?

3. For a single fileserver to be backed up (let's say a Samba server),
I could create a replica in the cloud (i.e. Amazon EC2) which can be
constantly synchronized (via rsync) and run Bacula locally in such an
EC2 instance.
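
As a rough sketch of alternative 1 (resource names are invented; the destination storage could be an S3-backed device, an s3fs mount, or a remote storage daemon):

  Pool {
    Name = LocalDisk
    Pool Type = Backup
    Storage = LocalFileSD
    Next Pool = OffsitePool            # destination pool for Copy jobs
  }

  Pool {
    Name = OffsitePool
    Pool Type = Backup
    Storage = OffsiteSD                # S3-backed or remote storage daemon
  }

  Job {
    Name = "CopyToOffsite"
    Type = Copy
    Selection Type = PoolUncopiedJobs  # copy every job not yet copied
    Pool = LocalDisk
    # Client/FileSet/Messages are required for any Job resource, but the
    # copied jobs keep the client and fileset they were backed up with.
    Client = anyclient-fd
    FileSet = "Full Set"
    Messages = Standard
  }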

What other ideas have you thought of? Maybe a combination of other open
source tools that can be combined with Bacula? or maybe a different
open source solution that replaces Bacula to save backups in the
cloud?

I'd appreciate some ideas, pros and/or cons to be discussed.

Thanks in advance for your time bats!

--
Check out the vibrant tech community on one of the world's most 
engaging tech sites, SlashDot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula to the Cloud

2010-03-11 Thread Peter Zenge
Following up on my own post, I had a little free time the other day and decided 
to investigate whether this was feasible.  Setting up the necessary services on 
Amazon was trivial, including access control and block storage.  I tried s3fs 
first, and it worked, but it felt like there was way too much i/o going on for 
that kind of data (which is pretty much what I expected).  Then I tried putting 
my bacula-sd on an EC2 node, writing to files on EBS, and it worked great 
(spooling first to the local drive on EC2).  Throughput however was somewhat 
less than I was hoping for, approx. 25% of what I get locally to spool and then 
to tape.  However, I found that there was NO performance penalty for running 
two jobs concurrently.  I didn't try larger numbers, but my guess is you can 
run a large number of concurrent jobs to get a pretty good effective 
throughput, assuming you have lots of clients with similar data sizes.

Our problem is that 80% of our data is on one client, and it would take 130 
hours to do a full backup, and our backup window simply isn't that long.  Then 
I thought I could break the FileSets into smaller pieces and run multiple 
backup jobs in parallel (and I'm assuming that my client is not the 
bottleneck).  However, it wouldn't run more than one job on that client 
concurrently.  Since I can run multiple clients concurrently, I'm pretty sure 
my bacula-dir.conf and bacula-sd.conf settings are correct, and my 
bacula-fd.conf specifies Maximum Concurrent Jobs = 20... Any other reason why 
I couldn't run say 5 parallel jobs with different filesets off the same client?
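
Concurrency is capped in several places and the effective limit is the smallest of them. A sketch of the directives worth checking, with only the relevant lines shown and example values (bacula-fd.conf and the Device resource in bacula-sd.conf carry the same directive, and interleaving several jobs onto one disk Volume needs the SD-side limits raised as well):

  # bacula-dir.conf -- only the relevant directive of each resource is shown
  Director {
    Maximum Concurrent Jobs = 20       # global ceiling
  }
  Client {
    Name = bigclient-fd
    Maximum Concurrent Jobs = 10       # per-client ceiling
  }
  Storage {
    Name = EC2-SD
    Maximum Concurrent Jobs = 10       # per-storage ceiling
  }
  Job {
    Name = "BigClient-Part1"
    FileSet = "Part1"                  # one Job resource per FileSet slice
    Maximum Concurrent Jobs = 5
  }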

From: Peter Zenge [mailto:pze...@ilinc.com]
Sent: Tuesday, March 02, 2010 2:57 PM
To: bacula-users@lists.sourceforge.net
Subject: [Bacula-users] Bacula to the Cloud

Hello, 2 year Bacula user but first-time poster.  I'm currently dumping about 
1.6TB to LTO2 tapes every week and I'm looking to migrate to a new storage 
medium.

The obvious answer, I think, is a direct-attached disk array (which I would be 
able to put in a remote gigabit-attached datacenter before too long).  However, 
I'm wondering if anyone is currently doing large (or what seem to me to be 
large) backups to the cloud in some way?  Assuming I have a gigabit connection 
to the Internet from my datacenter, I'm wondering how feasible it would be to 
either use something like Amazon S3 with s3fs (I'm guessing way too much 
overhead to be efficient), or a bacula-SD implementation on an EC2 node, using 
Elastic Block Store (EBS) as local disk, and VPN (Amazon VPC) between my 
datacenter and the SD.

Substitute your favorite cloud provider for Amazon above; I don't use any right 
now so not tied to any particular provider.  It just seems like Amazon has all 
the necessary pieces today.

To do this, and keep customers comfortable with the idea of data in the cloud, 
we would need to encrypt, so I'm also wondering if it would be possible for the 
SD to encrypt the backup volume, rather than the FD encrypt the data before 
sending it to SD (which is what we do now)?  Easier to manage if we just 
handled encryption in one place for all clients.

I would love to hear what other people are either doing with Bacula and the 
cloud, or why you have decided not to.

Thanks

Peter Zenge
Pzenge .at. ilinc .dot. com


--
Download Intel® Parallel Studio Eval
Try the new software tools for yourself. Speed compiling, find bugs
proactively, and fine-tune applications for parallel performance.
See why Intel Parallel Studio got high marks during beta.
http://p.sf.net/sfu/intel-sw-dev
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula to the Cloud

2010-03-11 Thread Dan Langille
On 3/11/2010 4:31 PM, Peter Zenge wrote:
> Following up on my own post, I had a little free time the other day and
> decided to investigate whether this was feasible. Setting up the
> necessary services on Amazon was trivial, including access control and
> block storage. I tried s3fs first, and it worked, but it felt like there
> was way too much i/o going on for that kind of data (which is pretty
> much what I expected). Then I tried putting my bacula-sd on an EC2 node,
> writing to files on EBS, and it worked great (spooling first to the
> “local” drive on EC2). Throughput however was somewhat less than I was
> hoping for, approx. 25% of what I get locally to spool and then to tape.
> However, I found that there was NO performance penalty for running two
> jobs concurrently. I didn’t try larger numbers, but my guess is you can
> run a large number of concurrent jobs to get a pretty good effective
> throughput, assuming you have lots of clients with similar data sizes.

Would you care to add the steps to the wiki?  Then post the URL here please?

-- 
Dan Langille - http://langille.org/

--
Download Intel® Parallel Studio Eval
Try the new software tools for yourself. Speed compiling, find bugs
proactively, and fine-tune applications for parallel performance.
See why Intel Parallel Studio got high marks during beta.
http://p.sf.net/sfu/intel-sw-dev
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Bacula to the Cloud

2010-03-03 Thread Christian Gaul
On 02.03.2010 22:56, Peter Zenge wrote:
> Hello, 2 year Bacula user but first-time poster.  I'm currently
> dumping about 1.6TB to LTO2 tapes every week and I'm looking to
> migrate to a new storage medium.
>
> The obvious answer, I think, is a direct-attached disk array (which I
> would be able to put in a remote gigabit-attached datacenter before
> too long).  However, I'm wondering if anyone is currently doing large
> (or what seem to me to be large) backups to the cloud in some way?
> Assuming I have a gigabit connection to the Internet from my
> datacenter, I'm wondering how feasible it would be to either use
> something like Amazon S3 with s3fs (I'm guessing way too much overhead
> to be efficient), or a bacula-SD implementation on an EC2 node, using
> Elastic Block Store (EBS) as local disk, and VPN (Amazon VPC)
> between my datacenter and the SD.
>
> Substitute your favorite cloud provider for Amazon above; I don't use
> any right now so not tied to any particular provider.  It just seems
> like Amazon has all the necessary pieces today.
>
> To do this, and keep customers comfortable with the idea of data in
> the cloud, we would need to encrypt, so I'm also wondering if it would
> be possible for the SD to encrypt the backup volume, rather than the
> FD encrypt the data before sending it to SD (which is what we do
> now)?  Easier to manage if we just handled encryption in one place for
> all clients.
>
> I would love to hear what other people are either doing with Bacula
> and the cloud, or why you have decided not to.
>
> Thanks
>
> Peter Zenge
> Pzenge .at. ilinc .dot. com


Sending unencrypted data to the SD for encryption would be OK for doing
tape based backups where you do not want to lose the tapes. I would
suggest not sending your unencrypted backup data to someone else and
trusting the receiving system to encrypt it before someone reads it from
RAM.

Depending on your needs it might be OK to do that, but AFAIK bacula does
not support this mode (yet?). AFAIK you have the options of transport
encryption (for the connection and data between the systems) and data
encryption (for the data leaving the system, with the receiving SD not
having the key to do a restore by itself).

I personally use transport and data encryption for saving data to
offsite SDs in "untrusted" (meaning not directly accessible) datacenters.
If the double encryption takes too much CPU time, you *MIGHT* want to try
data encryption while dropping the transport encryption after
authentication. I am not sure about this, though: since metadata can be
read from the encrypted data and control structures are sent over this
connection, I would not suggest doing it either.

Using data encryption with Bacula (IMHO especially with Windows) is a
pain because of all the certificate management, but for me it is a
requirement.
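
For reference, the two mechanisms map onto bacula-fd.conf roughly as follows; a sketch only, with placeholder resource names and paths:

  FileDaemon {
    Name = client1-fd
    # Data encryption: file data is signed and encrypted on the client, so
    # the offsite SD stores only ciphertext and cannot restore it without
    # the client's key.
    PKI Signatures = Yes
    PKI Encryption = Yes
    PKI Keypair = "/etc/bacula/client1-keypair.pem"
    PKI Master Key = "/etc/bacula/master.cert"    # optional escrow key
  }

  Director {
    Name = backup-dir
    Password = "not-a-real-password"
    # Transport encryption: TLS on the daemon connections.
    TLS Enable = yes
    TLS Require = yes
    TLS CA Certificate File = "/etc/bacula/ca.pem"
    TLS Certificate = "/etc/bacula/client1-cert.pem"
    TLS Key = "/etc/bacula/client1-key.pem"
  }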

--
Download Intel® Parallel Studio Eval
Try the new software tools for yourself. Speed compiling, find bugs
proactively, and fine-tune applications for parallel performance.
See why Intel Parallel Studio got high marks during beta.
http://p.sf.net/sfu/intel-sw-dev
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


[Bacula-users] Bacula to the Cloud

2010-03-02 Thread Peter Zenge
Hello, 2 year Bacula user but first-time poster.  I'm currently dumping about 
1.6TB to LTO2 tapes every week and I'm looking to migrate to a new storage 
medium.

The obvious answer, I think, is a direct-attached disk array (which I would be 
able to put in a remote gigabit-attached datacenter before too long).  However, 
I'm wondering if anyone is currently doing large (or what seem to me to be 
large) backups to the cloud in some way?  Assuming I have a gigabit connection 
to the Internet from my datacenter, I'm wondering how feasible it would be to 
either use something like Amazon S3 with s3fs (I'm guessing way too much 
overhead to be efficient), or a bacula-SD implementation on an EC2 node, using 
Elastic Block Store (EBS) as local disk, and VPN (Amazon VPC) between my 
datacenter and the SD.

Substitute your favorite cloud provider for Amazon above; I don't use any right 
now so not tied to any particular provider.  It just seems like Amazon has all 
the necessary pieces today.

To do this, and keep customers comfortable with the idea of data in the cloud, 
we would need to encrypt, so I'm also wondering if it would be possible for the 
SD to encrypt the backup volume, rather than the FD encrypt the data before 
sending it to SD (which is what we do now)?  Easier to manage if we just 
handled encryption in one place for all clients.

I would love to hear what other people are either doing with Bacula and the 
cloud, or why you have decided not to.

Thanks

Peter Zenge
Pzenge .at. ilinc .dot. com


--
Download Intel® Parallel Studio Eval
Try the new software tools for yourself. Speed compiling, find bugs
proactively, and fine-tune applications for parallel performance.
See why Intel Parallel Studio got high marks during beta.
http://p.sf.net/sfu/intel-sw-dev
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users