Re: [BackupPC-users] syncing local and cloud backups

2018-10-13 Thread Mike Hughes
Another related question: does it make sense to use rsync's compression when 
transferring the cpool? If that data is already compressed, am I gaining much by 
having rsync try to compress it again?
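
My hunch is that a plain copy without -z is enough, since cpool files are 
stored zlib-compressed already; something like this (paths are illustrative):

    # -a preserves attributes, -H preserves hard links; -z is omitted because
    # recompressing zlib-compressed pool files mostly burns CPU for no gain
    rsync -aH /var/lib/backuppc/cpool/ offsite:/backup/backuppc/cpool/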

Thanks!

From: Mike Hughes 
Sent: Friday, October 12, 2018 8:25 AM
To: General list for user discussion, questions and support
Cc: Craig Barratt
Subject: Re: [BackupPC-users] syncing local and cloud backups


Cool, thanks for the idea, Craig. So that will provide a backup of the entire 
cpool and the associated metadata necessary to rebuild hosts in the event of a 
site loss, but what would that process look like?



Say I have the entire '/etc/BackupPC' folder rsynced to an offsite disk. What 
would the recovery process look like? As I understand it, I'd have to rsync the 
entire folder back to the destination site, do a fresh install of BackupPC, and 
point it at the restored folder. Is that about right? Is there a method to 
extract an important piece of data from the cpool without performing an entire 
site restore? I'm thinking of a situation where the data has different 
priorities: one cpool might contain several TB of files alongside a few 
high-priority servers. The only option looks like a full site restore after 
rsyncing everything back. Am I thinking about this correctly?
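
For instance, I gather an individual cpool file can be decompressed with 
BackupPC_zcat if you already know which pool file holds the data (the install 
path and pool path below are hypothetical, not from my setup):

    # BackupPC_zcat ships with BackupPC and writes one decompressed pool
    # file to stdout; both paths here are placeholders
    POOLFILE=/var/lib/backuppc/cpool/3a/f2/3af2c9...   # hypothetical pool path
    /usr/share/backuppc/bin/BackupPC_zcat "$POOLFILE" > recovered-dump.sql

But mapping a file inside a backup to its pool hash still needs the pc/ 
metadata, which is why I'm asking.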



From: Craig Barratt via BackupPC-users 
Sent: Thursday, October 11, 2018 20:01
To: General list for user discussion, questions and support 

Cc: Craig Barratt 
Subject: Re: [BackupPC-users] syncing local and cloud backups



I'd recommend just using rsync if you want to make a remote copy of the cpool, 
pc, and conf directories to a place that BackupPC doesn't back up.
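
A minimal sketch, assuming a Debian-style layout (adjust paths to your install):

    # copy the pool, per-host trees, and config to a location that is
    # outside every host's backup scope, so the copy never backs itself up
    rsync -aH --delete /var/lib/backuppc/cpool /var/lib/backuppc/pc \
        /etc/backuppc remote:/srv/backuppc-offsite/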



Craig



On Thu, Oct 11, 2018 at 10:22 AM Mike Hughes <m...@visionary.com> wrote:

Hi BackupPC users,

Similar questions have come up a few times but I have not found anything 
relating to running multiple pools. Here's our setup:
- On-prem dev servers backed up locally to BackupPC (4.x)
- Prod servers backed up in the cloud to a separate BackupPC (4.x) instance

I'd like to provide disaster recovery options by syncing the dedup'd pools from 
on-prem to cloud and vice versa, but this would create an infinite loop. Is it 
possible to place the off-site data into a separate cpool which I could exclude 
from the sync? It would also be nice to be able to extract individual files 
from the synced pool without having to pull down the whole cpool and reproduce 
the entire BackupPC server.
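
For example, if each side kept the other's copy under its own top-level 
directory and excluded that directory from its outbound sync, the loop would 
break (the layout below is hypothetical):

    # on-prem -> cloud; both sides exclude replica-in/ from what they send,
    # so the two replicas never recurse into each other
    rsync -aH --exclude='/replica-in/' /var/lib/backuppc/ \
        cloud:/var/lib/backuppc/replica-in/onprem/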

How do others manage on-prem and off-site backup synchronization?
Thanks,
Mike


___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] rsync_bpc: write failed on "/path/dump.sql": Quota exceeded (122)

2018-10-13 Thread Craig Barratt via BackupPC-users
Please Google your OS type + "quota". Here's a tutorial for Ubuntu / Debian.
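
A few generic commands are worth trying first; the btrfs line is an assumption 
about Synology volumes, which are typically btrfs with share quotas implemented 
as qgroups:

    df -h /var/lib/backuppc           # rule out plain disk-full first
    quota -s                          # per-user quotas (needs the quota tools)
    sudo repquota -a                  # report quotas on all quota-enabled mounts
    sudo btrfs qgroup show /volume1   # Synology/btrfs share quotas (assumption)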

Craig

On Sat, Oct 13, 2018 at 3:41 AM Oliver Lippert wrote:

> Hey there,
>
> I have been running BackupPC for a while now, and for some time I have been
> getting errors in the backups of big files (10 GB to 50 GB).
>
>
>
> […]
>
> rsync_bpc: write failed on "/path/dump.sql": Quota exceeded (122)
>
> […]
> rsync_bpc: failed to open "/path/mysql/ibdata1", continuing: Quota exceeded (122)
>
> […]
>
> rsync error: error in file IO (code 11) at receiver.c(391) [receiver=3.0.9.12]
>
>
>
> I searched the web to figure out which quota I have to configure, but I
> did not find anyone else having this problem.
>
>
>
> I run BackupPC in a Docker container on a Synology DS918+. There is
> enough disk space available.
>
>
>
> I appreciate any information / links.
>
>
>
> --
>
> Regards,
>   Oliver Lippert – Lipperts WEB
>
>
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] fatal error during xfer (tar:632)

2018-10-13 Thread Craig Barratt via BackupPC-users
Paul,

Sorry, I really misinterpreted the error message ("tar_process done, err =
0") you sent.  It's from smbclient, not tar.

What version of smbclient are you running?  What is the smbclient command
line (near the top of the XferLOG file)?  What are the settings
for $Conf{SmbClientFullCmd} and $Conf{SmbClientIncrCmd}?  Do both
incremental and full backups fail with the same error?
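
For reference, something like the following gathers all of that; the paths are 
typical Debian locations and HOST is a placeholder:

    smbclient -V
    # XferLOG files are compressed; BackupPC_zcat can dump them
    /usr/share/backuppc/bin/BackupPC_zcat \
        /var/lib/backuppc/pc/HOST/XferLOG.0.z | head -n 20
    grep -rn 'SmbClient' /etc/backuppc/   # SmbClientFullCmd / SmbClientIncrCmd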

Craig

On Fri, Oct 12, 2018 at 5:46 AM Holger Parplies  wrote:

> Hi,
>
> Paul Littlefield wrote on 2018-10-12 12:22:48 + [Re: [BackupPC-users] fatal error during xfer (tar:632)]:
> > On 11/10/2018 23:44, Craig Barratt wrote:
> > >[...]
> > >What version of tar are you using?  Can you run gnu tar instead (that's the default under cygwin)?
> > >[...]
> >
> > $ apt-cache policy tar
> > tar:
> >   Installed: 1.28-2.1ubuntu0.1
> >   Candidate: 1.28-2.1ubuntu0.1
> >   Version table:
> >  *** 1.28-2.1ubuntu0.1 500
> > 500 http://gb.archive.ubuntu.com/ubuntu xenial-updates/main amd64 Packages
> > 500 http://security.ubuntu.com/ubuntu xenial-security/main amd64 Packages
> > 100 /var/lib/dpkg/status
> >  1.28-2.1 500
> > 500 http://gb.archive.ubuntu.com/ubuntu xenial/main amd64 Packages
>
> you're running Ubuntu on the Windoze host? Amazing ...
>
> Regards,
> Holger
>
>
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


[BackupPC-users] rsync_bpc: write failed on "/path/dump.sql": Quota exceeded (122)

2018-10-13 Thread Oliver Lippert
Hey there,

I have been running BackupPC for a while now, and for some time I have been
getting errors in the backups of big files (10 GB to 50 GB).

 

[…]

rsync_bpc: write failed on "/path/dump.sql": Quota exceeded (122)

[…]
rsync_bpc: failed to open "/path/mysql/ibdata1", continuing: Quota exceeded (122)

[…]

rsync error: error in file IO (code 11) at receiver.c(391) [receiver=3.0.9.12]
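
(For what it's worth, error 122 is the Linux errno EDQUOT, i.e. a genuine quota 
limit rather than a full disk; the kernel headers confirm the mapping:)

    # EDQUOT is errno 122 ("Disk quota exceeded") on Linux
    grep -n 'EDQUOT' /usr/include/asm-generic/errno.h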

 

I searched the web to figure out which quota I have to configure, but I did
not find anyone else having this problem.

 

I run BackupPC in a Docker container on a Synology DS918+. There is enough
disk space available.

 

I appreciate any information / links.

 

--

Regards,
  Oliver Lippert - Lipperts WEB



 

___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/