Re: [BackupPC-users] Backuppc in large environments

2020-12-02 Thread Daniel Berteaud
- On 2 Dec 20, at 12:53, Dave Sherohman wrote: 

> - I'm definitely backing up the VMs as individual hosts, not as disk image
> files. Aside from minimizing atomicity concerns, it also makes single-file
> restores easier and, in the backuppc context, I doubt that deduplication would
> work well (if at all) with disk images.
It's possible to have dedup for huge files changing randomly, but it's a bit 
tricky ;-) 
I use this for some VM image backups: 

* Suspend the VM 
* Take an LVM snapshot (if available) 
* Resume the VM if a snapshot was taken (almost no downtime) 
* Mount the snapshot with chunkfs [0], which makes the big file appear 
as a lot of small chunks 
* Use BackupPC to backup the chunks 
* Resume the VM if no snapshot was taken (in which case there was downtime) 
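The steps above could be scripted roughly like this. This is only a sketch: the VM, volume and mount-point names are placeholders, and the exact chunkfs invocation is an assumption to check against its documentation.

```shell
#!/bin/sh
# Sketch of the suspend / snapshot / chunkfs backup flow described above.
# VM, DISK, SNAP, MNT and the chunkfs call syntax are all assumptions.
set -eu

VM="myvm"
DISK="/dev/vg0/myvm-disk"        # LV backing the VM image
SNAP="${VM}-bkpsnap"
MNT="/mnt/chunks/${VM}"

# 2MB chunks work well for dedup with BackupPC v4
chunk_mb_to_bytes() { echo $(( $1 * 1024 * 1024 )); }

backup_prepare() {
    virsh suspend "$VM"
    if lvcreate -s -n "$SNAP" -L 5G "$DISK"; then
        virsh resume "$VM"               # snapshot taken: almost no downtime
        mkdir -p "$MNT"
        # Expose the snapshot as many small chunk files (syntax illustrative)
        chunkfs "/dev/vg0/$SNAP" "$MNT"
        # ... BackupPC then backs up $MNT as a regular share ...
    fi
    # If no snapshot could be taken, BackupPC would back up the raw image
    # and the VM would only be resumed afterwards (downtime).
}

# Guarded so the sketch can be sourced without touching any VM:
[ "${RUN_VM_BACKUP:-0}" = "1" ] && backup_prepare || true
```

In practice this would run as a BackupPC pre-dump command, with the resume in the post-dump hook.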

With this you have dedup, and you can choose the granularity (with BackupPC v4, 
I use 2MB chunks). It requires a few more steps to restore though: 

* Mount the backup tree with fuse-backuppcfs 
* From this mount point, re-assemble the chunks into one virtual huge file 
(still with chunkfs, which does the reverse operation) 
* You can now copy the image file where you want, and unmount the two 
stacked fuse mount points when done 
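The restore path could look like the sketch below, under the same caveats: all paths are placeholders, and the backuppcfs.pl and chunkfs invocations are assumptions to verify against the tools' docs.

```shell
#!/bin/sh
# Sketch of re-assembling a chunked image from a BackupPC backup tree.
# Paths and tool invocations are assumptions, not exact syntax.
set -eu

BKP_MNT="/mnt/bpc"               # backup tree exposed by fuse-backuppcfs
IMG_MNT="/mnt/image"             # re-assembled virtual image

restore_image() {
    mkdir -p "$BKP_MNT" "$IMG_MNT"
    backuppcfs.pl "$BKP_MNT"                         # 1. mount the backup tree
    chunkfs "$BKP_MNT/myvm/chunks" "$IMG_MNT"        # 2. reverse: chunks -> one file
    cp "$IMG_MNT/myvm-disk" /var/lib/libvirt/images/ # 3. copy the image out
    fusermount -u "$IMG_MNT"                         # 4. unmount both stacked
    fusermount -u "$BKP_MNT"                         #    fuse mount points
}

[ "${RUN_RESTORE:-0}" = "1" ] && restore_image || true
```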

All this workflow can be seen in my virt-backup script [1], which is a helper 
for BackupPC to back up libvirt-managed VMs. 

The same can be done with some scripting for any large binary file or block 
device. 

Cheers, 
Daniel 

[0] http://chunkfs.florz.de/ 
[1] https://git.fws.fr/fws/virt-backup 

-- 

Daniel Berteaud 
FIREWALL-SERVICES SAS, La sécurité des réseaux 
Société de Services en Logiciels Libres 
Tél : +33.5 56 64 15 32 
Matrix: @dani:fws.fr 
https://www.firewall-services.com 
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: https://github.com/backuppc/backuppc/wiki
Project: https://backuppc.github.io/backuppc/


Re: [BackupPC-users] Backuppc in large environments

2020-12-01 Thread Daniel Berteaud
- On 1 Dec 20, at 16:33, Dave Sherohman dave.sheroh...@ub.lu.se wrote:

> 
> Is this something that backuppc could reliably handle?
> 
> If so, what kind of CPU resources would it require?  I've already got a
> decent handle on the network requirements from observing the current TSM
> backups and can calculate likely disk storage needs, but I have no idea
> what to expect the backup server to need in the way of processing power.
> 

While not as big as yours, I manage a reasonably big BackupPC server, on a 
single box. It's backing up 193 hosts in total; the pool is ~15TB, ~27 million 
files. The hosts are a mix of a lot of different stuff (mostly VMs, but also a 
few appliances and physical servers), with various backup frequency and history 
configs. Most are backed up daily, but some are weekly. It usually represents 
between 200 and 600GB of new data per day.

I'm running this on a single box with these specs:
  * CPU Intel Xeon D-1541 @ 2.10GHz
  * 32GB of RAM
  * 2x120GB SSD for the OS (CentOS 7)
  * 4x12TB SATA in a ZFS pool (~ RAID10)

I'm using the lz4 compression provided by ZFS, so turned the BackupPC one off.
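For reference, that setup might be expressed like this (pool/dataset names are placeholders, and the config line is the standard BackupPC compression knob):

```shell
# ZFS side: let the filesystem compress transparently
zfs set compression=lz4 tank/backuppc

# BackupPC side (config.pl): disable BackupPC's own compression,
# so CPU isn't spent recompressing data ZFS already handles:
#   $Conf{CompressLevel} = 0;
```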

While I do see some slowness from time to time, it's working well. Long story 
short: don't bother with CPU. Except for the very first backups, where it can 
be a bottleneck, disk I/O is what will limit general speed. Spend more on fast 
disks or SSDs. If using ZFS, NVMe as a slog can help (or as a special metadata 
vdev, although I haven't tested that yet). And get as much RAM as you can. With 
what you have left, choose a decent CPU, but don't spend too much on it.

Cheers,
Daniel



Re: [BackupPC-users] FEATURE REQUEST: More robust error reporting/emailing

2020-06-25 Thread Daniel Berteaud
- On 25 Jun 20, at 15:42, backu...@kosowsky.org wrote:

> 
> Helpful configurable options would include:
> - *Days* since last successful backup - *per host* configurable - as you
>  may want to be more paranoid about certain hosts versus others while
>  others you may not care if it gets backed up regularly and you want
>  to avoid the "nag" emails
> 
> - *Number* of errors in last backup - *per host/per share*
>  configurable - Idea being that some hosts may naturally have more
>  errors due to locked files or fleeting files while other shares may
>  be rock stable. (Potentially, one could even trigger on types of errors
>  or you could exclude certain types of errors from the count)
> 
> - *Percent* of files changed/added/deleted in last backup relative to
>  prior backup - *per host/per share* configurable - idea here being
>  that you want to be alerted if something unexpected has changed on
>  the host which could even be dramatic if a share has been damaged or
>  deleted or not mounted etc.
> 
> Just a thought starter... I'm sure others may have other ideas to add...
> 


I do all of this with Zabbix and some custom scripts: 
https://git.fws.fr/fws/zabbix-agent-addons/src/branch/master/zabbix_scripts

  * Alert if no backup since $Conf{EMailNotifyOldBackupDays}
  * Alert if Xfer error > threshold
  * Alert if new file size seems abnormal (too small or too big)
  * Graph some data about space consumption/compression efficiency
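As an illustration of the first check, the "days since last backup" can be derived from BackupPC's per-host "backups" file. This is a sketch: I'm assuming the backup start time is the 3rd tab-separated field of the last line, which should be verified against your version's documentation.

```shell
#!/bin/sh
# Sketch: age in days of the most recent backup, parsed from a BackupPC
# per-host "backups" file. The field layout is an assumption to verify.
set -eu

last_backup_age_days() {
    backups_file="$1"     # e.g. <topdir>/pc/<host>/backups (illustrative path)
    now="$2"              # current time, epoch seconds
    # Last line = most recent backup; field 3 assumed to be its start time
    start=$(awk -F'\t' 'END { print $3 }' "$backups_file")
    echo $(( (now - start) / 86400 ))
}
```

A monitoring check would then compare the result against $Conf{EMailNotifyOldBackupDays} and raise an alert above the threshold.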

Maybe it can help ;-)

++



Re: [BackupPC-users] V4 - eventual effects of wrong sequence when removing Host

2020-04-20 Thread Daniel Berteaud


- On 18 Apr 20, at 18:00, Michael Stowe michael.st...@member.mensa.org 
wrote:

> On 2020-04-17 03:20, R.C. wrote:
>> Hi all
>> 
>> Is the following Host remove sequence correct?
>> - remove Host backups (one by one) with BackupPC_backupDelete
>> - run BackupPC_nightly 0 255
>> - remove host from hosts file
>> - reload server configuration
>> - remove pc/ folder manually
>> 
>> What would happen if a wrong sequence is followed?
>> For example:
>> - remove host from hosts file
>> - reload server configuration
>> - remove pc/ folder manually
>> - run BackupPC_nightly 0 255
>> 
>> Would this leave orphan files in the pool?
>> 
>> Thank you
>> 
>> Raf
> 
> I usually follow this sequence:
> 
> - remove host from hosts file
> 
> That's it.  The nightly job will usually take care of the rest, although
> you can speed things up by running steps manually if you really need the
> space.
> 

You also need to delete the backup tree in /var/lib/BackupPC/pc/ (or 
whatever dir is used). Space won't be released as long as the backups are still 
referenced there, and BackupPC does not delete this dir when removing the host 
from the hosts file.
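Putting the safe sequence together might look like the sketch below. Host name and paths are placeholders, and the exact BackupPC_backupDelete options should be checked against your install before use.

```shell
#!/bin/sh
# Sketch of a complete host removal (v4). Names and paths are placeholders.
set -eu

HOST="oldhost"
TOPDIR="/var/lib/BackupPC"

remove_host() {
    # 1. Delete each backup (numbers come from the host's backups file)
    for num in $(awk -F'\t' '{ print $1 }' "$TOPDIR/pc/$HOST/backups"); do
        BackupPC_backupDelete -h "$HOST" -n "$num"
    done
    # 2. Let the nightly job drop the now-unreferenced pool files
    BackupPC_nightly 0 255
    # 3. Remove the host from the hosts file, reload, then delete the pc/ dir
    #    (BackupPC does not remove it for you; space stays used until then)
    rm -rf "$TOPDIR/pc/$HOST"
}

[ "${RUN_REMOVE:-0}" = "1" ] && remove_host || true
```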

++



[BackupPC-users] missing pool file error, for pool files which are not missing

2020-01-14 Thread Daniel Berteaud
Hi. 

I'm running BackupPC 4.3.1 (with rsync-bpc 3.1.2.1 and BackupPC::XS 0.59). 
For a few months now, I've been getting errors when the nightly cleanup starts: 



2020-01-14 10:00:09 admin1 : BackupPC_refCountUpdate: missing pool file 8017826edb68ca805238ca4ed9467a2b count 1
2020-01-14 10:00:09 admin1 : BackupPC_refCountUpdate: missing pool file 81e1541a0f2c9445d7e1833736ef96d2 count 1
2020-01-14 10:00:09 admin1 : BackupPC_refCountUpdate: missing pool file 81f8359f17fd7d1070f0e49ae3654c8d count 1
2020-01-14 10:00:09 admin1 : BackupPC_refCountUpdate: missing pool file 80aa2f7e12de3111fd5e40bb01d31d77 count 1
2020-01-14 10:00:09 admin1 : BackupPC_refCountUpdate: missing pool file 811b2a6ebdb610c76d14bc18a381e21a count 1
2020-01-14 10:00:09 admin1 : BackupPC_refCountUpdate: missing pool file 80fd71163afff0ebaff9ae4003b26181 count 1
[...] 

(2689 of these errors) 


I'm currently checking which hosts/backups are affected, but I don't really 
understand what it means. Those files do exist in my cpool, e.g.: 




[root@bkp1 cpool]# ll /var/lib/BackupPC/cpool/80/16/8017826edb68ca805238ca4ed9467a2b 
-r--r--r--. 1 backuppc backuppc 118 5 juil. 2018 /var/lib/BackupPC/cpool/80/16/8017826edb68ca805238ca4ed9467a2b 

The file exists and is readable by backuppc. 

This instance was installed 18 months ago (a BackupPC v3 install, then upgraded 
through various v4 versions up to the latest as of today). It was upgraded to 
the latest versions of BackupPC's components in April 2019. Those errors only 
appeared recently, so it doesn't look like the echo of an old, already-fixed 
BackupPC bug. Also, all of the affected hosts had a full backup (which doesn't 
change anything; those errors continue to appear). 

What does this error really mean? Is there a way to fix this? How can I be 
sure all my backups are OK? 

Cheers, 
Daniel 




Re: [BackupPC-users] Win10 rsync and shadow

2020-01-13 Thread Daniel Berteaud
- On 12 Jan 20, at 0:14, Michael Stowe wrote: 

> On 2020-01-09 12:23, Greg Harris wrote:
>> Anyone tried enabling the in-built Win10 SSH server and then using straight
>> rsync, not rsyncd, with cygwin-rsync client?
>> Additionally, and maybe I should create a separate thread, but does the
>> cygwin-rsync client that's bundled on the GitHub page use the shadow volume? 
>> I
>> can't see anywhere that it discusses this, nor find any forum posts. However,
>> this forum post says there are at least two solutions for doing shadow copies
>> with rsync:
>> [
>> https://u2182357.ct.sendgrid.net/wf/click?upn=rBK8reUlX8Sxr7Iz1fV-2F7YQ8BYSHbrWZS1jKQzKBLmHDvD-2BcaXOPRSW8DuNIHKGhDNdyUJdGmEZGMgRHKaX7P1K7kmZ1gTYsRz7rQadIHYI-3D_OypFYCWzG5ApGW-2FFpGTxc4RCS9eud0Dl1htN5rYoUZ8To4zeNUFBkAGI3hzer91C9VErJ0fkm9L7N1YEY7J0k6aPdAmMs-2BUPmYfHrUS-2BlNGF2-2B-2Bjsg0MpoOjaNdyNexmche5RLR2g94Z5b-2FiwuldbCH5OCZR0Os-2B0E01CPwXp3bdBoM2ptr4QH8NAxvo6adkVbEoHUAKboAHCCvuFjvzC-2BJgw5HSFku9S1b0ofdK5eXLc1XEdceiqbpealdu3-2BRbeBJr9ovw42-2F0a2d2-2Bi7uhA-3D-3D
>> | https://sourceforge.net/p/backuppc/mailman/message/36519429/ ]
>> Yet, searching around, I've only been able to find:
>> [
>> https://u2182357.ct.sendgrid.net/wf/click?upn=rBK8reUlX8Sxr7Iz1fV-2F7Zaxtwab-2FG40JLJH69HA23Ye53aTPmpurGS-2FLnwtQoz5_OypFYCWzG5ApGW-2FFpGTxc4RCS9eud0Dl1htN5rYoUZ8To4zeNUFBkAGI3hzer91C9VErJ0fkm9L7N1YEY7J0k4BxpvT-2BOvk3IjoUqhQ8jDF-2FE9nj4OCuhhb9No1f8TQctBDSZ6BaHSqje6kX3UdYOrXhDFt-2FJI-2BW8P9rmJEM9nhWqT0EHtTrygEYDAZr-2B0a9FaejogzoMSH904U-2B1-2BeH5qS3rUAqAQV9B5vBDTTVGRyzfOXqtE-2B7f5v93Cg8QLaGh7LlpOXCmHlKDwC-2FN-2BhJVA-3D-3D
>> | https://www.michaelstowe.com/backuppc/ ]
>> Thoughts? Ideas? Tomatoes?

>> Thanks,
>> Greg Harris

> It has been a while since I tried it, but last time I did, I had trouble with
> permissions (i.e., there were many files that it simply did not allow reading)
> and with shadow volumes -- there's a bit of a Catch-22 where the shadow volume
> is unusable unless it's created before the ssh service is started, which kind
> of defeats the point of using it to control the shadow volume.
To work with transient shadow volumes, I created a simple rsync wrapper. It 
uses Michael Stowe's scripts (https://www.michaelstowe.com/backuppc/, with some 
slight modifications). I'm not using it with the native Win10 SSH server, but 
with the Win32-OpenSSH port (https://github.com/PowerShell/Win32-OpenSSH). 
So yes, I use straight rsync over ssh to back up Windows hosts, with VSS 
support. The wrapper script makes it simple to use, with any pre/post backup 
action, and without the hassle of persistent shadow copy management. 

My script is at 
https://git.fws.fr/fws/wapt-backuppc-agent/src/branch/master/rsync.cmd 

Cheers, 
Daniel 



Re: [BackupPC-users] Force pool cleanup

2019-12-20 Thread Daniel Berteaud
You can use:

sudo -u backuppc /usr/share/BackupPC/bin/BackupPC_serverMesg BackupPC_nightly run

With this, you don't have to stop BackupPC; you just ask the BackupPC daemon to 
start a pool cleanup right now.

- On 19 Dec 19, at 16:08, Gandalf Corvotempesta 
gandalf.corvotempe...@gmail.com wrote:

> Hi to all.
> Any command to run manually to force deletion of "expired" files from
> pool to free up disk space?
> 
> I'm running the nightly on a very very low schedule , but right now I
> have to run it to clean up as much as possible
> 
> Any hint ?
> 



Re: [BackupPC-users] Backing up LUKS encrypted home folder

2019-10-29 Thread Daniel Berteaud


- On 29 Oct 19, at 16:42, jason ja...@jasonlocascio.com wrote:

> 
> The only wrinkle - but I can live with it - is I would like to have /
> backed up as well 

The fact that changing the share from / to /home/mydir fixes it seems to 
confirm the issue is with --one-file-system.
Have you tried running the dump from the CLI to check that the rsync args are 
passed as expected?

++



Re: [BackupPC-users] Backing up LUKS encrypted home folder

2019-10-28 Thread Daniel Berteaud
- On 28 Oct 19, at 10:11, jason ja...@jasonlocascio.com wrote:

> 
> OK, I thought it was LUKS - it was setup when I installed Mint via the
> installer.
> 
> Either way, the question is how do I get back to it backing up in
> unencrypted form ? Nothing on the laptop has changed - it's BackupPC
> which is behaving differently ?

My guess is that you left --one-file-system enabled (which is present in v4), 
so rsync skips the "clear" data and instead backs up its encrypted form 
underneath.

++ 



Re: [BackupPC-users] Backing up LUKS encrypted home folder

2019-10-28 Thread Daniel Berteaud


- On 27 Oct 19, at 19:03, Jason LoCascio ja...@jasonlocascio.com wrote:

> For some reason switching from BackupPC v3 to v4 has changed the way my
> LUKS encrypted laptop is backed up.
> 
> Previously, the backups in BackupPC were unencrypted (how I wanted
> them).
> 
> Now, there are just loads of entries like:
> 
> ECRYPTFS_FNEK_ENCRYPTED.FWaQ8pHwn1K4UkSWv5abeCSP.prlD68Y8MJK-
> cb.p7ggEFVH4QiIMNrvBE--

This looks like an ecryptfs setup, not LUKS. Not sure about ecryptfs, but I 
think you have an encrypted store which you mount decrypted somewhere else, no?

++




Re: [BackupPC-users] Multiple CPU usage for compression?

2019-10-02 Thread Daniel Berteaud
- On 2 Oct 19, at 18:51, p2k-...@roosoft.ltd.uk wrote:

> On 02/10/2019 15:46, Daniel Berteaud wrote:
>> - On 1 Oct 19, at 10:51, p2k-...@roosoft.ltd.uk wrote:
>>
>>> Hmmm, I am not so sure about that... because it appears the time it
>>> takes to compress files also slows down the transfer of them. I was
>>> getting like 6Mb/s from a server on the same switch as the backup
>>> machine. One CPU out of 16 was pegged at 100% under compression.
>> How do you know compression is the bottleneck ?
> 
> 
> I happened to be watching htop at the time. I was surprised to see only
> one core pegged at 100%.

That doesn't mean the process is busy only doing compression (it might be, but 
it could be doing something else).


> 
>>> Surely it would be
>>> trivial to replace gzip with pigz and bzip2 with pbzip2?
>> BackupPC does not use an external binary to compress data, so no, it
>> wouldn't be as trivial as s/gzip/pigz/
>>
> 
> Oh? Then why is there a config variable for the gzip path? What is it
> used for if not compression?
> 
> Curious.

It's for compression of archives, not pooled files.

++



Re: [BackupPC-users] Multiple CPU usage for compression?

2019-10-02 Thread Daniel Berteaud

- On 1 Oct 19, at 10:51, p2k-...@roosoft.ltd.uk wrote:

> Hmmm, I am not so sure about that... because it appears the time it
> takes to compress files also slows down the transfer of them. I was
> getting like 6Mb/s from a server on the same switch as the backup
> machine. One CPU out of 16 was pegged at 100% under compression.

How do you know compression is the bottleneck ?

> Surely it would be
> trivial to replace gzip with pigz and bzip2 with pbzip2?

BackupPC does not use an external binary to compress data so no, it wouldn't be 
as trivial as s/gzip/pigz/



Re: [BackupPC-users] Bug in backuppcfs.pl?

2019-05-13 Thread Daniel Berteaud
On 2019-05-02 06:56, backu...@kosowsky.org wrote:

> Listing 'ls' directories gives error messages even though the listing
> is otherwise correct...
>
> I mount the fusefs as follows:
> sudo backuppcfs.pl /mnt/temp
>
> Then when I run:
> ls /mnt/temp
>
> I get lines of the form:
> ls: /mnt/temp/machine1: No such file or directory
> ls: /mnt/temp/machine2: No such file or directory
> ls: /mnt/temp/machine3: No such file or directory
> ls: /mnt/temp/machine3: No such file or directory
> total 3
> drwxr-xr-x 25 root root 1024 Dec 31  1969 machine1/
> drwxr-xr-x  5 root root 1024 Dec 31  1969 machine2/
> drwxr-xr-x 18 root root 1024 Dec 31  1969 machine3

Same behavior here, on CentOS 7. It seems to be working fine though. I do not 
know FUSE enough to debug (well, at least I haven't taken the time to analyze). 

++ 



Re: [BackupPC-users] HIGH PRIORITY BUG FIX in v4.x that creates corruption when creating/reading/migrating v3 backups

2019-04-08 Thread Daniel Berteaud
On 2019-04-08 02:23, Craig Barratt via BackupPC-users wrote:

> Jeff,
>
> Thanks for tracking this down and proposing the correct fix. I've pushed
> the fix to git, and released a new BackupPC-XS-0.59.tar.gz on both git
> and cpan.

I'll update ASAP, but what are the concrete risks of running 0.58 and earlier? 
I have a few installs upgraded from v3; are the new backups made after the 
upgrade safe? 

Cheers, 
Daniel


  



Re: [BackupPC-users] Xfer errors (especially bpc_sysCall_checkFileMatch(/path/to/file): file doesn't exist)

2019-04-05 Thread Daniel Berteaud
Thanks for your response Michael, some comments inline.

On 2019-04-04 23:22, Michael Stowe wrote:

> On 2019-04-03 05:02, Daniel Berteaud wrote:
>
>> First, as soon as there's a single error (most common are "file has
>> vanished"), BackupPC reports 3 errors.
>>
>> Often, the backup reports an error count of 2, with these kinds of
>> errors:
>>
>> [ 213 lines skipped ]
>> rsync_bpc: stat "/.config/google-chrome/CertificateRevocation/5022"
>> failed: No such file or directory (2)
>> rsync_bpc: stat
>> "/.config/google-chrome/CertificateRevocation/5022/_metadata" failed:
>> No such file or directory (2)
>> rsync_bpc: stat "/.config/google-chrome/CertificateRevocation/5023"
>> failed: No such file or directory (2)
>> rsync_bpc: stat
>> "/.config/google-chrome/CertificateRevocation/5023/_metadata" failed:
>> No such file or directory (2)
>
> These files seem likely to have been updated or removed during the
> backup run. That directory contains bookmarks, cookies, extensions, and
> all kinds of things one might expect to change.

My log snippets are just examples. I know this directory contains moving 
parts; I'm just trying to understand why I get this error while the mentioned 
dirs exist both on the source and in the backup. To me this makes no sense: 
either the dir doesn't exist during the scan and rsync won't try to transfer 
it, or the dir exists and rsync transfers it, or the dir exists during the 
scan but disappears before rsync transfers it. In this last case I'd expect 
this kind of error in the logs, but the directory should not exist in the 
backup. 

> The symptoms here and the directory in question match the behavior that
> is expected when backing up a Samba share used by Windows systems with
> access to a recycle bin. Windows will rename a file into the recycle
> bin, so you'll see this when you back up the original file -and- the
> recycle bin. I can't imagine why one would normally want to back up a
> recycle bin, since presumably one would have already backed up the file
> before it was deleted, so it's probably simplest just to exclude the bin
> from the backups. Alternatively, one can remove the bin (or access to
> it) so that Windows clients delete the original files, though that also
> makes backup of the recycle bin somewhat pointless.

This was also just the first occurrence of the error I found in my logs; I 
have the same errors for other files. Besides, in this case, from BackupPC's 
POV it doesn't matter if the directory is used as a Recycle Bin by Samba. It's 
just a directory where files which were previously elsewhere have been moved. 
It could have been done by a human intervention in a FooBar directory. 

I'm just trying to understand what the error means. 



Re: [BackupPC-users] Xfer errors (especially bpc_sysCall_checkFileMatch(/path/to/file): file doesn't exist)

2019-04-04 Thread Daniel Berteaud
On 04/04/2019 at 16:20, G.W. Haywood via BackupPC-users wrote:
> Hi there,


Hi, thanks for your response :-)


>
> Like you, after using V3 for many years I'm playing with V4, but only
> seriously since about August 2018.  There were a few hiccups to begin
> with, but I don't see the kinds of errors that you're seeing.
>
> No hosts here use the 'rsync' XferMethod.  I've grepped XferLOGs for
> the past year for errors on hosts which use the 'rsyncd' XferMethod,
> and apart from a few expected errors where machines have been shut
> down mid-backup, nothing is making itself obvious.


Maybe the issue only manifests itself when using the rsync transport then
(I use almost exclusively rsync over ssh).


Craig, if you're around, could you explain what those errors mean?


++






Re: [BackupPC-users] Xfer errors (especially bpc_sysCall_checkFileMatch(/path/to/file): file doesn't exist)

2019-04-03 Thread Daniel Berteaud
Am I the only one having these kinds of errors, or the only one who cares
about them?


Cheers

Daniel

On 12/03/2019 at 08:56, Daniel Berteaud wrote:
>
> Hi.
>
> I've been using BackupPC for the past 13 years (starting with 2.1.2
> I think), with great success. I'm now running a few installations with
> v4, and one of the things I don't understand is the Xfer errors. I
> regularly check Xfer errors to be sure my backups are OK. Up to v3, I
> could reliably check the number of Xfer errors, but in v4, I don't
> understand them.
>
> First, as soon as there's a single error (most common are "file has
> vanished"), BackupPC reports 3 errors.
>
> Often, the backup reports an error count of 2, with these kinds of errors:
>
> [ 213 lines skipped ]
> rsync_bpc: stat "/.config/google-chrome/CertificateRevocation/5022"
> failed: No such file or directory (2)
> rsync_bpc: stat
> "/.config/google-chrome/CertificateRevocation/5022/_metadata" failed:
> No such file or directory (2)
> rsync_bpc: stat "/.config/google-chrome/CertificateRevocation/5023"
> failed: No such file or directory (2)
> rsync_bpc: stat
> "/.config/google-chrome/CertificateRevocation/5023/_metadata" failed:
> No such file or directory (2)
> [ 106 lines skipped ]
>
> [...]
>
> rsync error: some files/attrs were not transferred (see previous
> errors) (code 23) at main.c(1676) [generator=3.1.2.0]
> rsync_bpc exited with benign status 23 (5888)
>
>
> This is quite confusing because, in this example, the directories do
> exist, both on the source and in the backup reporting these errors.
> So, what does this error mean? It looks like it happens when those
> directories did not exist in the previous backup (the one against which
> rsync is transferring deltas), but why would this be an error?
>
> Last, and the most important to me, I have some hosts with hundreds of
> Xfer errors like these :
>
>
> [ 53777 lines skipped ]
> R bpc_sysCall_checkFileMatch(shares/assistantedirection/files/Recycle
> Bin/BIBLIOTHEQUE/DOCUMENTATION/AMIANTE/Courrier FEDEREC-CRAMIF-Demande
> de conventionnement.pdf): file doesn't exist
> R bpc_sysCall_checkFileMatch(shares/assistantedirection/files/Recycle
> Bin/BIBLIOTHEQUE/DOCUMENTATION/BTP/Courrier FEDEREC-CRAMIF-Demande de
> conventionnement.pdf): file doesn't exist
> R bpc_sysCall_checkFileMatch(shares/assistantedirection/files/Recycle
> Bin/DOSSIERS COMMUNS GROUPE PENA/ACHATS/DEMANDES D'ACHAT/2019/2 FEV
> 19/110319CB - GESTECO Château de France.docx): file doesn't exist
> [ 161 lines skipped ]
>
> Those errors are not counted in the Xfer error count, but are displayed
> when opening the XferErr log. My C-fu is not good enough to understand
> exactly what's going on in
> https://github.com/backuppc/rsync-bpc/blob/master/bpc_sysCalls.c#L759
>
> Looks like this error can be logged when a file is moved (so it exists
> in the pool, but not at the same path in the previous backup against
> which we delta?)
>
> Can someone confirm this is harmless? If so, wouldn't it make sense to
> remove this from the XferErr log (or maybe only include it if
> XferLogLevel > 3 or something like that?)
>
>





[BackupPC-users] Xfer errors (especially bpc_sysCall_checkFileMatch(/path/to/file): file doesn't exist)

2019-03-12 Thread Daniel Berteaud
Hi.

I've been using BackupPC for the past 13 years (starting with 2.1.2 I
think), with great success. I'm now running a few installations with v4,
and one of the things I don't understand is the Xfer errors. I regularly
check Xfer errors to be sure my backups are OK. Up to v3, I could
reliably check the number of Xfer errors, but in v4, I don't understand
them.

First, as soon as there's a single error (the most common being "file has
vanished"), BackupPC reports 3 errors.

Often, the backup reports an error count of 2, with these kinds of errors:

[ 213 lines skipped ]
rsync_bpc: stat "/.config/google-chrome/CertificateRevocation/5022" failed: No such file or directory (2)
rsync_bpc: stat "/.config/google-chrome/CertificateRevocation/5022/_metadata" failed: No such file or directory (2)
rsync_bpc: stat "/.config/google-chrome/CertificateRevocation/5023" failed: No such file or directory (2)
rsync_bpc: stat "/.config/google-chrome/CertificateRevocation/5023/_metadata" failed: No such file or directory (2)
[ 106 lines skipped ]

[...]

rsync error: some files/attrs were not transferred (see previous errors)
(code 23) at main.c(1676) [generator=3.1.2.0]
rsync_bpc exited with benign status 23 (5888)


This is quite confusing because, in this example, the directories do
exist, both on the source and in the backup reporting these errors.
So, what does this error mean? It looks like it happens when those
directories did not exist in the previous backup (the one against which
rsync is transferring deltas), but why would this be an error?

Last, and the most important to me, I have some hosts with hundreds of
Xfer errors like these :


[ 53777 lines skipped ]
R bpc_sysCall_checkFileMatch(shares/assistantedirection/files/Recycle Bin/BIBLIOTHEQUE/DOCUMENTATION/AMIANTE/Courrier FEDEREC-CRAMIF-Demande de conventionnement.pdf): file doesn't exist
R bpc_sysCall_checkFileMatch(shares/assistantedirection/files/Recycle Bin/BIBLIOTHEQUE/DOCUMENTATION/BTP/Courrier FEDEREC-CRAMIF-Demande de conventionnement.pdf): file doesn't exist
R bpc_sysCall_checkFileMatch(shares/assistantedirection/files/Recycle Bin/DOSSIERS COMMUNS GROUPE PENA/ACHATS/DEMANDES D'ACHAT/2019/2 FEV 19/110319CB - GESTECO Château de France.docx): file doesn't exist
[ 161 lines skipped ]

Those errors are not counted in the Xfer error count, but are displayed if
we open the XferErr log. My C-fu is not good enough to understand
exactly what's going on in
https://github.com/backuppc/rsync-bpc/blob/master/bpc_sysCalls.c#L759

It looks like this error can be logged when a file is moved (so it exists
in the pool, but not at the same path in the previous backup against
which we delta?)

Can someone confirm this is harmless? If so, wouldn't it make sense to
remove it from the XferErr log (or maybe only include it if XferLogLevel >
3, or something like that?)
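A quick way to gauge how many of these entries a given XferErr dump contains is a plain grep; the sample lines below are stand-ins for real log content (in practice you'd feed it the decompressed XferLOG of the host, whose path depends on your install):

```shell
# Build a tiny stand-in for an XferErr dump (real content would come from
# decompressing the host's XferLOG file).
printf '%s\n' \
  "R bpc_sysCall_checkFileMatch(shares/foo/a.pdf): file doesn't exist" \
  "an ordinary transfer line" \
  "R bpc_sysCall_checkFileMatch(shares/foo/b.pdf): file doesn't exist" \
  > /tmp/xferr.sample

# Count only the checkFileMatch entries.
grep -c 'bpc_sysCall_checkFileMatch' /tmp/xferr.sample   # prints 2
```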


Cheers,

Daniel

-- 

*Daniel Berteaud*

FIREWALL-SERVICES SAS.
Société de Services en Logiciels Libres
Tel : 05 56 64 15 32
Matrix: @dani:fws.fr
https://www.firewall-services.com/



Re: [BackupPC-users] logout button

2018-12-19 Thread Daniel Berteaud
On 19/12/2018 at 08:39, Alexey Safonov via BackupPC-users wrote:
> Hi guys
>
> is it possible to add logout button for backuppc interface?


I don't think so, as BackupPC doesn't handle authentication itself (it
just trusts the web server to do it). All that could be done is to add a
button redirecting to a specific URL, which could be caught by your
WebSSO system, if you have one, to close the SSO session.


++




Re: [BackupPC-users] IO error encountered -- skipping file deletion

2018-12-17 Thread Daniel Berteaud
On 17/12/2018 at 19:06, Mike Hughes wrote:
>
> Hi Daniel,
>
>  
>
> It sounds like the exclusion rules aren’t working as you expect. If
> they were I don’t think you’d see the errors even if the pool files
> had a problem since they’d never be compared. I ran into problems when
> I set up mine too. If I recall my confusion was around identifying the
> share name in the exclusion rule. I often use an asterisk to cover all
> shares so my exclusion statements look like this:
>

My exclusions work, but that's irrelevant anyway, because the file listed
as having an error doesn't exist anymore.

BackupPC runs rsync with --delete-excluded, which is why even excluded
directories will trigger this. The problem occurs when rsync_bpc tries to
delete the file from the pool.

Regards,

Daniel




[BackupPC-users] IO error encountered -- skipping file deletion

2018-12-17 Thread Daniel Berteaud
Hi.


I'm running BackupPC v4.3.0 on CentOS 7 for a bit more than 200 hosts.
Most use the rsync Xfer method and are working fine.

But for two of them, since last week every new backup has reported 3 and 6
Xfer errors respectively (no matter whether I start a new full or incr
backup).

Eg:


[ 57 lines skipped ]
file has vanished: "/var/log/pve/tasks/6/UPID:pve3:597E:01554D97:5C0C5076:vzdump::root@pam:"
IO error encountered -- skipping file deletion
[ 82 lines skipped ]
rsync_bpc: fstat "/var/log/pve/tasks/6/UPID:pve3:597E:01554D97:5C0C5076:vzdump::root@pam:" failed: No such file or directory (2)
[ 5 lines skipped ]
rsync_bpc: fstat "/var/log/pve/tasks/E/UPID:pve3:5875:0162BE48:5C0C72DE:aptupdate::root@pam:" failed: No such file or directory (2)

Those files did exist when the first backup reporting the errors ran, but
they don't exist anymore. I've also tried excluding the affected
directories, but the issue remains. So I guess something is wrong in the
pool. How can I check this? And how can I fix it so it stops reporting
Xfer errors?


Regards,

Daniel




Re: [BackupPC-users] Disable hostname lookup check

2018-11-28 Thread Daniel Berteaud

On 26/11/2018 at 17:54, Daniel Berteaud wrote:
> On 23/11/2018 at 19:54, Craig Barratt via BackupPC-users wrote:
>>
>>   * set $Conf{NmbLookupFindHostCmd} to an empty string; if the DNS
>> lookup fails and $Conf{NmbLookupFindHostCmd} is empty, then
>> whatever host name you use should be used without a lookup error.
>>

Setting an empty $Conf{NmbLookupFindHostCmd} works as expected. Thanks
again Craig! :-)
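For reference, the setting confirmed above is a one-liner in config.pl (or the per-host config); this is the relevant fragment:

```perl
# Disable the NetBIOS fallback lookup. With this empty, a failing DNS
# lookup no longer aborts the backup, and the configured host name is
# used as-is.
$Conf{NmbLookupFindHostCmd} = '';
```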




Re: [BackupPC-users] Disable hostname lookup check

2018-11-26 Thread Daniel Berteaud
On 23/11/2018 at 19:54, Craig Barratt via BackupPC-users wrote:
> Daniel,


Hi Craig


>
> Here are two things to try (in theory either one should work):
>
>   * use an IP address (anything) instead of a bogus host name,
>

The hostname is not really bogus, as it's a valid one behind the bastion.
And it makes it easier to manage my .ssh/config on the BackupPC side.


>   * set $Conf{NmbLookupFindHostCmd} to an empty string; if the DNS
> lookup fails and $Conf{NmbLookupFindHostCmd} is empty, then
> whatever host name you use should be used without a lookup error.
>

Great, I'll test this ASAP and let you know.


Thanks for your answer :-)


Cheers,

Daniel




[BackupPC-users] Disable hostname lookup check

2018-11-23 Thread Daniel Berteaud
Hi there,

I have an issue where BackupPC can't back up some hosts anymore. Let me
explain:

  * I back up a bunch of servers behind bastion hosts. SSH tunneling is
configured in ~backuppc/.ssh/config
  * Some of the hosts behind the bastion do not have a corresponding
public DNS entry published
  * For all of my hosts, the name of the machine is meaningless. It's
something like client1_machine3. The DNS name is always set in the
ClientNameAlias
  * For hosts behind a bastion, I change the PingCmd to $pingPath -c 1 -w
3 bastion.host.net so the bastion is pinged and its latency can be
checked
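As an illustration of the tunneling setup above, a ~backuppc/.ssh/config stanza could look like this (all names here are made-up placeholders; ProxyJump needs OpenSSH 7.3+, older versions can use ProxyCommand with ssh -W instead):

```
# ~backuppc/.ssh/config -- illustrative only, host and target names are placeholders
Host client1_machine3
    HostName machine3.internal.client1.example
    ProxyJump backuppc@bastion.host.net
    User root
```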

Until now, I had a wildcard DNS CNAME entry for the domain of my backup
servers, which made everything work. But as I'm changing the backend,
and don't want a wildcard (* IN CNAME foo.bar) entry, I'm now facing the
issue that hosts behind a bastion, without a corresponding DNS entry,
won't back up anymore. The error is:


2018-11-23 08:59:53 Can't find host ldap.client1.fr via NS and netbios
2018-11-23 08:59:53 can't ping  (client = client1_ldap); exiting


Of course it can't find it, and it doesn't need to, because everything
is done through an SSH tunnel defined in .ssh/config.
With my previous setup, it would have looked up ldap.client1.fr, gotten
no answer, then looked up ldap.client1.fr.firewall-services.com, which
would get caught by the wildcard entry, and then the backup could run.
But the DNS answer itself was meaningless and not used at all.

Is there a way to disable this lookup in such cases?

Regards, Daniel





Re: [BackupPC-users] Large files with small changes

2018-11-20 Thread Daniel Berteaud
On 20/11/2018 at 18:39, Craig Barratt via BackupPC-users wrote:
> Steve,
>
> You are exactly right - BackupPC's storage granularity is whole
> files.  So, in the worst case, a single byte change to a file that is
> a unique will result in a new file in the pool.  Rsync will only
> transfer the deltas, but the full file gets rebuilt on the server.
>
> Before I did the 4.x rewrite, I did some benchmarking on block-level
> or more granular deltas, but the typical performance improvement was
> modest and the effort to implement it was large.  However, there are
> two cases where block-level or byte-level deltas would be very helpful
> - database files (as you mentioned) and VM images.
>
> Perhaps you could use $Conf{DumpPreUserCmd} to run a script that
> generates byte-level deltas, and exclude the original database files? 
> You could have a weekly schedule where you copy the full database file
> on, eg, Sunday, and generate deltas every other day of the week.  Then
> BackupPC will backup the full file once, and also grab each of the
> deltas.  That way you'll have a complete database file once per week,
> and all the daily (cumulative) deltas.


That's what I've been doing for VM images for the past 10 years: I use
chunkfs [1] to expose big VM images (or block devices) as a set of
fixed-size chunks, which BackupPC can archive. This way only the delta is
transferred and stored. With BackupPC v4 I use a 2MB chunk size.

To restore, I just reassemble the chunks (I stack backuppcfs and
unchunkfs, so there's no temp file anywhere). It's rock solid, efficient,
and fast. I've already restored several TB of VMs this way :-)

The same can be done for DB dumps.


[1] http://chunkfs.florz.de/
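The granularity effect is easy to demonstrate with plain coreutils. This sketch is not chunkfs itself (which exposes the chunks transparently as a FUSE mount); it just splits a dummy image into 2 MB chunks and shows that a one-byte change only alters one chunk's hash, which is exactly why the pool can dedup the rest:

```shell
# Create a deterministic 6 MB dummy "disk image".
yes x | head -c $((6 * 1024 * 1024)) > /tmp/img
mkdir -p /tmp/chunks1 /tmp/chunks2

# Split into 2 MB chunks, mimicking a chunkfs mount with a 2 MB chunk size.
split -b 2M /tmp/img /tmp/chunks1/chunk.
( cd /tmp/chunks1 && sha1sum chunk.* ) > /tmp/sums1

# Flip one byte in the middle of the image and re-chunk.
printf Z | dd of=/tmp/img bs=1 seek=3000000 conv=notrunc 2>/dev/null
split -b 2M /tmp/img /tmp/chunks2/chunk.
( cd /tmp/chunks2 && sha1sum chunk.* ) > /tmp/sums2

# Only the chunk holding the changed byte differs; the other two are
# byte-identical and would dedup in BackupPC's pool.
diff /tmp/sums1 /tmp/sums2 | grep -c '^<'   # prints 1
```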




Re: [BackupPC-users] BackupPC 4.1.5 released (plus new versions of backuppc-xs and rsync-bpc)

2018-01-13 Thread Daniel Berteaud



On 11/01/2018 at 18:50, Pete Geenhuizen wrote:


Daniel,

Thanks for the quick reply, and you are correct; however, when I check
the list of available RPMs,

here's a list of the first few RPMs that I get. As you can see, they
are noarch. I'm only interested in the x86_64 RPMs that you have
created.


Available Packages
BackupPC-server-scripts.noarch  0.1.2-1.el7.fws      fws
dehydrated.noarch               0.4.0-10.el7.fws     fws
dl.noarch                       0.18.1-1.el7.fws     fws
dl-cli.noarch                   0.18.1-1.el7.fws     fws
dokuwiki.noarch                 20170219e-1.el7.fws  fws
dokuwiki-plugins.noarch         20170219e-2.el7.fws  fws
fuse-backuppcfs.noarch          0.1-2.el7.fws        fws
fuse-chunkfs.x86_64             0.7-1.el7.fws        fws




You can see fuse-chunkfs, for example, which is x86_64. I'm not sure
what the problem is. As you ended the list with etc., I guess you have
other packages listed. Are you saying you don't have BackupPC4 in the list?
It's expected to see the noarch RPMs, as they are valid for any
architecture, including x86_64.


++



Re: [BackupPC-users] BackupPC 4.1.5 released (plus new versions of backuppc-xs and rsync-bpc)

2018-01-11 Thread Daniel Berteaud

On 11/01/2018 at 18:31, Pete Geenhuizen wrote:


Daniel,

When I list the available RPMs from your repo for CentOS 7 x86_64, all
I get is the listing for the noarch RPMs.




http://repo.firewall-services.com/centos/7/x86_64/ contains x86_64 RPMs

++




Re: [BackupPC-users] BackupPC 4.1.5 released (plus new versions of backuppc-xs and rsync-bpc)

2017-12-19 Thread Daniel Berteaud

On 18/12/2017 at 19:08, Pete Geenhuizen wrote:

OK I'll give it a shot with the V3 config.

As far as the configure.pl script goes, it should be included, since it
is mentioned in the documentation.


Now included in the doc dir.

Cheers,
Daniel



Re: [BackupPC-users] BackupPC 4.1.5 released (plus new versions of backuppc-xs and rsync-bpc)

2017-12-18 Thread Daniel Berteaud

On 18/12/2017 at 14:08, Pete Geenhuizen wrote:

Daniel,

I plan on upgrading from v3 to v4, and have installed your rpm on a
test host, so this is in essence a new install. According to the
documentation, to set up /etc/BackupPC/config.pl I need to run
configure.pl, which appears to be included in the tar file but isn't
in the rpm. Is there a way to include it, or is your rpm intended as a
means of upgrading v3 to v4? If that is the case, then I guess that
should be made clear.


The default config.pl should be fine for a fresh v4 install. But you're
right that the configure.pl script should perhaps be bundled, to handle
upgrades from v3.


++




Re: [BackupPC-users] BackupPC 4.1.5 released (plus new versions of backuppc-xs and rsync-bpc)

2017-12-18 Thread Daniel Berteaud

On 17/12/2017 at 14:35, Pete Geenhuizen wrote:

Daniel,

I just tried to install your rpm on CentOS 7, which failed with this
dependency error:

Error: Package: BackupPC4-4.1.5-1.x86_64

   Requires: par2cmd

I have par2cmdline installed, but it doesn't provide par2cmd, nor can I
find a par2cmd rpm.


Thanks, fixed in BackupPC4-4.1.5-2.fws, which will be in my repo in a few
minutes.



++





Re: [BackupPC-users] BackupPC 4.1.5 released (plus new versions of backuppc-xs and rsync-bpc)

2017-12-15 Thread Daniel Berteaud

On 14/12/2017 at 19:14, Kris Lou wrote:

Daniel,

Do you maintain that repo? How is it different from the COPR?



I do maintain that repo. I don't know exactly how it differs from the
COPR. I guess the main difference is that my package is named BackupPC4,
so it won't automatically upgrade a v3 install to v4; you have to install
it explicitly.


Cheers,
Daniel




Re: [BackupPC-users] BackupPC 4.1.5 released (plus new versions of backuppc-xs and rsync-bpc)

2017-12-14 Thread Daniel Berteaud



On 14/12/2017 at 08:03, Sorin Srbu wrote:


Hi,

Do we know of any repos for RHEL/CentOS that usually package the 
latest BPC versions like this one?




I do, for el6 and el7:

http://repo.firewall-services.com/centos/

Cheers
Daniel




Re: [BackupPC-users] Xfer errors for 31 files

2017-08-28 Thread Daniel Berteaud

On 27/08/2017 at 23:18, Craig Barratt via BackupPC-users wrote:

Daniel,


Thanks Craig for looking at this. To check whether the problem could be
reproduced, I renamed the faulty host dir and started with a fresh
backup (and so far, no errors have occurred). I still have the faulty
directory (renamed robert.xferErr); here is the info requested.




Can you check if there is an attrib file in that directory 
(/var/lib/BackupPC//pc/robert/242/f%2f/fusr/fbin)? (You might need to 
update the 242 to be the number of the latest filled backup.)  It 
should have a file name of the form "attrib_" where  is 32 hex 
digits.


Yes, there's 
/var/lib/BackupPC//pc/robert.xferErr/244/f%2f/fusr/fbin/attrib_5ecf35554e19361c98c012c616eb838c




Next, run this command to display the contents of the attrib file:

BackupPC_attribPrint /var/lib/BackupPC//pc/robert/242/f%2f/fusr/fbin/attrib_

Either send the whole output to me directly (not to the list), or 
excerpt just the perl entry (about a dozen lines starting with 'perl' 
=> {)


Here's the part regarding the perl binary:


  'perl' => {
'compress' => 3,
'digest' => '',
'gid' => 0,
'inode' => 2833243,
'mode' => 493,
'mtime' => 1490180558,
'name' => 'perl',
'nlinks' => 0,
'size' => 13304,
'type' => 0,
'uid' => 0
  },

And, as the digest is missing, I can't look it up in the cpool. The file
must be in the pool now anyway (it was present in previous backups, and
is certainly present in the new backups of the same host, since I started
new backups from scratch). So I guess the issue is just the missing
digest in the attrib file.


Anything else you'd need?

Cheers,
Daniel



[BackupPC-users] Xfer errors for 31 files

2017-08-21 Thread Daniel Berteaud

Hi.


Running BackupPC 4.1.3 (with rsync-bpc 3.0.9.8 and BackupPC::XS 0.56)
for a few months on my EL6 server. I'm backing up 53 machines
successfully, except for one of them, where each backup produces 31 Xfer
errors like this one:



G bpc_fileOpen: can't open file /var/lib/BackupPC//pc/robert/242/f%2f/fusr/fbin/fperl (from usr/bin/perl, 3, 0, 0)
G bpc_fileDescFree: fdNum = 3, tmpFd = -1, tmpFileName = NULL
G bpc_open(usr/bin/perl, 0x0, 00) -> -1
rsync_bpc: failed to open "/usr/bin/perl", continuing: No such file or directory (2)


Nothing special about these files on the client; they are readable. I
guess there's an issue in BackupPC's pool, but I don't know how to track
it down. At the beginning (after the migration from v3), I didn't have
these errors: v4 ran fine for a few weeks, then started to show them.


How can I track this down?


Cheers,

Daniel




Re: [BackupPC-users] I'd like to make a suggestion to the BPC devs

2017-07-20 Thread Daniel Berteaud

On 20/07/2017 at 15:54, B wrote:

On Thu, 20 Jul 2017 10:26:18 +0200
Daniel Berteaud <dan...@firewall-services.com> wrote:


I think this is out of BackupPC's scope

Please elaborate, don't leave me dry: why is that?
Why would adding this kind of fleeting daily snapshot of only touched files
be out of BPC's scope? On the other hand, I see this as the missing
complement to a professional, ubiquitous backup system.


Because what you want to achieve is not really backups, but some kind of
rotating snapshots. There are lots of different ways to do this (LVM,
LVM-thin, btrfs, ZFS, etc.), and it's very dependent on the system
hosting your data, which is not controlled by BackupPC. BackupPC is just
a backup tool. You could configure more frequent incrementals (every
hour), but the performance impact won't be the same as with snapshots.
This can be a solution depending on the amount of data you have to
manage, but it's already possible without any modification to BackupPC.
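For completeness, "more frequent incrementals" is plain scheduling config; something along these lines (the values are illustrative, and the performance caveat above still applies):

```perl
# Wake up every hour, allow an incremental roughly every hour,
# and keep weekly fulls. IncrPeriod/FullPeriod are in days.
$Conf{WakeupSchedule} = [0..23];
$Conf{IncrPeriod}     = 0.04;   # ~1 hour
$Conf{FullPeriod}     = 6.97;
```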


++




Re: [BackupPC-users] I'd like to make a suggestion to the BPC devs

2017-07-20 Thread Daniel Berteaud

On 20/07/2017 at 02:54, B wrote:

Hi BackupPCers,


My suggestion is to avoid using such things as FS snapshots during the
day to avoid work losses.

An addition to BPC could do the trick, preferably saving the result in
another directory than the main one, by checking which files have been
touched the present day and save them automatically; it may be triggered
from a crontab. And before each complete backup, BPC would empty this
daily directory for the next day.

This way, assuming an hourly crontab, clumsy users would be able to
recover their work very rapidly, with at most one hour's loss; and since
the hourly backups are confined to only touched files, they should be
quite transparent/invisible/lightweight for them.

How about that ?


I think this is out of BackupPC's scope

++




Re: [BackupPC-users] BackupPC v4 and rsync for localhost

2017-06-25 Thread Daniel Berteaud

On 16/06/2017 at 05:13, Craig Barratt via BackupPC-users wrote:



A very bad hack would be to use something like this:

$Conf{RsyncSshArgs} = ['-e', '/usr/bin/sudo -p'];



It's working great. It might not be the cleanest solution, but it's easy
and doesn't require deploying any additional script :-)


Thanks Craig.

++





Re: [BackupPC-users] Fuse filesystem to explore backups

2017-06-19 Thread Daniel Berteaud



On 18/06/2017 at 01:10, Craig Barratt via BackupPC-users wrote:

Daniel,

I updated backuppcfs.pl for 4.x. I've attached the gzip'ed new version.


Unfortunately fuse creates its own inode numbers, so hardlinks won't 
be correctly rendered via fuse (ie: the inode numbers of two 
hardlinked files won't be the same when viewed via fuse).




Craig, I can't thank you enough :-)
I'll try this as soon as the migration of my pool to v4 is finished.

I think this fuse FS is very useful; could it be provided in the next
BackupPC release?


++




Re: [BackupPC-users] BackupPC_migrateV3toV4 failing

2017-06-16 Thread Daniel Berteaud



On 16/06/2017 at 06:25, Craig Barratt via BackupPC-users wrote:

Daniel,

It looks like it is reading a V3 file, and the file size from the 
attributes is bigger than how much data it read from the file.  This 
is probably due to some sort of disk data corruption.


That'd be odd, as I have the same error for all of my hosts (and the
pool is only ~6 months old, running v3.3.2 until now, on ext4, with no
power outage or anything else).




At line 722 of /usr/share/BackupPC/lib/BackupPC/Lib.pm there is this code:

$md5->add(substr($$dataRef, $seekPosn, 131072));

You could try replacing that with this:

$md5->add(substr($$dataRef, $seekPosn, 131072)) if ( length($$dataRef) >= $seekPosn );

It's now running, but I'm not sure yet what is being done. I'll try to
report when it's finished. How can I check that everything went fine?


++



Re: [BackupPC-users] BackupPC v4 and rsync for localhost

2017-06-16 Thread Daniel Berteaud



On 16/06/2017 at 05:13, Craig Barratt via BackupPC-users wrote:
You can set $Conf{RsyncSshArgs} to a command that simply executes its 
arguments (like ssh does).  However, rsync adds the host name to 
whatever you specify in $Conf{RsyncSshArgs}, so unfortunately this 
doesn't work since the first argument is the host name, and env will 
give an error:


$Conf{RsyncSshArgs} = ['-e', '/usr/bin/env'];   # <- this doesn't work

A very bad hack would be to use something like this:

$Conf{RsyncSshArgs} = ['-e', '/usr/bin/sudo -p'];

(sudo has to be passwordless from the BackupPC user.)  The -p (prompt) 
option expects an argument to set the password prompt (which is not 
used).  So it consumes the host name argument appended by rsync, then 
runs the rest of the command like ssh does.  Bingo - the client rsync 
gets run locally, with elevated privileges from sudo, without ssh.
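The argument-consumption trick can be illustrated with a tiny stand-in for sudo (a purely hypothetical stub, just to show how the host name appended by rsync gets swallowed by the unused `-p` prompt):

```shell
# rsync will execute:  /usr/bin/sudo -p <hostname> <client-command...>
# so "-p <hostname>" is eaten as sudo's (unused) password prompt and the
# rest runs locally. A stub that consumes "-p <arg>" the same way:
cat > /tmp/fake_sudo <<'EOF'
#!/bin/sh
# consume "-p <prompt>" like sudo does, then exec the remaining command
if [ "$1" = "-p" ]; then shift 2; fi
exec "$@"
EOF
chmod +x /tmp/fake_sudo

# "localhost" here plays the host name rsync appends:
/tmp/fake_sudo -p localhost echo "client command runs locally"
```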




Thanks Craig, I'll try this ASAP :-)

++

--


    *Daniel Berteaud*

FIREWALL-SERVICES SAS.
Société de Services en Logiciels Libres
Tel : 05 56 64 15 32 
Visio : http://vroom.fws.fr/dani
/www.firewall-services.com/



[BackupPC-users] BackupPC_migrateV3toV4 failing

2017-06-15 Thread Daniel Berteaud
Just upgraded to BackupPC 4.1.3 on my CentOS 6 server. I'd like to 
migrate the whole pool from v3 to v4. Simulation (with -m flag) works as 
expected:



[root@gerard ~]# sudo -u backuppc 
/usr/share/BackupPC/bin/BackupPC_migrateV3toV4 -h xbmc -m
BackupPC_migrateV3toV4: migrating host xbmc backup #0 to V4 (approx 
67839 files)
BackupPC_migrateV3toV4: migrating host xbmc backup #30 to V4 (approx 
69166 files)
BackupPC_migrateV3toV4: migrating host xbmc backup #60 to V4 (approx 
69884 files)
BackupPC_migrateV3toV4: migrating host xbmc backup #90 to V4 (approx 
70058 files)
BackupPC_migrateV3toV4: migrating host xbmc backup #109 to V4 (approx 
70307 files)
BackupPC_migrateV3toV4: migrating host xbmc backup #139 to V4 (approx 
71128 files)


[...]

But without the -m flag, it fails immediately:


[root@gerard ~]# sudo -u backuppc 
/usr/share/BackupPC/bin/BackupPC_migrateV3toV4 -h xbmc
BackupPC_migrateV3toV4: migrating host xbmc backup #0 to V4 (approx 
67839 files)
BackupPC_migrateV3toV4: removing temp target directory 
/var/lib/BackupPC//pc/xbmc/0.v4
substr outside of string at /usr/share/BackupPC/lib/BackupPC/Lib.pm line 
722.


Where could the error come from?


++

--


*Daniel Berteaud*

FIREWALL-SERVICES SAS.
Société de Services en Logiciels Libres
Tel : 05 56 64 15 32 
Visio : http://vroom.fws.fr/dani
/www.firewall-services.com/



Re: [BackupPC-users] BackupPC v4 and rsync for localhost

2017-06-15 Thread Daniel Berteaud

Le 15/06/2017 à 15:53, Michael Stowe a écrit :


I looked on my own setup to answer this question, since I used a 
similar method under 3.x and have been backing up the local systems 
under 4.x since the alpha versions.


Turns out I just use a pretty vanilla rsync/ssh setup, and set up ssh 
keys so the box can log into itself without issues.




That's what I'll do if no other solution is available, but it's a shame 
to add the ssh overhead when it was so easy to use plain rsync in v3.


++
--


*Daniel Berteaud*

FIREWALL-SERVICES SAS.
Société de Services en Logiciels Libres
Tel : 05 56 64 15 32 
Visio : http://vroom.fws.fr/dani
/www.firewall-services.com/



[BackupPC-users] Fuse filesystem to explore backups

2017-06-15 Thread Daniel Berteaud
Until BackupPC v3, I was using a script to mount BackupPC's data as a 
standard FS using FUSE (backuppcfs.pl from Pieter Wuille, based on the 
backuppc-fuse script by Stephen Day).


This was very handy (I have a specific use case for low-level VM backups 
which requires this). But I guess it won't work anymore with the new v4 
storage model. In any case, it required BackupPC::Attrib, which seems to 
have disappeared in v4.


Is there an alternative or something similar in v4? A FUSE FS like this 
would be great, even better if it were bundled with the BackupPC v4 sources.



Cheers

Daniel

--


*Daniel Berteaud*

FIREWALL-SERVICES SAS.
Société de Services en Logiciels Libres
Tel : 05 56 64 15 32 
Visio : http://vroom.fws.fr/dani
/www.firewall-services.com/



[BackupPC-users] BackupPC v4 and rsync for localhost

2017-06-15 Thread Daniel Berteaud

Hi there.

Using BackupPC since v2, I used to be able to back up the host itself 
(the one running BackupPC) with rsync by simply modifying 
$Conf{RsyncClientCmd} (to something like '/usr/bin/sudo $rsyncPath 
$argList', and the same for $Conf{RsyncClientRestoreCmd}). This worked 
with BackupPC v3 too.


How can the same be done with BackupPC v4 now that RsyncClientCmd isn't 
used anymore? Setting RsyncSshArgs to undef doesn't work, as it'll try 
to run with ssh (but without a full path), e.g.



[root@gerard ~]# sudo -u backuppc /usr/share/BackupPC/bin/BackupPC_dump 
-f -vv gerard
Backup type: type = full, needs_full = , needs_incr = 1, lastFullTime = 
1495674113, opts{f} = 1, opts{i} = , opts{F} =

cmdSystemOrEval: about to system /bin/ping -c 1 -w 3 localhost
cmdSystemOrEval: about to system /bin/ping -c 1 -w 3 localhost
CheckHostAlive: ran '/bin/ping -c 1 -w 3 localhost'; returning 0.013
XferLOG file /var/lib/BackupPC//pc/gerard/XferLOG.190.z created 
2017-06-15 10:22:29
Backup prep: type = full, case = 6, inPlace = 1, doDuplicate = 0, 
newBkupNum = 190, newBkupIdx = 35, lastBkupNum = , lastBkupIdx = 
(FillCycle = 0, noFillCnt = 20)

__bpc_progress_state__ backup /
Running: /usr/bin/rsync_bpc --bpc-top-dir /var/lib/BackupPC/ 
--bpc-host-name gerard --bpc-share-name / --bpc-bkup-num 190 
--bpc-bkup-comp 3 --bpc-bkup-prevnum -1 --bpc-bkup-prevcomp -1 
--bpc-bkup-inode0 1 --bpc-
attrib-new --bpc-log-level 1 --rsync-path=/usr/bin/rsync --super 
--recursive --protect-args --numeric-ids --perms --owner --group -D 
--times --links --hard-links --delete --delete-excluded --partial 
--log-format
=log:\ %o\ %i\ %B\ %8U,%8G\ %9l\ %f%L --stats --checksum --timeout=72000 
--exclude=/dev --exclude=/proc --exclude=/sys --exclude=\*lost+found 
--exclude=/mnt/\* --exclude=/media/\* --exclude=/tmp --exclude=/var/t
mp --exclude=/var/lib/libvirt/backup --exclude=/var/lib/BackupPC 
--exclude=/var/lib/libvirt/images --exclude=/var/lib/libvirt/iso 
--exclude=/var/lib/libvirt/qemu --exclude=/var/run --exclude=/selinux 
--exclude=/

var/lib/yum/yumdb/ --exclude=/cgroup/ --exclude=/var/lock/lvm/ localhost:/ /
full backup started for directory /
started full dump, share=/
Xfer PIDs are now 16424
xferPids 16424
This is the rsync child about to exec /usr/bin/rsync_bpc
cmdExecOrEval: about to exec /usr/bin/rsync_bpc --bpc-top-dir 
/var/lib/BackupPC/ --bpc-host-name gerard --bpc-share-name / 
--bpc-bkup-num 190 --bpc-bkup-comp 3 --bpc-bkup-prevnum -1 
--bpc-bkup-prevcomp -1 --bpc-bkup-inode0 1 --bpc-attrib-new 
--bpc-log-level 1 --rsync-path=/usr/bin/rsync --super --recursive 
--protect-args --numeric-ids --perms --owner --group -D --times --links 
--hard-links --delete --delete-excluded --partial --log-format=log:\ %o\ 
%i\ %B\ %8U,%8G\ %9l\ %f%L --stats --checksum --timeout=72000 
--exclude=/dev --exclude=/proc --exclude=/sys --exclude=\*lost+found 
--exclude=/mnt/\* --exclude=/media/\* --exclude=/tmp --exclude=/var/tmp 
--exclude=/var/lib/libvirt/backup --exclude=/var/lib/BackupPC 
--exclude=/var/lib/libvirt/images --exclude=/var/lib/libvirt/iso 
--exclude=/var/lib/libvirt/qemu --exclude=/var/run --exclude=/selinux 
--exclude=/var/lib/yum/yumdb/ --exclude=/cgroup/ 
--exclude=/var/lock/lvm/ localhost:/ /

rsync_bpc: Failed to exec ssh: No such file or directory (2)
Done: 0 errors, 0 filesExist, 0 sizeExist, 0 sizeExistComp, 0 
filesTotal, 0 sizeTotal, 0 filesNew, 0 sizeNew, 0 sizeNewComp, 1 inode

rsync error: error in IPC code (code 14) at pipe.c(84) [Receiver=3.0.9.8]


Running:

- CentOS 6

- BackupPC v4.1.3

- rsync-bpc 3.0.9.8

- BackupPC::XS 0.56

--


*Daniel Berteaud*

FIREWALL-SERVICES SAS.
Société de Services en Logiciels Libres
Tel : 05 56 64 15 32 
Visio : http://vroom.fws.fr/dani
/www.firewall-services.com/



Re: [BackupPC-users] BackupPC v4 for Fedora / EPEL Update

2017-05-09 Thread Daniel Berteaud



Le 29/04/2017 à 01:08, Richard Shaw a écrit :

Been available for a while on my COPR:

https://copr.fedorainfracloud.org/coprs/hobbes1069/BackupPC/

Must have missed the announcement.

Requires EXTRAS+EPEL repositories.



Hi, and thanks for your packages.
Unfortunately, it cannot be installed on EL6 as there's no par2cmdline 
for this version. As this is an optional feature, could you relax it? 
(or wrap the Requires in a conditional for > el6)
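A conditional along these lines would keep the hard dependency only where par2cmdline exists (a hypothetical sketch of the spec change, not the actual packaging):

```
# hypothetical spec-file conditional: skip par2cmdline on EL6 only
%if 0%{?rhel} != 6
Requires: par2cmdline
%endif
```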


Cheers,
Daniel

--


    *Daniel Berteaud*

FIREWALL-SERVICES SAS.
Société de Services en Logiciels Libres
Tel : 05 56 64 15 32 
Visio : http://vroom.im/dani
/www.firewall-services.com/



Re: [BackupPC-users] Backup of virtual machines

2016-07-04 Thread Daniel Berteaud


Le 04/07/2016 à 14:01, Smith, Graham - Computing Technical Officer a écrit :
>
> I don't use xenserver but my generic advice as a strategy would be to
> backup the contents of the VMs rather than the raw virtual disks.
>
> Effectively treating the guests as if they were physical systems. The
> raw virtual disk files will be large but also non-unique so will cost
> a lot more
>
> in storage space in your backups and potentially take a lot more time
> to backup. Backing up the files contained within the VMs should save
> space
>
> with BackupPC in taking full advantage of single instance storage of
> repeated files in your pool common to many users or host systems.
>

You can have both: save the full image while storing only the delta
between two backups.

Here's what I've written:
http://gitweb.firewall-services.com/?p=virt-backup;a=blob_plain;f=virt-backup;hb=HEAD

I use it only on KVM, but it should work with anything libvirt supports. 
I use it in combination with BackupPC (as pre-dump and post-dump scripts).

The basic idea is:

- Suspend the VM
- take a snapshot of the underlying storage (using LVM)
- Resume the VM (so it has only been suspended for a very short time,
usually not even noticeable)
- Using chunkfs, mount the image (file or raw block device) as a set of 
fixed-size chunks: there's no copy, the fuse mount point is virtual, and 
reading from it will in fact read from the snapshot, except that it'll 
be presented as a set of small chunks
- Using BackupPC, backup all those chunks
- When done, unmount the fuse file system and remove the snapshot

Now, the next backup will only store chunks which have changed, which is
usually not a lot
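The space saving comes from the fixed chunk size: between two backups, only the chunks whose bytes changed need new pool entries. A quick shell illustration (1 MiB chunks here, purely for the demo):

```shell
# Build a 4 MiB "disk image", split it into fixed 1 MiB chunks, flip one
# byte, and count how many chunks differ -- BackupPC would only have to
# store that single changed chunk again.
set -e
cd "$(mktemp -d)"
CHUNK=1048576                       # 1 MiB for the demo (2 MB with v4 in practice)
head -c $((4 * CHUNK)) /dev/zero > image.raw
split -b "$CHUNK" image.raw before_
# flip one byte inside the third chunk
printf 'X' | dd of=image.raw bs=1 seek=$((2 * CHUNK + 5)) conv=notrunc 2>/dev/null
split -b "$CHUNK" image.raw after_
changed=0
for i in aa ab ac ad; do
    if ! cmp -s "before_$i" "after_$i"; then changed=$((changed + 1)); fi
done
echo "chunks changed: $changed of 4"    # -> chunks changed: 1 of 4
```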

++

-- 


*Daniel Berteaud*

FIREWALL-SERVICES SAS.
Société de Services en Logiciels Libres
Tel : 05 56 64 15 32
Visio : http://vroom.im/dani
/www.firewall-services.com/



Re: [BackupPC-users] backup for all of the hosts at certain pointof time

2011-02-07 Thread Daniel Berteaud
You need to increase the MaxBackups value if you really want to back up
all 20 at once. But raising it to 20 will probably slow the backups down
overall (slower than backing them up 2 or 4 at a time).
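In config.pl terms (the value is illustrative, not a recommendation):

```perl
# Let all hosts start at once -- at the cost of heavy disk/CPU
# contention; something like 4-8 concurrent backups is usually a
# better trade-off than 20:
$Conf{MaxBackups} = 20;
```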

Regards, Daniel

Le lundi 07 février 2011 à 16:26 +0300, Levkovich Andrew a écrit :
 I'll try to explain:
 I have a schedule: every day at 5pm BPC runs its queue, but at 5pm it 
 only starts backups for 2 or 4 machines, and I need to start backups for all 20 hosts
 
 
 Mon, 7 Feb 2011 13:48:31 +0100, message from Sorin Srbu 
 sorin.s...@orgfarm.uu.se:
 
  -Original Message-
  From: Левкович Андрей [mailto:volan...@inbox.ru]
  Sent: Monday, February 07, 2011 1:13 PM
  To: backuppc-users
  Subject: [BackupPC-users] backup for all of the hosts at certain point
  of time
  
  is it possible to do so at a time run a backup for all of the hosts
  
  I don't follow, please clarify!
  
  Otherwise, in BPC you can specify a time window, within which time all hosts
  will be queued to be backed up. Is this what you mean?
  
  -- 
  /Sorin
  
 
 

-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: dan...@firewall-services.com
Web : http://www.firewall-services.com




Re: [BackupPC-users] Improving backuppc performance

2010-04-16 Thread Daniel Berteaud
Le vendredi 16 avril 2010 à 16:00 +1000, Adam Goryachev a écrit :
 -BEGIN PGP SIGNED MESSAGE-
 Hash: SHA1
 
 Just had a thought ... dangerous I know :)
 
 Would it improve backuppc performance to use RAID1 with 3 drives?
 Obviously writes are not improved, but it would seem the majority of
 work done on the FS are random reads...
 
 ie, new files, and unlinks for old expiring files/backups are the only
 writes (ok, plus logging). Other than that, there is a lot of reading
 both for every backup as well as for the nightly prune process.
 
 Currently, I'm using linux software RAID1 from md, with 2 x 1TB SATAII
 HDD's, but thinking that adding a third HDD might improve performance
 (albeit not increasing capacity).

I'm not sure adding a 3rd drive will really improve performance. If
you really want better performance, add 2 drives and go with a RAID10
setup. This will *really* improve both reads and writes (but especially
reads).

Regards.

 
 Thanks,
 Adam
 
 - --
 Adam Goryachev
 Website Managers
 www.websitemanagers.com.au
 -BEGIN PGP SIGNATURE-
 Version: GnuPG v1.4.9 (GNU/Linux)
 Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org
 
 iEYEARECAAYFAkvH/RQACgkQGyoxogrTyiUbuQCgwfRn8f6Z/cbO1DDKoLJqzqGA
 /C0AoIZ/7c+9EWPIZBZrxb860Mtr+vsH
 =s2IR
 -END PGP SIGNATURE-
 

-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: dan...@firewall-services.com
Web : http://www.firewall-services.com




Re: [BackupPC-users] BackupPC_nightly takes too much time

2010-04-08 Thread Daniel Berteaud
Le jeudi 08 avril 2010 à 15:36 +0200, Norbert Schulze a écrit :
 Hello Gerald,
 
 
  Is this running in a VM? I often see high CPU wait times when doing disk
  I/O in a VM. 
 
 No, it is a normal system with a hardware raid-controller.

Do you have a BBU on the RAID controller?

Hardware RAID without a BBU (and thus without write-back cache
enabled) is usually *very* slow, much slower than a plain disk setup or
software RAID.

Regards, Daniel

 
 
 Regards
 Norbert
 

-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: dan...@firewall-services.com
Web : http://www.firewall-services.com




Re: [BackupPC-users] I broke my cgi interface! :(

2010-03-25 Thread Daniel Berteaud
Hi. I'm the maintainer of BackupPC's contrib on SME Server.

look at /var/log/httpd/error_log, which should give you some hint about
the error.

Your data is in /opt/backuppc/, which means you're probably using a very
old version of the contrib. To upgrade it to a more recent version, you
can read this page:

http://wiki.contribs.org/BackupPC#Upgrade_from_smeserver-backuppc.fws-3.0-1

Regards, Daniel

Le jeudi 25 mars 2010 à 18:44 +0200, Jaco Meintjes a écrit :
 I'm running Backuppc on SME Server (someone else set it up).  So I
 ssh'd into the box to delete some entries in the hosts file and now I
 get this error message when I click on the cgi interface link:
 
 Software error:
 
 Illegal division by zero at /opt/backuppc/files/conf/config.pl line
 64.
 
 For help, please send mail to the webmaster (), giving
 this error message and the time and date of the error.
 Content-Type: text/html; charset=ISO-8859-1
 Error: Unable to read config.pl or language strings!!
 
 Please help me to fix it?
 
 Jaco

-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: dan...@firewall-services.com
Web : http://www.firewall-services.com




Re: [BackupPC-users] Cant find how to set what is backed up!

2009-10-21 Thread Daniel Berteaud
Le mardi 20 octobre 2009 à 19:23 -0400, giorgio p a écrit :
 I'm trying to get backuppc configured.
 I thought I had done the required setup... 
 
 In the /etc/backuppc/config.pl file I have:
 $Conf{XferMethod} = 'rsync';
 $Conf{RsyncShareName} = ['/home/storage','/home/george'];
 
 In the /etc/backuppc/hosts file I have:
 localhost   0   backuppc
 
 However when the backup runs it appears to just backup the /etc directory 
 which isn't even specified.
 
 I am a bit bemused. Have I missed something fundamental?
 
 It is running on a bubba miniserver with Debian if that makes any 
 difference.

Looks like a permission problem: the backuppc user cannot access other
users' home directories. You should use sudo:

Give backuppc permission to run rsync as root without a password, then
change this config:

$Conf{RsyncClientCmd} = '/usr/bin/sudo $rsyncPath $argList';
$Conf{RsyncClientRestoreCmd} = '/usr/bin/sudo $rsyncPath $argList';
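The passwordless part is a sudoers entry along these lines (a hypothetical sketch; the path to rsync may differ on your system):

```
# /etc/sudoers.d/backuppc -- let the backuppc user run rsync as root
# without a password
backuppc ALL=(root) NOPASSWD: /usr/bin/rsync
```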


Regards, Daniel

 
 Thanks
 George
 
 +--
 |This was sent by george.pever...@gmail.com via Backup Central.
 |Forward SPAM to ab...@backupcentral.com.
 +--
 
 
 
-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: dan...@firewall-services.com
Web : http://www.firewall-services.com




Re: [BackupPC-users] [virt-tools-list] pre/post backup script for KVM guests

2009-10-21 Thread Daniel Berteaud
Le mercredi 21 octobre 2009 à 11:56 +0300, Nikolai K. Bochev a écrit :
 Hello Daniel.

Hi

 
 I have one question - does BackupPC back up sparse files correctly?

Well, I'm not sure, I never tried. But before BackupPC comes into play
(BackupPC is quite independent from this script, even if I wrote it with
BackupPC in mind), dd needs to handle sparse files (I don't know if it does):
- the script uses dd to dump the block devices or files representing the
virtual disks (and optionally compresses them)
- then BackupPC (or whatever backup tool you want to use) backs up
these dumps

But usually, even if sparse files are not supported and are treated like
normal files, compressing on the fly will get you about the same
size. I use pbzip2, which gives quite a high compression ratio, can
use all the cores of my host's CPU, and doesn't slow down the process
too much (Xeon 5520). If pbzip2 is too slow, you can try lzop, which
should be just as fast as your disks.


 I am working on a python script to automate the whole process (something 
 along the lines of what your script does), i.e.:
 
 1. Taking an LVM snapshot
 2. mounting the snapshot
 3. backing up the KVM images (selectively - storebackup supports conditions, 
 i.e. I wouldn't want to backup the ISO files as I did above)
 4. unmounting and destroying the LVM snapshot.

My script can use LVM snapshots only if the virtual disks are directly LVs.
It doesn't use snapshots if you have a file system on an LV where you
store image files (.raw, .qcow2, etc.).

Regards.

 
 I could scrap the above for your script, but the disk usage of backups is 
 critical for me.
 
 - Original Message -
 From: Daniel Berteaud dan...@firewall-services.com
 To: virt-tools-l...@redhat.com, backuppc-users@lists.sourceforge.net
 Cc: t...@firewall-services.com
 Sent: Monday, October 19, 2009 4:02:58 PM
 Subject: [virt-tools-list] pre/post backup script for KVM guests
 
 Hi everyone.
 
 I've written a script to backup virtual machines managed by libvirt
 (only tested with KVM guests, but it should work for Xen too, maybe others
 as well).
 
 It's called virt-backup.pl
 
 I've written it for integration with BackupPC:
 - take a dump of a VM using the pre-backup facility
 - backup the dumps using BackupPC
 - cleanup the dumps in the post-backup phase
 
 This script can be used outside of BackupPC as it's quite generic.
 
 There are two main modes for this script:
 --pre: take the backup of the VM. This is called --pre because I use it
 as a pre-backup script for BackupPC
 --post: cleanup the dumps
 
 Here are some of its features:
 
 - no configuration file needed, everything can be passed as command line
 arg
 - can take snapshots of virtual disks if they are on an LVM logical volume
 (the default is to try LVM in any case; if it's not possible, just
 dump the block device/file)
 - Supports backup of running VMs with minimal downtime (if each virtual
 disk can be snapshotted, the VM is resumed immediately, resulting in
 just a few seconds of downtime, and the snapshots are dumped afterwards.
 If snapshots are not available, the guest is suspended during the dump)
 - can save the state of a running VM (equivalent of virsh save/virsh
 restore). This is optional because it's still not very reliable, and
 sometimes the restoration fails, leaving a crashed qemu process running
 and eating CPU cycles.
 - Can compress on-the-fly the virtual disks dumps
 (gzip,bzip2,pbzip2,lzop,xz)
 - Support virtual disks exclusions (if you want to backup the system
 disk of a VM, but not the data one for example)
 - Can work on installations where virtual disks are stored on one host,
 and guests run on another one (NFS, iSCSI, etc.; in any case, the script
 must be run on the host which holds the virtual disks)
 - Can backup as many guests as you want in one run (they'll be dumped
 sequentially)
 - Backups are run with low priority (nice and ionice), so they shouldn't
 slow down your system too much.
 
 Here are the dependencies for this script to work:
 - Sys::Virt perl module
 - XML::Simple perl module
 - Getopt::Long perl module
 - lvm2
 - gzip (optional)
 - bzip2 (optional)
 - pbzip2 (optional)
 - lzop (optional)
 - xz (optional)
 
 You can run this script without arguments to see the help. Read the script
 if you want more information (there are some examples of how to use it at
 the beginning of the script).
 
 
 The script can be found here:
 
 http://repo.firewall-services.com/misc/virt-backup.pl
 
 Regards, Daniel
 
-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: dan...@firewall-services.com
Web : http://www.firewall-services.com



Re: [BackupPC-users] fulls vs incrementals (was: how to have 1 full backup + incrementals forever?)

2009-10-20 Thread Daniel Berteaud
Le mardi 20 octobre 2009 à 11:57 -0400, Ambrose LI a écrit :
 2009/10/20 Michael Stowe mst...@chicago.us.mensa.org:
  If you're going to keep all those incrementals, I wouldn't recommend doing
  full backups any less frequently than every two weeks or so, which is
  often the point at which they become as slow or slower than full backups.
 
 I'm not sure if I can agree with this, since this is definitely not my
 experience. Even after a month, there are pc's on my network where
 incrementals (rsync-based) still run significantly faster than fulls.

Yes, in my case too, I can see incrementals running faster than fulls
after a month, and the full usually doesn't eat much more bandwidth (a bit
more, but not that much), just more time. The benefit is that all your
files get checked (block checksums), so if one is corrupted in the
pool it can be updated.

I personally run one full per month and one incremental per day, with
7 incremental levels ($Conf{IncrLevels} = [1, 2, 3, 4, 5, 6, 7])
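That schedule corresponds roughly to these config.pl settings (a sketch; the parameter is spelled $Conf{IncrLevels} in BackupPC 3.x, and the period values here are illustrative):

```perl
$Conf{FullPeriod} = 29.5;                   # one full per month
$Conf{IncrPeriod} = 0.97;                   # one incremental per day
$Conf{IncrLevels} = [1, 2, 3, 4, 5, 6, 7];  # 7 incremental levels
```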


Regards, Daniel


 
 
-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: dan...@firewall-services.com
Web : http://www.firewall-services.com




[BackupPC-users] pre/post backup script for KVM guests

2009-10-19 Thread Daniel Berteaud
Hi everyone.

I've written a script to backup virtual machines managed by libvirt
(only tested with KVM guests, but it should work for Xen too, maybe others
as well).

It's called virt-backup.pl

I've written it for integration with BackupPC:
- take a dump of a VM using the pre-backup facility
- back up the dumps using BackupPC
- clean up the dumps in the post-backup phase

This script can be used outside of BackupPC as it's quite generic.

There are two main modes for this script:
--pre: take the backup of the VM. This is called --pre because I use it
as a pre-backup script for BackupPC
--post: clean up the dumps

Here are some of its features:

- no configuration file needed; everything can be passed as command line
arguments
- can take snapshots of virtual disks if they are on an LVM logical
volume (the default is to try LVM in any case; if that's not possible,
it just dumps the block device/file)
- supports backing up a running VM with minimal downtime (if every
virtual disk can be snapshotted, the VM is resumed immediately,
resulting in just a few seconds of downtime, and the snapshots are then
dumped; if snapshots are not available, the guest stays suspended during
the dump)
- can save the state of a running VM (the equivalent of virsh save/virsh
restore). This is optional because it's still not very reliable, and
sometimes the restore fails, leaving a crashed qemu process running
and eating CPU cycles.
- can compress the virtual disk dumps on the fly
(gzip, bzip2, pbzip2, lzop, xz)
- supports virtual disk exclusions (if you want to back up the system
disk of a VM but not the data one, for example)
- can work on installations where the virtual disks are stored on one
host and the guests run on another (NFS, iSCSI, etc.; in any case, the
script must be run on the host which holds the virtual disks)
- can back up as many guests as you want in one run (they'll be dumped
sequentially)
- backups run at low priority (nice and ionice), so they shouldn't
slow your system down too much.

Here are the dependencies for this script to work:
- Sys::Virt perl module
- XML::Simple perl module
- Getopt::Long perl module
- lvm2
- gzip (optional)
- bzip2 (optional)
- pbzip2 (optional)
- lzop (optional)
- xz (optional)

You can run this script without arguments to see the help. Edit it if
you want more information (there are some examples of how to use it at
the beginning of the script).


The script can be found here:

http://repo.firewall-services.com/misc/virt-backup.pl
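For the BackupPC integration described above, the natural hook points are $Conf{DumpPreUserCmd} and $Conf{DumpPostUserCmd}. A hedged sketch: the --pre/--post flags come from this announcement, but the --vm option name and the install path are assumptions, so check the script's built-in help for the real arguments:

```perl
# Per-host config sketch: dump the guest before the backup, clean up after.
# $host is expanded by BackupPC; the flags besides --pre/--post are
# hypothetical placeholders.
$Conf{DumpPreUserCmd}  = '/usr/local/bin/virt-backup.pl --pre --vm=$host';
$Conf{DumpPostUserCmd} = '/usr/local/bin/virt-backup.pl --post --vm=$host';
```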

Regards, Daniel

-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: dan...@firewall-services.com
Web : http://www.firewall-services.com




Re: [BackupPC-users] pre/post backup script for KVM guests

2009-10-19 Thread Daniel Berteaud
On Monday 19 October 2009 at 14:36 +0100, Tyler J. Wagner wrote:
 Very handy, Daniel, thank you.
 
 Why do it this way?  That is, why not back up the VMs like any other server?

I do both: back up the VM at the filesystem level, like any other real
server, but also at the block device level, because it provides a very
easy/fast disaster recovery mechanism. If the --state flag is present, I
can restore the guest in the exact state it was in when the backup was
made (including running programs, open files, etc.). Without the --state
flag, I can restore it just as if the guest had crashed (power failure).

Regards

 
 Regards,
 Tyler
 
 
-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: dan...@firewall-services.com
Web : http://www.firewall-services.com




Re: [BackupPC-users] installer backup Pc

2009-10-06 Thread Daniel Berteaud
On Tuesday 6 October 2009 at 08:59 -0400, amdjehdi wrote:
 Thanks.
 Apparently I've done my research, but I can't manage to deploy the
 solution on Windows Server as a BackupPC server.


Hi.

A French speaker on this list isn't common :) but you're not alone.

To keep it simple: if you really want to run the BackupPC server on a
Windows machine, the easiest option would be to run GNU/Linux as a
virtual instance on the Windows host (VirtualBox, VMware, VirtualPC...
I'm no Windows expert, but there must be solutions that run reasonably
well) and install BackupPC inside it.

As for the system you put inside, that's up to you. I'd recommend
CentOS, it's very solid; some will prefer Debian. In short, it hardly
matters, as long as it's *NIX.

Obviously you'll lose a bit of performance, but that depends on your
needs.

Cheers

 
 
 
 


-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: dan...@firewall-services.com
Web : http://www.firewall-services.com




Re: [BackupPC-users] pbzip2 support?

2009-10-05 Thread Daniel Berteaud
On Saturday 3 October 2009 at 14:57 +0200, m...@knoway.info wrote:
 Hi,
 
 is currently possible to use pbzip2 instead of bzip2 for pool/file
 compression?
 
 It could be better with archive-hosts and/or for pool management?
 Someone has already tried it?

I haven't tried it with BackupPC yet, but I have tried pbzip2 with some
big files, and indeed, if you have a multi-core CPU, it can speed
things up.

For pool files, I don't think it can be used, as they are compressed
using the Compress::Zlib perl library.

But you can always use it for archive hosts, with no modification
needed in BackupPC:

/usr/share/bin/BackupPC_tarCreate -t -h host -n -1 -s '*' . | pbzip2 -c \
 > /mnt/extbackup/host.tar.bz2

I just saw there's also an equivalent for gzip: pigz
(http://www.zlib.net/pigz/). I haven't tried it yet.

Regards, Daniel


 
 Thanks
 
 


-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: dan...@firewall-services.com
Web : http://www.firewall-services.com




Re: [BackupPC-users] Using rsync for blockdevice-level synchronisation of BackupPC pools

2009-09-02 Thread Daniel Berteaud
-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: dan...@firewall-services.com
Web : http://www.firewall-services.com




Re: [BackupPC-users] Feature request: preserve (from deletion) individual backups by number

2009-08-28 Thread Daniel Berteaud
On Friday 28 August 2009 at 11:47 -0400, Jeffrey J. Kosowsky wrote:
 In general, using FullKeepCnt and IncrKeepCnt (and associated
 variables) works well to prune older backups.
 
 But sometimes there is a *specific* older backup that you want to hang
 onto because it has some crucial data (or is a 'better' snapshot). It
 would be great if you could tell BackupPC to keep an arbitrary list of
 numbered backups for each different host. (If any of the listed backups
 are incrementals, then BackupPC would of course be smart enough to
 save the relevant precedent incrementals and full backups).
 
 For example I could imagine, a perl hash of arrays of the following
 form:
 
 $Conf{PreserveBackups} = {
    hostA => [ '23', '354', '798' ],
    hostB => [ '3', '25', '37', '101' ],
    hostC => [ '9', '11', '33', '434' ],
   };
 
 Does this make sense?

I've already posted an idea like this on the list, but got no response.
Anyway, yes (for me at least), it would be quite a useful option.

Cheers, Daniel

 
-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: dan...@firewall-services.com
Web : http://www.firewall-services.com




Re: [BackupPC-users] Backup of backuppc User Directory

2009-08-04 Thread Daniel Berteaud
On Tuesday 4 August 2009 at 14:38 +0200, Christian Völker wrote:
 
 Yohoo!
 
 I just encountered an issue with the BackupPC itself.
 
 I'm backing up loads of Linux servers with rsync. So in the home
 directory of the backuppc user there is obviously a .ssh directory which
 contains the ssh-keys and the host keys (known_hosts).
 
 For whatever reason the known_hosts got corrupted and backuppc was not
 able to backup my Linux servers any more. I'm not complaining about
 this, it makes sense.
 I had to logon as backuppc and manually connect to all servers so the
 host keys are stored again in the new known_hosts file, because
 /var/lib/BackupPC is excluded from backup.
 
 What I'm wondering is which include/exclude rules I should apply so the
 user data itself is backed up, but not the backup repository.
 
 Anyone an idea?

I use something like this:

$Conf{RsyncShareName} = [
  '/var/lib/BackupPC'
];

$Conf{BackupFilesOnly} = {
  '/var/lib/BackupPC' => [
'/etc',
'/log',
'/.ssh'
  ]
};

(etc and log are usually not stored here, but I've symlinked those dirs
to /etc/BackupPC and /var/log/BackupPC so everything is in one place).

Cheers, Daniel

 
 Thanks!
 
 Christian
 
-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: dan...@firewall-services.com
Web : http://www.firewall-services.com




Re: [BackupPC-users] BackupPC Community Edition 3.2.0_beta0-1 RPMs now available from Zmanda

2009-05-11 Thread Daniel Berteaud
On Thursday 7 May 2009 at 16:26 -0500, Les Mikesell wrote:
 Paul Mantz wrote:
  Hello Everyone,
  
  BackupPC Community Edition 3.2.0beta0-1 is now available for Red Hat
  Enterprise Linux 4 & 5, and Suse Enterprise Linux 9 & 10.  To
  download, please visit http://www.zmanda.com/download-backuppc.php
  
  BackupPC Community Edition automates the web configuration process, in
  addition to including all the improvements found in BackupPC v3.2.0
  (for more details, see Craig's announcement email.).  A full list of
  features can be found at
  http://backuppc.sourceforge.net/info.html#features
  
  Please feel free to contact us regarding questions, support for new
  platforms, or any other concerns :)
 
 Thanks for providing a packaged version.  I realize it doesn't match 
 RedHat's typical layout, but would it be possible to make all of the 
 user-modified parts 'self-contained' under a single directory or mount 
 point?  I like to periodically mirror the archive and rotate copies 
 offsite, and it would be nice if this disk contained the up to date 
 configuration as well as the data so it could be mounted in a different 
 machine and be immediately usable.

Instead of modifying the package, you can simply create some symlinks
so the logs and configuration are stored on the data disk. Before
installing the rpm (and after having mounted the dedicated partition on
/var/lib/BackupPC), just do:

mkdir -p /var/lib/BackupPC/{etc,log}
ln -s /var/lib/BackupPC/etc /etc/BackupPC
ln -s /var/lib/BackupPC/log /var/log/BackupPC

I do this on a lot of servers so I can simply take the disk out of one
server, plug it into another, and the data+config are immediately
available.
I don't use this rpm but the one from the EPEL repo, on a CentOS 4
based system (SME Server). I think it'll work the same way with this
one.


Cheers, Daniel
 
-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: dan...@firewall-services.com
Web : http://www.firewall-services.com




Re: [BackupPC-users] backup localhost with rsync (root's password problem)

2008-04-30 Thread Daniel Berteaud
On Wednesday 30 April 2008 at 16:12 +0200, [EMAIL PROTECTED] wrote:
 Hello all,
 
 In fact, I want to back up my BackupPC server (localhost, with Fedora
 Core 6) using rsync.
 When I try to do this via the web interface, I get Unable to read 4
 bytes...
 However, when I run /usr/local/BackupPC/bin/BackupPC_dump -v -f localhost
 in a console it works, but it asks me for the root password.
 Consequently, I have to enter the password manually and then it's ok.

The default rsync command uses rsync over SSH. For localhost, SSH isn't
needed. Edit RsyncClientCmd for this host and put something like this:

/usr/bin/sudo $rsyncPath $argList+

(same for the restore command)

Then allow the backuppc user to run rsync with sudo without a password
(visudo); it'll be a line like this:

backuppc ALL=(root) NOPASSWD:/usr/bin/rsync
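Putting the two pieces together, the per-host config might look like the sketch below. RsyncClientRestoreCmd is the restore-side counterpart mentioned above; treat the exact paths as assumptions for your system:

```perl
# Sketch for the localhost per-host config: run rsync locally through
# sudo instead of over ssh. $rsyncPath and $argList+ are expanded by
# BackupPC itself.
$Conf{RsyncClientCmd}        = '/usr/bin/sudo $rsyncPath $argList+';
$Conf{RsyncClientRestoreCmd} = '/usr/bin/sudo $rsyncPath $argList+';
```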


 
 So how to configure this password via the web interface ?
 
 Thanks a lot for your help.
 
 Romain
 
-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: [EMAIL PROTECTED]
Web : http://www.firewall-services.com




Re: [BackupPC-users] Cannot move hard links using BackupPC_tarPCCopy

2008-04-30 Thread Daniel Berteaud
On Wednesday 30 April 2008 at 23:02 +0800, [EMAIL PROTECTED] wrote:
 Hi,
 I have set up BackupPC on Kubuntu as a guest OS sitting on top of
 Windows XP using vmWare Server. Thus, my full set up is:
 - Windows XP Prof
 - vmWare Server
 - Kubuntu 8.04 Hardy Heron (as a guest OS in vmWare)
 - virtual disk on a USB 320G disk
 - Current backup directory: /var/lib/backuppc
 - New backup directory: /mnt/myusbdrive/backuppc
 
 I was able to do a test backup which went to the default set up
 of /var/lib/backuppc/pc
 
 However, to have a proper set up I need to increase my disk space for
 the backup. I created a virtual disk that is sitting on the USB drive.
 
 I created the folders under /mnt/myusbdrive/backuppc :
 - pc
 - cpool
 - pool
 - log
 
 The permissions and ownership for each are:
 - drwxr-xr-x  for pc, cpool,  pool
 - drwxr-x---  for the log directory
 - drwxr-x---  for the pc/leo directory
 
 I stopped the backuppc process with /etc/init.d/backuppc stop
 
 I then did a su to be the user backuppc and then navigated my way to
 /mnt/myusbdrive/backuppc/pc
 
 Once in the folder, I executed the command,
 /usr/share/backuppc/bin/BackupPC_tarPCCopy /var/lib/backuppc/pc | tar
 xvPf -

You first need to copy your pool and cpool directories using rsync or
cp (I use rsync -avP /source /dest). For example:

cd /var/lib/backuppc
rsync -avP --exclude=pc/ ./ /mnt/myusbdrive/backuppc/

Then you'll be able to re-create the hardlinks with BackupPC_tarPCCopy.

Cheers, Daniel


 
 Unfortunately, each time I have tried I receive the following error
 (for each and every file):
 tar: ./leo/5/fZleo/attrib: Cannot hard link to
 `../pool/7/1/a/71a84edaf55c1badb085a7937fc09c15': No such file or
 directory
 
 BTW, having a look at the folder, I can see that it has made the
 directories but no files. They have the f in front of each directory
 name.
 
 I've looked around on the net, but either the answer is there and I
 can't see it from the trees, or ... well it's not.
 
 I would appreciate any pointers on what I'm doing wrong.
 
 Thanks  
-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 15 32
Mail: [EMAIL PROTECTED]
Web : http://www.firewall-services.com




Re: [BackupPC-users] remote replication with unison instead of rsync?

2007-12-06 Thread daniel berteaud
On Thu, 6 Dec 2007 07:53:00 -0700, dan [EMAIL PROTECTED] wrote:

 I am currently cloning my backuppc server to a remote server in
 another city via rsync. As I understand it, as my file count grows,
 rsync will become unfriendly.
 
 i am completely unfamiliar with 'unison' and am wondering if anyone
 know is unison has the same issue as rsync with many many files and
 memory usage?
 
 im looking to replace my rsync replication method with one that is
 less problematic
 
 thanks

I've never used unison, but it uses the same algorithm as rsync; the
difference is that unison allows bi-directional synchronization, so it
won't help here.

Cheers

-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 82 05
Mail: [EMAIL PROTECTED]



Re: [BackupPC-users] Off site backup

2007-12-03 Thread daniel berteaud
On Mon, 03 Dec 2007 16:30:09 +0100, Koen Linders [EMAIL PROTECTED] wrote:

 Rsyncing will crush your machine depending on specs and amount of
 data.
 
 Some data here:
 
 Server:
 Debian
 Xeon 2.8 GHz
 2 GB DDR2
 Raid 5: 460 GB
 External HD 500 GB
 
 34 XP desktops/laptops
 
 Pool is 115.04GB comprising 865416 files and 4369 directories (as of
 27/11 01:37),
 Pool hashing gives 383 repeated files with longest chain 30,
 Nightly cleanup removed 2886 files of size 3.21GB (around 27/11
 01:37), Pool file system was recently at 55% (3/12 16:17), today's
 max is 55% (30/11 14:23) and yesterday's max was 57%.
 
 Trying to rsync all the data (yes, all hosts, with a lot of hardlinks
 and the pool) didn't work anymore. I learned this the painful way.
 
 At night I stop the necessary services (samba/backuppc), unmount the
 partition and dd everything to an external disk.
 I replace the disk after one week and take the other one home with me.
 

I'm just finishing some scripts to make offsite backups of BackupPC
data:
6 scripts allowing you to copy your backups in two different ways
(archives of hosts, or the entire pool) to three different destinations
(a local directory, a USB drive, or a remote host with SSH enabled).
These scripts use BackupPC_tarPCCopy to copy the pc directory, which
uses very little RAM; the other data (pool, cpool and logs) are rsynced
in three passes (one pass for the logs and all the small data, one for
cpool and a last one for pool — this way, even with a quite huge pool,
you won't run out of RAM).
I'll post as soon as it's ready for testing.
These scripts were written for SME Server, which is CentOS based (I've
packaged BackupPC for this distro), so maybe they'll need some tweaking
to work correctly.
I'll keep you informed.


 
 
 On Mon, 03 Dec 2007 16:05:37 +0100, alex [EMAIL PROTECTED] wrote:
 
  On Mon, Dec 03, 2007 at 07:58:19AM -0700, dan wrote:
  i installed an exact copy of my backup server at another store 30
  miles away.  i have backuppc installed but not running on the
  offsite machine and
  i have cron rsync with some fancy options each day.  if my main
  backuppc
 
  hm, interesting. I thought rsyncing didn't work for the BackupPC
  rep. What fancy rsync options do you use besides --hard-links?
 
 
 
 
 
 


-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 82 05
Mail: [EMAIL PROTECTED]



Re: [BackupPC-users] Using SCSI Tape with Archiver

2007-10-08 Thread Daniel Berteaud
On Mon, 08 Oct 2007 15:28:32 -0400, Jonathan Nelson [EMAIL PROTECTED] wrote:

 Hi:
 
 I'm trying to use a SCSI Tape Drive with Archiver but, when I run the 
 archiver from the web, the log says /dev/st0: Permission denied.
 
 I've added the user respaldo to the Tape group and if i run the 
 command directly from the console with the user respaldo it works
 fine.
 
 
 Thanks for your help
 
 
Well, adding the user backuppc to the tape group (and chmod
g+w /dev/st0, but it seems that's already done) should work.



Re: [BackupPC-users] copy pool to another filesystem formatted differently

2007-09-29 Thread Daniel Berteaud
Le Fri, 28 Sep 2007 20:00:49 -0600,
Ben Nickell [EMAIL PROTECTED] a écrit :

 I have set up my new backuppc filesystem on an LVM volume and decided
 on using ext3 for the filesytem.  The old filesytem is reiserfs.
 I would like to copy the old filesystem to the new larger volume.
 The backuppc FAQ states:
 
 The best way to copy a pool file system, if possible, is by
 copying the raw device at the block level (eg: using dd). Application
 level programs that understand hardlinks include the GNU cp program
 with the -a option and rsync -H. However, the large number of
 hardlinks in the pool will make the memory usage large and the copy
 very slow. Don't forget to stop BackupPC while the copy runs.
 
 
 At the risk of exposing my ignorance, If I use dd won't it copy 
 filesystem information as well?   Maybe I'm just tired and not
 thinking clearly. but would something like this work?
 
 dd if=/var/mapper/vg-reiserfs_volume
 of=/var/mapper/vg-larger_ext3volume
 
 I know that rsync would take a really long time.  (about 1.1tb of
 data, with all the hardlinks)  Should I just try cp -a?
 
 Any other suggestions?
 
 Thanks,
 Ben

To move the pool, I use rsync to transfer everything except the
directory pc (which contains all the hard links; the other
directories can be transferred with rsync -avP or something like that).

Then I use the script BackupPC_tarPCCopy, which is part of BackupPC 3.0.
This script writes a raw tar archive on standard output which
contains the whole pc directory (or just a part, depending on what you
want). I just pipe this output into another tar command which extracts
on the fly to the destination I want.

It will give something like this:

BackupPC_tarPCCopy /path/to/data/dir/pc | (cd /destination/pc && tar xPf -)

It's not very fast, but it works. And you can also use something like
this to send the pool to a remote host: you just have to pipe the output of
BackupPC_tarPCCopy to ssh like this:

BackupPC_tarPCCopy /path/to/data/dir/pc | ssh [EMAIL PROTECTED] "(cd /destination/pc && tar xPf -)"

You can also compress the archive on the fly:

BackupPC_tarPCCopy /path/to/data/dir/pc | gzip -c | ssh [EMAIL PROTECTED] "(cd /destination/pc && tar xzPf -)"


But for 1.1 TB, I'm not sure how long it will take.
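The compress-and-extract pipeline above can be exercised locally before involving ssh. A throwaway sketch (paths are made up; BackupPC itself is not needed to test the tar | gzip | tar shape):

```shell
# build a tiny tree, stream it through gzip, and unpack it elsewhere --
# the same pipeline shape as BackupPC_tarPCCopy | gzip -c | tar xzf -
set -e
demo=$(mktemp -d)
mkdir -p "$demo/src" "$demo/dst"
echo hello > "$demo/src/file"
( cd "$demo/src" && tar cf - . ) | gzip -c | ( cd "$demo/dst" && tar xzf - )
cat "$demo/dst/file"    # prints: hello
rm -rf "$demo"
```

Swapping the final subshell for `ssh host '(cd /destination/pc && tar xzPf -)'` gives the remote variant.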

Cheers, Daniel



Re: [BackupPC-users] filesystem recommendation

2007-09-24 Thread daniel berteaud
Le Sat, 22 Sep 2007 20:45:22 -0400,
Doug Lytle [EMAIL PROTECTED] a écrit :

 Ben Nickell wrote:
  so I'm 
  thinking of moving to something else that journals data in
  additional to metadata but feel free to try to talk me out
  changing.  Any thing I 
 

 

Just to share my experience.
I've set up a backup server running BackupPC 3.0 on an SME Server
(CentOS based).
For the disks I've configured a big RAID5 array (on a PERC 5/i) with
9x750 GB plus 1x750 GB as hot spare.
Then, with LVM, I've created a logical volume of about 3.5 TB, which
I've formatted as ext3. I keep the free space for some future use.

Everything is working, and the bottleneck wasn't the disks but the
processor (an Intel Xeon dual core 2.8 GHz), so I've added a second
processor and now I can reach the maximum performance of the disk array.
But even though the disk array is now the bottleneck, its maximum
performance is rarely reached, because backups occur at different times.




[BackupPC-users] suggestions: tag a backup

2007-09-06 Thread daniel berteaud
Hi all.
I just want to make a suggestion to (IMHO) enhance backuppc.
I think it could be very useful to have the possibility to tag a
backup from the CGI for example, I perform a full backup each time a
update my servers, and I'd like to tag the backup like this

pre-update 7.2

This way I could easily find the last backup of my system before the
update to 7.2 (this is just an example, I use SME server 7.2)


Another feature I'd like to have would be the possibility to lock a
backup in order to prevent it from beeing deleted

A check box on each backup for example, if it's selected, this backup
will never be deleted


I don't have the skills to code it myself, so I just give ideas, anyone
else would like to have this features?

Cheers, Daniel



Re: [BackupPC-users] BackupPC 3.1.0beta0 released

2007-09-05 Thread daniel berteaud
 small patch from Sergey to bin/BackupPC_archiveHost.
 
 * Changed BackupPC_sendEmail so that summary admin email doesn't
   include errors from hosts that have $Conf{BackupsDisable} set.
   Reported by James Kyle.  Also, per-user email is now disabled
   when $Conf{BackupsDisable} is set.
 
 * Added RsyncdUserName to the config editor.  Reported by Vicent Roca
 Daniel.
 
 * $Conf{IncrLevels} is now defaulted if it is not defined.
 
 * configure.pl clears $Conf{ParPath} if it doesn't point to a valid
   executable.
 
 * Added documentation for BackupPC_tarPCCopy, including use of -P
 option to tar suggested by Daniel Berteaud.
 
 * Config editor now removes white space at start of exec path.
   Reported by Christoph Iwasjuta.
 
 * $Conf{CgiDateFormatMMDD} == 2 gives a YYYY-MM-DD format for CGI dates,
   suggested by Imre.
 





Re: [BackupPC-users] shortcut

2007-07-24 Thread daniel berteaud
Le Tue, 24 Jul 2007 09:56:34 +0200,
Edouard de Ganay [EMAIL PROTECTED] a écrit :

 Hi there again,

Hi
 
 Another question,
 
 all backups are stores in /opt/backuppc/files/pc
 which is find
Looks like you're using my contrib for sme server :)
 
 I'd like to make these files public (readonly) in my samba server (in
 sme7.2), which means that files have to be under
 /home/e-smith/files/ibays/backup(name of my share on the server)/files

The files in the pool are stored in a special way; they cannot be used
directly through a Samba share (especially if they are compressed).
 
 I could change the parameter : topdir in backuppc to the acurate
 folder, but it would also put the entire /opt/bacuppc/files in the
 samba share, and not only the subfolder /pc
Changing the topdir won't work; you'd need to change the path in some
scripts (/usr/local/BackupPC/bin/...). Anyway, you cannot move just the
subfolder pc/; you need to move the whole data dir.
 
 Then I thought that there might be the possibility to create a
 shortcut saying /home/e-smith/files/ibays/backup/files is a shortcut
 of the real /opt/backuppc/files/pc folder
I never tried this way, but I think it's not a good idea
 
 but I don't know how to do it,
 anyone could help me ?
 
 thanks,

Sorry, but I think this cannot be done. Just look at the pc/ directory,
for example backup number 35 of localhost:

ll /opt/backuppc/files/pc/localhost/35/f%2f/



 
 Edd
-rw-r-----   3 backuppc backuppc  227 jun 25 21:09 attrib
drwxr-x---   2 backuppc backuppc 4096 jun 26 21:02 fbin
drwxr-x---   4 backuppc backuppc 4096 jun 26 21:02 fboot
drwxr-x---   2 backuppc backuppc 4096 jun 26 21:02 fcommand
drwxr-x---  63 backuppc backuppc 4096 jun 26 21:03 fetc
drwxr-x---   4 backuppc backuppc 4096 jun 26 21:03 fhome
drwxr-x---   2 backuppc backuppc 4096 jun 26 21:03 finitrd
drwxr-x---  10 backuppc backuppc 4096 jun 26 21:03 flib
drwxr-x---   2 backuppc backuppc 4096 jun 26 21:03 flost+found
drwxr-x---   5 backuppc backuppc 4096 jun 26 21:03 fmedia
drwxr-x---   2 backuppc backuppc 4096 jun 26 21:03 fmnt
drwxr-x---   2 backuppc backuppc 4096 jun 26 21:03 fopt
drwxr-x---   3 backuppc backuppc 4096 jun 26 21:03 fpackage
drwxr-x---   6 backuppc backuppc 4096 jun 26 21:03 froot
drwxr-x---   3 backuppc backuppc 4096 jun 26 21:03 fsbin
drwxr-x---   4 backuppc backuppc 4096 jun 26 21:03 fselinux
drwxr-x---   2 backuppc backuppc 4096 jun 26 21:03 fservice
drwxr-x---   2 backuppc backuppc 4096 jun 26 21:03 fsrv
drwxr-x---  15 backuppc backuppc 4096 jun 26 21:05 fusr
drwxr-x---  24 backuppc backuppc 4096 jun 26 21:06 fvar

You have an 'f' before each file name. If you are using compression,
you'll need to zcat each file before using it:

/usr/local/BackupPC/bin/BackupPC_zCat \
/opt/backuppc/files/pc/localhost/35/f%2f/fetc/fshadow > \
/path/where/you/want/your/file/to/be/restored

It's not very practical.
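For scripted restores, the name mangling can be undone mechanically. A rough sketch (the `unmangle` helper is mine, not part of BackupPC): strip the leading 'f' from each path component, then percent-decode, where '%2f' stands for '/' and '%25' for '%':

```shell
# undo BackupPC's file-name mangling for a single path component
unmangle() {
  printf '%s\n' "$1" | sed -e 's/^f//' -e 's/%2f/\//g' -e 's/%25/%/g'
}
unmangle 'fetc'     # prints: etc
unmangle 'fshadow'  # prints: shadow
unmangle 'f%2f'     # prints: /
```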


Cheers, Daniel
 






Re: [BackupPC-users] shortcut

2007-07-24 Thread daniel berteaud
Le Tue, 24 Jul 2007 19:04:45 +1000,
Les Stott [EMAIL PROTECTED] a écrit :

 Edouard de Ganay wrote:
  Hi there again,
 
  Another question,
 
  all backups are stores in /opt/backuppc/files/pc
  which is find
 
  I'd like to make these files public (readonly) in my samba server
  (in sme7.2), which means that files have to be under
  /home/e-smith/files/ibays/backup(name of my share on the
  server)/files
 
  I could change the parameter : topdir in backuppc to the acurate
  folder, but it would also put the entire /opt/bacuppc/files in the
  samba share, and not only the subfolder /pc
 
  Then I thought that there might be the possibility to create a
  shortcut saying /home/e-smith/files/ibays/backup/files is a
  shortcut of the real /opt/backuppc/files/pc folder
 
  but I don't know how to do it,
  anyone could help me ?
 

 sorry, it can't be done. It's not designed for that type of access.
 
 The only way to view these files is via the BackupPC web interface,
 browsing each host.
 
 perhaps an interesting feature request might be a pool browser, so 
 that you could look at the entire pool via the web interface without 
 going into each host. Not sure how hard that would be, probably 
 difficult, Craig could answer that. If that were possible that level
 of access would have to be restricted to administrators only.
 

Maybe it could be done with a FUSE filesystem, or something like that,
but my skills are far too limited for this kind of development.
 Regards,
 
 Les
 
 
 





Re: [BackupPC-users] RsyncdUSerName and CGI

2007-07-05 Thread daniel berteaud
Le Thu, 05 Jul 2007 17:27:25 +0200,
[EMAIL PROTECTED] a écrit :

 Hello,
 
 I've a little problem with the CGI interface when I want to add a  
 Windows computer. For Windows computer I use Rsyncd method and I
 can't specify the RsyncdUserName, the field doesn't appear in the
 web interface. So, I've to write it in the configuration file with
 vim on the Backup Server.
 I tried to find an error in the EditConfig.pm file but I saw nothing.
 Does someone have a solution?
 Thanks for your help
 Benoit
 

It's a little bug in the interface; here's the fix from Craig Barratt:



 Yes, this is a bug.  To fix it:

 - In lib/BackupPC/Config/Meta.pm, after this line:

       RsyncdPasswd  => "string",

   add a new line:

       RsyncdUserName => "string",

 - In your main config.pl file, add a new entry to
   $Conf{CgiUserConfigEdit} that says:

       'RsyncdUserName' => '1',

I've added this to the todo list for the next release.

Craig





Re: [BackupPC-users] How to archive only part of a backup?

2007-06-18 Thread daniel berteaud
Le Mon, 18 Jun 2007 15:04:07 +0200,
Guy Malacrida [EMAIL PROTECTED] a écrit :

 Thank you for your quick reply.
 I know and I use include/exclude (only exclude really) directories.
 What I want to achieve is not changing the regular backup, which is
 the whole machine, but only archiving part of it.
 In other words, backup the whole machine but only archive two
 directories. Thanks!
 Guy
 
 
 2007/6/18, Nils Breunese (Lemonbit) [EMAIL PROTECTED]:
 
  Guy Malacrida wrote:
 
   I am a very happy user of BackupPC, backing up two home Windows PC
   (XP+Vista), one Ubuntu desktop and one Kubuntu laptop.
   From the WinXP machine I backup (220GB) I'd like to archive only
   the music and images representing some 40/50GB still manageable on
   DVDs.
   Is there any way to do this?
 
  Sure, using includes/excludes you can specify precisely what files/
  directories you want backed up. See the $Conf{BackupFilesOnly}
  setting: http://backuppc.sourceforge.net/faq/
  BackupPC.html#item__conf_backupfilesonly_
 
  Nils Breunese.
 

You can archive just one share if you want. So you can configure your
host with two shares:
- the whole machine (including or excluding the part you want to
archive, as you wish)
- only the part you want to archive

Then specify on the command line which share you want to archive (with
the -s option).

You can also create a virtual host which points to the real host you
want (with $Conf{ClientNameAlias}) and which you can configure with
different share(s).
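As a sketch, the virtual-host idea could look like this in a per-host config file (the host name, share paths, and file name are hypothetical; $Conf{ClientNameAlias} and $Conf{RsyncShareName} are standard BackupPC settings):

```perl
# hypothetical file: pc/winxp-archive.pl -- a virtual host that contacts
# the real machine 'winxp' but only backs up the shares to be archived
$Conf{ClientNameAlias} = 'winxp';                  # real host to contact
$Conf{RsyncShareName}  = ['/Music', '/Images'];    # hypothetical share paths
```

The archive can then be run against this virtual host without touching the full-machine backups.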







[BackupPC-users] a script to copy backup on a remote host

2007-05-02 Thread daniel berteaud
/
 
EOF
exit(1);
}

# generate a random name for temporary files
sub genRandName(){
    my @c = ('A'..'Z', 'a'..'z', 0..9);
    my $randomName = join('', @c[map {rand @c} (1..8)]);
    return $randomName;
}

# check that the expected directories are present
sub verifTree($$){
    my ($source,$dest) = @_;
    my $ok = 1;

    foreach ("$source", "$source/pc", "$source/pool", "$source/cpool"){
        if (!-d $_){
            print STDERR "$_ is not a valid directory, aborting\n";
            $ok = 0;
        }
    }
    return $ok;
}

# queue all the commands, one after another, in an array
sub remotePool($$$$$){
    my ($source,$dest,$extract,$compress,$logFile) = @_;

    my $archName = 'pc.tar';
    my @cmd = ();
    my @pipe = ();
    my $tarOpts = 'xPf';
    my $main = '';


    #if ( $dest !~ /^[\w\-]+@[\w\.\-]+:[\w\/\.\-]+$/ ){
    #    print STDERR "destination should follow this format: user\@host:/path\n";
    #    exit (1);
    #}

    my @tmp = split(/@/,$dest);
    my $remoteUser = shift(@tmp);
    @tmp = split(/:/,$tmp[0]);
    my $remoteHost = shift(@tmp);
    my $remoteDir = $tmp[0];

    push(@cmd,"$sudoPath /etc/rc.d/init.d/backuppc stop");

    # if /etc/BackupPC exists, save it under $source/etc
    if(-d '/etc/BackupPC'){
        push(@cmd,"$mkdirPath -p $source/etc") if (!-d "$source/etc");
        push(@cmd,"$rsyncPath -a --del --stats /etc/BackupPC/ $source/etc/");
    }


    # non-recursive sync to create the needed directories
    push(@cmd,"$rsyncPath -qlptgoDHd --stats --del $source/ $dest/");

    # the next command syncs everything except the pc, pool and cpool
    # directories (to avoid exhausting the RAM)
    push(@cmd,"$rsyncPath -a --del --stats --exclude=cpool/ --exclude=pool/ --exclude=pc/ $source/ $dest/");

    # now sync the pool directory
    push(@cmd,"$rsyncPath -a --del --stats $source/pool/ $dest/pool/");

    # then the cpool
    push(@cmd,"$rsyncPath -a --del --stats $source/cpool/ $dest/cpool/");


    # empty the remote pc directory
    if ($remoteDir ne ''){
        push(@cmd,"$sshPath $remoteUser\@$remoteHost \"$rmPath -Rf $remoteDir/pc/*\"");
    }


    push(@pipe,"$backuppcBinPath/BackupPC_tarPCCopy $source/pc/ |");

    if ($compress eq '1'){
        push(@pipe,"gzip -c |");
        $archName = "pc.tar.gz";
        $tarOpts = 'xPzf';
    }
    if ($extract eq '1'){
        push(@pipe,"$sshPath $remoteUser\@$remoteHost \"(cd $remoteDir/pc/ && $tarPath $tarOpts -)\"");
    }
    else{
        push(@pipe,"$sshPath $remoteUser\@$remoteHost \"cat > $remoteDir/pc/$archName\"");
    }

    foreach (@pipe){
        $main = $main.$_;
    }

    push(@cmd,$main);

    push(@cmd,"$sudoPath /etc/rc.d/init.d/backuppc start");
    #print STDERR "logfile is $logFile\n";
    perform($logFile,$remoteHost,@cmd);
}

# execute the queued commands one after another
sub perform($$@){
    my ($logFile,$remoteHost,@cmd) = @_;
    foreach (@cmd){
        print "\n\nexecuting command\n$_\n";
        #print STDERR "$_\n";
        system($_);
    }
    print "\n\n- End of copy -\n\n\n";
    system("$catPath $logFile >> /var/log/BackupPC/export/export.$remoteHost.log");
    system("$rmPath -f $logFile");
}



##






Re: [BackupPC-users] Howto backup BackupPC server

2007-03-09 Thread daniel berteaud
Le Thu, 8 Mar 2007 15:20:51 -0800,
Fabio Milano [EMAIL PROTECTED] a écrit :

 Thanks for all the ideas and great feedback.
 
 VPN is a possibility.
 
 The script looks feasible as well.
 
 I found a Backuppc RPM for SME 7 that apparently has an offsite rsync
 feature.
 
 Has anybody used this before?
 
 thanks
 
 

Hi. If you're talking about the rpm smeserver-backuppc-1.0-5 (which you
can find on http://sme.firewall-services.com), then be careful when
using the remote copy with rsync. I made this contrib myself last year,
when I was still quite a noob in Linux in general, and I just wrote a
very simple script which rsyncs all the files in one pass. When I was
testing it, it worked because I only had a little data backed up. But
now, on a production server, I have a 36 GB cpool (with a lot of little
files), and when the copy occurs, rsync takes about 700 MB of RAM. So it
can work, but be careful.
I'm now working on the integration of BackupPC 3.0 into SME 7.x, and I'm
writing a new script to copy the pool to a remote host, using the script
BackupPC_tarPCCopy to copy the pc/ directory and rsync to transfer
pool and cpool. I think it'll be much more efficient. I haven't
finished it yet, but I'll post it as soon as it's ready.

Best regards

-- 
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 82 05
Mail: [EMAIL PROTECTED]
Web : http://www.firewall-services.com



Re: [BackupPC-users] using BackupPC_tarPCCopy

2007-02-17 Thread daniel berteaud
Le Fri, 16 Feb 2007 17:16:00 -0800,
Craig Barratt [EMAIL PROTECTED] a écrit :

 Daniel,
 
 What version of tar are you using?
 
 Craig

I'm using tar-1.14-12.RHEL4.
But I found the problem yesterday: I need to use the -P option for
tar. If I don't, I get this message:

removing leading ../ from hardlink targets

So now, if I do

sudo -u backuppc BackupPC_tarPCCopy /opt/backuppc/files/pc/ | tar xvPf -

it's working (I've not fully tested it, but the links are restored
without error messages now). I haven't tested it through the ssh tunnel
yet, but there's no reason it shouldn't work as well.
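The hardlink handling that makes -P matter here can be checked with plain GNU tar and throwaway files (a sketch; nothing BackupPC-specific, and `stat -c %i` assumes GNU coreutils):

```shell
# create a hardlinked pair, archive it, extract it elsewhere, and check
# that both names still share one inode -- as pool entries must
set -e
work=$(mktemp -d)
mkdir -p "$work/src"
echo data > "$work/src/a"
ln "$work/src/a" "$work/src/b"        # hardlink, like a pool entry
( cd "$work" && tar cf pc.tar src )   # tar records b as a link to a
mkdir -p "$work/out"
tar xPf "$work/pc.tar" -C "$work/out"
stat -c %i "$work/out/src/a"          # same inode number...
stat -c %i "$work/out/src/b"          # ...as this one
rm -rf "$work"
```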

Thanks for your help. I can now write my script to copy all the data:

- to a local directory
- to a removable medium
- to a remote host

I'll share it as soon as it's finished.

The script BackupPC_tarPCCopy is exactly what was missing in
previous versions of BackupPC to copy the backup data off-site.

Thanks again for all your help and your work.
Cheers, Daniel




Re: [BackupPC-users] using BackupPC_tarPCCopy

2007-02-16 Thread daniel berteaud
Le Mon, 5 Feb 2007 20:58:45 -0800,
Craig Barratt [EMAIL PROTECTED] a écrit :

 daniel writes:
 
  Hi everyone. I'm trying to backup all the data of a backuppc
  server. I saw that backuppc 3.0 has a script to copy the directory
  pc/
  
  So first I used rsync to copy everything except the directory pc/
  
  rsync -aH --exclude /opt/backuppc/files/pc/ /opt/backuppc/ \
  [EMAIL PROTECTED]:/opt/backuppc/
  
  This part works well, it doesn't take too much memory.
  
  Now I'm trying to copy the directory pc with the script. I'd like to
  copy it directly without using a temporary file. So I tried this
  command:
  
   sudo -u backuppc BackupPC_tarPCCopy /opt/backuppc/files/pc/ | gzip -c \
   | ssh [EMAIL PROTECTED] "(cd /opt/backuppc/files/pc && tar xzf -)"
  
   it seems to work but on the sender side (the server I want to
   back up) I've got plenty of error messages like this (translated
   from French):
  
   tar: ./127.0.0.1/149/f%2f/fvar/fspool/fsquid/f03/attrib: cannot
   create a link to `cpool/3/5/3/3530222493619acd2106c996347e9034':
   No such file or directory
 
 That's strange: the path in the hardlink should start with ../cpool
 not cpool.
 
 What output do you get when you do this:
 
 sudo -u backuppc BackupPC_tarPCCopy /opt/backuppc/files/pc/ | tar
 tvf -
 
 In particular, do you see this (correct):
 
 hrw-r----- craig/None 0 2006-11-30 12:00 ./craigdell/12/attrib
 link to ../cpool/a/b/d/abde98f8cac551cb9273c093574dff6d
 
 or this (incorrect):
 
 hrw-r----- craig/None 0 2006-11-30 12:00 ./craigdell/12/attrib
 link to cpool/a/b/d/abde98f8cac551cb9273c093574dff6d
 
 Craig
 
 Craig

I've just tested

sudo -u backuppc BackupPC_tarPCCopy /opt/backuppc/files/pc/ | tar tvf -

and it seems to be OK:

hrw-r----- backuppc/backuppc 0 2007-02-10 12:00:31 ./127.0.0.1/5/f%2f/fetc/fe-smith/ftemplates/fetc/attrib link to ../cpool/9/1/d/91da064e7cfb069df88fd0cd3ed3af94
drwxr-x--- backuppc/backuppc 4096 2007-02-12 12:00:47 ./127.0.0.1/5/f%2f/fetc/fe-smith/ftemplates/fetc/frc.d/
hrw-r----- backuppc/backuppc 0 2007-02-08 12:00:16 ./127.0.0.1/5/f%2f/fetc/fe-smith/ftemplates/fetc/frc.d/attrib link to ../cpool/5/c/3/5c3eac65e38730ad8a965d1f8422572f

but when I extract it in the pc directory, I always get the same
error messages (translated from French):


tar: ./127.0.0.1/5/f%2f/fetc/fe-smith/ftemplates/fetc/frc.d/finit.d/attrib:
cannot create a link to
`cpool/4/0/7/407e9cbad687de3d3396506a74c2c360': No such file or
directory

If I extract it in the parent directory (/opt/backuppc/files), which
contains pool, cpool, pc, etc., it seems to work (no more error
messages about hard links), but the target directories are created
under /opt/backuppc/files instead of /opt/backuppc/files/pc/
(for example, backups for host 127.0.0.1 end up
in /opt/backuppc/files/127.0.0.1 instead
of /opt/backuppc/files/pc/127.0.0.1).


I don't understand what I'm doing wrong: the archive seems to contain
the right links (pointing to ../cpool/), but when I extract it, the
hard links seem to point to cpool/.

Is there any documentation about the script BackupPC_tarPCCopy on the
net?





[BackupPC-users] using BackupPC_tarPCCopy

2007-02-05 Thread daniel berteaud
Hi everyone. I'm trying to back up all the data of a BackupPC server. I
saw that BackupPC 3.0 has a script to copy the pc/ directory.

So first I used rsync to copy everything except the directory pc/

rsync -aH --exclude /opt/backuppc/files/pc/ /opt/backuppc/ \
[EMAIL PROTECTED]:/opt/backuppc/

This part works well, it doesn't take too much memory.

Now I'm trying to copy the directory pc with the script. I'd like to
copy it directly without using a temporary file. So I tried this
command:

sudo -u backuppc BackupPC_tarPCCopy /opt/backuppc/files/pc/ | gzip -c \
| ssh [EMAIL PROTECTED] "(cd /opt/backuppc/files/pc && tar xzf -)"

it seems to work but on the sender side (the server I want to back up)
I've got plenty of error messages like this (translated from French):

tar: ./127.0.0.1/149/f%2f/fvar/fspool/fsquid/f03/attrib: cannot
create a link to `cpool/3/5/3/3530222493619acd2106c996347e9034':
No such file or directory

I get the same error messages if I do something like this:

sudo -u backuppc BackupPC_tarPCCopy /opt/backuppc/files/pc > pc.tar
scp -rP pc.tar [EMAIL PROTECTED]:/tmp

then ssh into the remote server and extract the archive like this:

cd /opt/backuppc/files/pc
tar xf /tmp/pc.tar

Another strange thing: when I get a message like

tar: ./127.0.0.1/149/f%2f/fvar/fspool/fsquid/f03/attrib: cannot
create a link to `cpool/3/5/3/3530222493619acd2106c996347e9034':
No such file or directory

the file cpool/3/5/3/3530222493619acd2106c996347e9034 exists on the
receiver side (as I used rsync to copy the pool before), so I don't
understand the error.

So my questions are:
Does anyone know why these errors occur?
Am I using the script correctly?
Has anyone already tried something like this?

Thanks in advance.

Cheers, Daniel




Re: [BackupPC-users] BackupPC Data Directory on a Network Attached Storage

2007-01-25 Thread daniel berteaud
Le Thu, 25 Jan 2007 20:29:05 +0100,
Simon Köstlin [EMAIL PROTECTED] a écrit :

 Hi,
 
  
 
 I want to have the Data Directory on a Network Attached Storage (NAS)
 and not on the BackupPC Server. The NAS supports NFS, SMB, FTP, CIFS
 and SSH. I tried to mount an NFS Share on the NAS and that works
 well. So I can use the Data Directory in this Share. But the NAS
 supports only a UDP connection with NFS. Are there any other
 solutions to use a TCP connection? I tried also SMB, but that did not
 work with BackupPC. I mounted a Share with SMB and that works, but
 when I wanted to start BackupPC, BackupPC did not start. I only found
 a log directory on the Share with the LOG file in which was an error
 like bind() failed. Does anybody know why SMB does not work with
 BackupPC? Or are there any other solutions like to mount a FTP
 connection?
 

It cannot work because SMB and FTP don't support hardlinks.
I don't know what the best solution is for storing the data on another
server.
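Before pointing the data directory at a network mount, it is worth probing whether the underlying filesystem actually supports hardlinks, since BackupPC's pool depends on them. A rough sketch (the mount point path is an example; `stat -c %i` assumes GNU coreutils):

```shell
# Check whether the filesystem behind a directory supports hardlinks.
check_hardlinks() {
    dir=${1:-.}
    a="$dir/.hltest_a"; b="$dir/.hltest_b"
    rm -f "$a" "$b"
    echo probe > "$a" || return 1
    # A working hardlink means both names share one inode number.
    if ln "$a" "$b" 2>/dev/null && [ "$(stat -c %i "$a")" = "$(stat -c %i "$b")" ]; then
        echo "hardlinks OK"
        rc=0
    else
        echo "hardlinks NOT supported"
        rc=1
    fi
    rm -f "$a" "$b"
    return $rc
}

check_hardlinks /tmp    # replace /tmp with the NAS mount point
```

An SMB or FTP mount would print "hardlinks NOT supported", which is why those transports cannot host the pool.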




Re: [BackupPC-users] Client side config.pl not working on schedule

2006-11-09 Thread daniel berteaud
On Thu, 9 Nov 2006 02:12:22 -0500,
nilesh vaghela [EMAIL PROTECTED] wrote:

The schedule cannot be overridden by a per pc config. You should play
with the blackout periods instead.


 Dear All,
 
 My client side config.pl not working as far as concern the schedule.
 
 the xx.xx.xx.xx.pl is as below : what should I check ???
 
 $Conf{BackupFilesOnly} = {
   '*' => [
 '/home',
 '/cygdrive/c/bamap'
   ]
 };
 $Conf{ClientCharset} = '';
 $Conf{RsyncArgs} = [
   '--numeric-ids',
   '--perms',
   '--owner',
   '--group',
   '-D',
   '--links',
   '--hard-links',
   '--times',
   '--block-size=2048',
   '--recursive'
 ];
 $Conf{RsyncClientCmd} = '$sshPath -q -x -l Administrator $host
 $rsyncPath $argList+';
 $Conf{RsyncCsumCacheVerifyProb} = '0.01';
 $Conf{XferLogLevel} = 1;
 $Conf{XferMethod} = 'rsync';
 $Conf{RsyncClientRestoreCmd} = '$sshPath -q -x -l Administrator $host
 $rsyncPath $argList+';
 $Conf{RsyncRestoreArgs} = [
   '--numeric-ids',
   '--perms',
   '--owner',
   '--group',
   '-D',
   '--links',
   '--hard-links',
   '--times',
   '--block-size=2048',
   '--relative',
   '--ignore-times',
   '--recursive'
 ];
 $Conf{BackupFilesExclude} = {
   '*' => [
 '*.bmp'
   ]
 };
 $Conf{RsyncShareName} = [
   '/'
 ];
 $Conf{WakeupSchedule} = [
   11,
   18,
   19,
   20,
   21,
   22
 ];
 $Conf{BlackoutPeriods} = [
   {
 'hourEnd' => '19.5',
 'weekDays' => [
   1,
   2,
   3,
   4,
   5
 ],
 'hourBegin' => '19.4'
   }
 ];
 
 --
 




Re: [BackupPC-users] mysql backup isn't working

2006-11-08 Thread daniel berteaud
You shouldn't back up MySQL databases this way. The best approach is a
pre-dump script which dumps the databases you want.
For example:

mysqldump --add-drop-table -A -Q > /home/backup/mysql.sql

This command dumps all the MySQL databases and saves them as an SQL
script in /home/backup. You can then include this file as a standard
file in the BackupPC configuration.
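The pre-dump idea can be wrapped in a small script that BackupPC runs before each backup (for instance via $Conf{DumpPreUserCmd}). The sketch below uses example paths, and guards the mysqldump call only so that it runs on hosts without MySQL installed:

```shell
#!/bin/sh
# Pre-dump hook sketch: write the database dump into a directory that
# BackupPC already backs up as ordinary files.
BACKUP_DIR=${BACKUP_DIR:-/tmp/backup}   # the post above uses /home/backup
mkdir -p "$BACKUP_DIR"
if command -v mysqldump >/dev/null 2>&1; then
    # Real payload: dump every database (credentials e.g. via ~/.my.cnf).
    mysqldump --add-drop-table -A -Q > "$BACKUP_DIR/mysql.sql"
else
    # Placeholder so the sketch still runs where MySQL is absent.
    echo "-- mysqldump not available on this host" > "$BACKUP_DIR/mysql.sql"
fi
```

Because the dump is a regular file, BackupPC's pooling and incrementals then apply to it like any other file in the share.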



On Wed, 8 Nov 2006 12:49:42 +0200,
Mikael Lammentausta [EMAIL PROTECTED] wrote:

 Backups of the host below (backup_client) work fine for the first two
 folders, but not for /var/lib/mysql. I tried to manually rsync this
 folder to /tmp/backup as the user backuppc, and it worked. I wonder
 why it doesn't work when the BackupPC daemon does it. Any ideas?
 
 Ps. mysql is started, and backuppc doesn't and shouldn't have rights
 to shut it down.
 
 The Xfer error log:
 
 Running: /usr/bin/ssh -q -x -l backuppc backup_client /usr/bin/sudo
 /usr/bin/rsync --server --sender --numeric-ids --perms --owner --group
 --devices --links --times --block-size=2048 --recursive . /etc/
 Xfer PIDs are now 12914
 Got remote protocol 29
 Xfer PIDs are now 12914,12917
 [ skipped 199 lines ]
 Done: 0 files, 0 bytes
 Running: /usr/bin/ssh -q -x -l backuppc backup_client /usr/bin/sudo
 /usr/bin/rsync --server --sender --numeric-ids --perms --owner --group
 --devices --links --times --block-size=2048 --recursive . /var/www/
 Xfer PIDs are now 12918
 Got remote protocol 29
 Xfer PIDs are now 12918,12925
 [ skipped 199251 lines ]
 Done: 23 files, 38187 bytes
 Running: /usr/bin/ssh -q -x -l backuppc backup_client /usr/bin/sudo
 /usr/bin/rsync --server --sender --numeric-ids --perms --owner --group
 --devices --links --times --block-size=2048 --recursive . /var/lib/mysql/
 Xfer PIDs are now 13065
 Got remote protocol 29
 fileListReceive() failed
 Done: 0 files, 0 bytes
 Got fatal error during xfer (fileListReceive failed)
 Backup aborted (fileListReceive failed)




Re: [BackupPC-users] WEB interface to manage config file

2006-09-29 Thread daniel berteaud
On Fri, 29 Sep 2006 14:47:34 +0200,
Alessandro Ferrari [EMAIL PROTECTED] wrote:

 Hi,
 
 is there a web interface to manage backuppc's config file?
 
 Thanks, Alessandro

In v3 (beta), there's a web interface to manage the config. For v2, I made a
contrib with a web panel to manage the config, but only for the SME Server
distribution.




[BackupPC-users] BackupPC contrib for SME server

2006-09-20 Thread daniel berteaud
I don't know if anyone is interested, but I worked on the integration of
BackupPC on SME Server. It's a very good GNU/Linux distribution; if you want
more information about it, you can go to http://contribs.org

I (and some others from the contribs.org community) use my contrib on several
production servers. It's based on v2.1.2pl2 and provided as a simple rpm to
install.

I added a cgi panel for the configuration, integrated in the main panel of
the server (the server-manager). It lets you:
- manage the host list
- modify most of the parameters in the general config.pl file
- generate a custom config.pl file per host
- periodically export all your data to one (or several) offsite server(s)
- periodically archive the selected hosts as standard tar or tar.gz archives
and, optionally, export the archives to an offsite server or move them to any
local directory (an external hard drive for example). It should also work for
tape devices, but I haven't tested that.

I think the last two functions are missing in BackupPC. It would be great if
they were included in v3.

You can find the contribution and the installation how-to at:

http://sme.firewall-services.com/backuppc



Re: [BackupPC-users] rsync trying to backup kcore

2006-09-20 Thread daniel berteaud
On Wed, 20 Sep 2006 11:23:41 -0400,
Toby Johnson [EMAIL PROTECTED] wrote:

  You should try this:
 
  $Conf{RsyncClientCmd} = '$sshPath -q -C -x -l root
  -i /path/to/keyfile $host $rsyncPath $argList+'; 
  $Conf{RsyncShareName} = ['/etc', '/var', '/home', '/root', '/usr',
  '/lib', '/lib64', '/bin', '/sbin'];
 
  $Conf{BackupFilesEsclude} = {'/var' =>
  ['/named/chroot/dev','/named/chroot/etc','/named/chroot/proc','/log',]};
 
  It should work then.
  
 
 That still doesn't work. Perhaps I should just hardcode --exclude 
 statements into the RsyncClientCmd?

Well, sorry, it's

$Conf{BackupFilesEsclude} = {'/var' =>
['named/chroot/dev','named/chroot/etc','named/chroot/proc','log',]};

without the first slash. I usually back up from the root
($Conf{RsyncShareName} = ['/'];) so I can exclude with absolute paths,
but when you exclude from another directory you must use relative
paths.




Re: [BackupPC-users] rsync trying to backup kcore

2006-09-20 Thread daniel berteaud
On Wed, 20 Sep 2006 20:40:59 +0200,
daniel berteaud [EMAIL PROTECTED] wrote:

 $Conf{BackupFilesEsclude} = {'/var' =>
 ['named/chroot/dev','named/chroot/etc','named/chroot/proc','log',]};

It's $Conf{BackupFilesExclude}; sorry for the typo in my last two messages.



Re: [BackupPC-users] Doing backups every 2 hours

2006-08-23 Thread daniel berteaud
On Tue, 22 Aug 2006 15:19:08 -0300
Vinícius Medina [EMAIL PROTECTED] wrote:

 The backup is running now, but not on the right schedule.
 
 The config.pl is like this:
 
 $Conf{TarShareName} = '/backup';
 $Conf{XferMethod} = 'tar';
 $Conf{WakeupSchedule} = [10,12,14,16,18,20,22];
 $Conf{BlackoutGoodCnt} = -1;
 $Conf{IncrPeriod} = 0.04;
 $Conf{FullPeriod} = 0.97;
 $Conf{FullKeepCnt} = 2;
 $Conf{IncrKeepCnt} = 8;
 
 But, for instance, these are the times it was doing backup:
 Backup#  Type  Filled  Start Date
 7        full  yes     8/21 13:00
 16       incr  no      8/21 23:00
 17       incr  no      8/22 07:55
 18       incr  no      8/22 09:00
 19       incr  no      8/22 10:00
 20       incr  no      8/22 11:00
 21       incr  no      8/22 12:00
 22       full  yes     8/22 13:00
 23       incr  no      8/22 14:00
 24       incr  no      8/22 15:00
 
 I wonder why it is doing so. I already reloaded the config in the cgi
 interface.
 
 Thanks, in advance, for your time.

$Conf{WakeupSchedule} cannot be superseded in per pc configuration
files; I think that's why the dump doesn't occur at the right time.
You should leave the general $Conf{WakeupSchedule} at the default
([1..23], which means every hour) and decrease $Conf{IncrPeriod} to
0.1 or 0.2 in the host config file.



Re: [BackupPC-users] Excluding directories from smb

2006-08-11 Thread daniel berteaud
On Fri, 11 Aug 2006 12:24:54 +0200
Etaoin Shrdlu [EMAIL PROTECTED] wrote:

 On Friday 11 August 2006 10:47, Tony Molloy wrote:
 
  Try something like the following.
 
  $Conf{SmbShareName} = 'D$';
 
  $Conf{BackupFilesExclude} = {
  'D$' => ['\System Volume Information']
  };
 
 Thanks for the tip, I just tried it and nothing changed (same
 messages and errors).
 
I think it should be this instead 

$Conf{BackupFilesExclude} = {
 'D$' => ['/System\ Volume\ Information']
 };



Re: [BackupPC-users] Backup to localhost?

2006-08-04 Thread daniel berteaud
The main config is always used, but in the per pc config file you can
define parameters that override (I'm not sure that's the right word in
English, I'm French :/) the same parameters in the main config.pl file.
For example, with your per pc config, all the main variables will be
used except these, which will be read from the per pc configuration
file:

$Conf{TarShareName}
$Conf{XferMethod}
$Conf{TarClientCmd}
$Conf{TarFullArgs}
$Conf{BackupFilesExclude}
$Conf{CompressLevel}
$Conf{TarIncrArgs}
$Conf{TarClientRestoreCmd}




On Fri, 04 Aug 2006 09:09:55 -0400
Rob Morin [EMAIL PROTECTED] wrote:

 How can i tell if it used the config.pl in te pc folder or the main
 one?
 
 It seemed to have work ok last night but not sure if it used the new 
 config or the main one? the new config uses this for backing up the 
 localhost
 
 # start of the per pc config #
 
 $Conf{TarShareName} = ['/'];
 
 $Conf{BackupFilesExclude}=['/proc','/sys','/dev','/tmp','/var/log', 
 '/mnt/usb', '/data/var/log', '/data/var/spool', '/data/BACKUP'];
 
 $Conf{XferMethod} = 'tar';
 
 $Conf{TarClientCmd} = '/usr/bin/sudo'
 . ' $tarPath -c -v -f - -C $shareName'
 . ' --totals';
 
 $Conf{TarFullArgs} = '$fileList';
 
 $Conf{CompressLevel} = 3;
 
 $Conf{TarIncrArgs} = '--newer=$incrDate $fileList';
 
 $Conf{TarClientRestoreCmd} = '/usr/bin/sudo'
. ' $tarPath -x -p --numeric-owner --same-owner'
. ' -v -f - -C $shareName+';
 
 # end of the per pc config #
 
 Thanks..
 
 Rob Morin
 Dido InterNet Inc.
 Montreal, Canada
 Http://www.dido.ca
 514-990-
 
 
 
 Les Mikesell wrote:
  On Thu, 2006-08-03 at 14:00 -0400, Rob Morin wrote:

  Ahh i see ok so if i force an incremental for all machines at, say 
  midnight or 11pm they will always backup at that time then? ok
  cool... 
 
  The next one won't start until about 24 hours has elapsed since
  the last run - then some other things could defer it.
 

  Another quick question, rather than do the rsync thing over ssh,
  i want to use a conf file someone on the list provided me to be
  placed in the pc's name dir... IE pc/localhost,   now do i name
  the file config.pl or the name of the directory/pc IE localhost.pl
  as i forst named it localhost.pl and it still used ssh rather than
  tar 
 
  Naming it config.pl in the pc/hostname directory will work.  This
  file only
  needs to contain the settings that differ from the global config.pl
  file.
 

 
 




Re: [BackupPC-users] Backup to localhost?

2006-08-03 Thread daniel berteaud
Well, it should do incremental backups, depending on the
configuration. You can verify the type of each backup in the cgi
interface. You can specify blackout periods in the per pc configuration
file so that backups only occur when you want them to. For example
(this is the default configuration):

$Conf{BlackoutPeriods} = [
    {
        hourBegin =>  7.0,
        hourEnd   => 19.5,
        weekDays  => [1, 2, 3, 4, 5],
    },
];

With this, backups won't occur (as long as the host is connected; and
since we are talking about localhost, it will always be connected) from
7:00 to 19:30 on weekdays (Monday to Friday). You can define several
blackout periods.



On Thu, 03 Aug 2006 11:29:57 -0400
Rob Morin [EMAIL PROTECTED] wrote:

 Another question, why does the backup seem to start at 11am? is there
 a place to alter this. my server(localhost) is now backing up
 itself and the load is now at 6.00, this makes other services slow on
 the machine
 
 But it should do just an incremental backup now , right since it did
 its first backup yesterday?
 
 Any help appreciated
 
 Thanks..
 
 Rob Morin
 Dido InterNet Inc.
 Montreal, Canada
 Http://www.dido.ca
 514-990-
 
 
 
 daniel berteaud wrote:
  On Wed, 02 Aug 2006 10:44:18 -0400
  Rob Morin [EMAIL PROTECTED] wrote:
 
 
  Well, I'm quite new to Linux and maybe other users will tell you you
  shouldn't give www-data sudo permission. Anyway, if you want
  backuppc to back up your localhost, the user who runs backuppc needs
  root permissions to reach some system files.

  I don't think adding these lines to sudoers introduces much of a
  security risk, because you give root permission only for the /bin/tar
  program (or /bin/rsync).

  Of course it would be better to run backuppc as a dedicated user. If
  you have problems using sperl, I saw that you can run a separate
  instance of apache under another user, specifically for backuppc.
  Maybe that would be more secure.
 
 
 
 

  I had to use the user www-data as the backup user , because there
  is a problem for me to use sperl on all my debian systems for some
  reason Now if i add www-data to sudoers( which my apache runs
  as) does this introduce a security issue?
 
  Thanks...
 
  Rob Morin
  Dido InterNet Inc.
  Montreal, Canada
  Http://www.dido.ca
  514-990-
 



Re: [BackupPC-users] Backup to localhost?

2006-08-02 Thread daniel berteaud
On Wed, 02 Aug 2006 10:44:18 -0400
Rob Morin [EMAIL PROTECTED] wrote:


Well, I'm quite new to Linux and maybe other users will tell you you
shouldn't give www-data sudo permission. Anyway, if you want backuppc
to back up your localhost, the user who runs backuppc needs root
permissions to reach some system files.

I don't think adding these lines to sudoers introduces much of a
security risk, because you give root permission only for the /bin/tar
program (or /bin/rsync).

Of course it would be better to run backuppc as a dedicated user. If
you have problems using sperl, I saw that you can run a separate
instance of apache under another user, specifically for backuppc. Maybe
that would be more secure.




 I had to use the user www-data as the backup user , because there is
 a problem for me to use sperl on all my debian systems for some
 reason Now if i add www-data to sudoers( which my apache runs as)
 does this introduce a security issue?
 
 Thanks...
 
 Rob Morin
 Dido InterNet Inc.
 Montreal, Canada
 Http://www.dido.ca
 514-990-
 
 
 
 daniel berteaud wrote:
  Hello, I worked on the integration of backuppc to the SME server
  distribution. I use it to save several hosts including the localhost.
  For the localhost, I use this per pc config:
 
  # start of the per pc config #
 
  $Conf{TarShareName} = ['/'];
 
  $Conf{BackupFilesExclude}=['/proc','/sys','/dev','/tmp','/home/e-smith/files/ibays/backup'];
 
  $Conf{XferMethod} = 'tar';
 
  $Conf{TarClientCmd} = '/usr/bin/sudo'
  . ' $tarPath -c -v -f - -C $shareName'
  . ' --totals';
 
  $Conf{TarFullArgs} = '$fileList';
 
  $Conf{CompressLevel} = 3;
 
  $Conf{TarIncrArgs} = '--newer=$incrDate $fileList';
 
  $Conf{TarClientRestoreCmd} = '/usr/bin/sudo'
 . ' $tarPath -x -p --numeric-owner --same-owner'
 . ' -v -f - -C $shareName+';
 
  # end of the per pc config #
 
  For this to work, you need to allow the backuppc user to run tar
  with sudo. I use this line in /etc/sudoers:
 
  backuppc ALL=(root) NOPASSWD:/bin/tar
 
  I hope this can help.
 
 
  On Tue, 01 Aug 2006 08:33:19 -0400
  Rob Morin [EMAIL PROTECTED] wrote:
 

  Hello all , i am new to this list
 
  I was looking around the archives for a method of backing up the 
  localhost. I could not find anything for a newbie i did come
  across a small email about using a conf file named localhost.pl
  that was suppose to be in the package, however i did not see it in
  my tarball...
 
  Can some one suggest hat i should do or point me to some docs...
 
  Thanks
 
  Have  a great day!
 
  Rob Morin
  Dido InterNet Inc.
  Montreal, Canada
  Http://www.dido.ca
  514-990-
 

