[BackupPC-users] Dump Databases

2014-11-20 Thread Fanny Alexandra Oyarzún Bórquez
Hi:

I have PostgreSQL and MySQL databases to back up, and I wonder whether it is
possible to run two scripts from DumpPreUserCmd, e.g.:
DumpPreUserCmd: $sshPath -q -x -l root $host
/usr/local/sbin/automysqlbackup.sh; $sshPath -q -x -l root $host
/usr/local/sbin/autopgsqlbackup.sh

Regards
--
Download BIRT iHub F-Type - The Free Enterprise-Grade BIRT Server
from Actuate! Instantly Supercharge Your Business Reports and Dashboards
with Interactivity, Sharing, Native Excel Exports, App Integration & more
Get technology previously reserved for billion-dollar corporations, FREE
http://pubads.g.doubleclick.net/gampad/clk?id=157005751&iu=/4140/ostg.clktrk
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Dump Databases

2014-11-20 Thread Carl Cravens
On 11/20/2014 07:03 AM, Fanny Alexandra Oyarzún Bórquez wrote:
 I have PostgreSQL and MySQL databases to back up, and I wonder whether it is
 possible to run two scripts from DumpPreUserCmd, e.g.:
 DumpPreUserCmd: $sshPath -q -x -l root $host
 /usr/local/sbin/automysqlbackup.sh; $sshPath -q -x -l root $host
 /usr/local/sbin/autopgsqlbackup.sh

What we do is call a single script named 'prebackup', which then runs all the 
scripts it finds in /usr/local/prebackup.d/ using GNU 'run-parts'.

#!/bin/bash

LOCKFILE=/var/lock/local-prebackup.lock

dotlockfile -l -p $LOCKFILE

run-parts --exit-on-error /usr/local/prebackup.d

dotlockfile -u $LOCKFILE
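For illustration (paths and filenames here are hypothetical, not our actual setup), populating the drop-in directory might look like this. Note that Debian-style run-parts skips filenames containing dots by default, so the wrappers get extension-less names:

```shell
# Hypothetical drop-in scripts for the prebackup.d scheme above.
# Debian-style run-parts ignores names with dots, so no ".sh" suffix.
mkdir -p /tmp/prebackup.d
printf '#!/bin/sh\nexec /usr/local/sbin/automysqlbackup.sh\n' > /tmp/prebackup.d/10-mysql
printf '#!/bin/sh\nexec /usr/local/sbin/autopgsqlbackup.sh\n' > /tmp/prebackup.d/20-pgsql
chmod +x /tmp/prebackup.d/10-mysql /tmp/prebackup.d/20-pgsql
ls /tmp/prebackup.d
```

The numeric prefixes give you a predictable run order, and `run-parts --test /tmp/prebackup.d` will list what would execute without running anything.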

-- 
Carl D Cravens (ra...@phoenyx.net)
Talk is cheap because supply inevitably exceeds demand.



Re: [BackupPC-users] Dump Databases

2014-11-20 Thread Fanny Alexandra Oyarzún Bórquez
OK, thank you very much, that has been helpful. :)

2014-11-20 10:54 GMT-03:00 Carl Cravens ccrav...@excelii.com:

 On 11/20/2014 07:03 AM, Fanny Alexandra Oyarzún Bórquez wrote:
  I have PostgreSQL and MySQL databases to back up, and I wonder whether it is
  possible to run two scripts from DumpPreUserCmd, e.g.:
  DumpPreUserCmd: $sshPath -q -x -l root $host
  /usr/local/sbin/automysqlbackup.sh; $sshPath -q -x -l root $host
  /usr/local/sbin/autopgsqlbackup.sh

 What we do is call a single script named 'prebackup', which then runs all
 the scripts it finds in /usr/local/prebackup.d/ using GNU 'run-parts'.

 #!/bin/bash

 LOCKFILE=/var/lock/local-prebackup.lock

 dotlockfile -l -p $LOCKFILE

 run-parts --exit-on-error /usr/local/prebackup.d

 dotlockfile -u $LOCKFILE

 --
 Carl D Cravens (ra...@phoenyx.net)
 Talk is cheap because supply inevitably exceeds demand.





Re: [BackupPC-users] Status of the wiki (was: Re: web GUI downloading BIN file Ubuntu 14.04)

2014-11-20 Thread Kris Lou
On Thu, Nov 13, 2014 at 8:31 AM, Holger Parplies wb...@parplies.de wrote:

 I'd volunteer to help with that (both putting information online and
 cleaning
 up).


As would I.


Kris Lou
k...@themusiclink.net


Re: [BackupPC-users] PIPE error

2014-11-20 Thread tschmid4
Correct, it's a question; here's another:
You are correct, the configuration was backing up a dir that was not there.
However, after removing the entry, it returns the same error.
What else would cause "unable to read 4 bytes"?



Terry


-Original Message-
From: Holger Parplies [mailto:wb...@parplies.de]
Sent: Tuesday, November 18, 2014 1:27 PM
To: General list for user discussion, questions and support
Subject: Re: [BackupPC-users] PIPE error

Hi,

tschmid4 wrote on 2014-11-18 14:43:37 + [Re: [BackupPC-users] PIPE error]:
 I've resolved all my backup issues but 1.
 A backup will start on this server and then end prematurely with the 
 following:
 Xfer log:

is that a question?

 [...]
 full backup started for directory /systems; updating partial #11 [...]
 Remote[1]: rsync: change_dir /systems failed: No such file or
 directory (2) [...]

You are trying to back up a directory which doesn't exist. Check your
configuration ($Conf{RsyncShareName}).

Regards,
Holger




Re: [BackupPC-users] PIPE error

2014-11-20 Thread Les Mikesell
On Thu, Nov 20, 2014 at 12:08 PM, tschmid4 tschm...@utk.edu wrote:

 Correct, it's a question; here's another:
 You are correct, the configuration was backing up a dir that was not there.
 However, after removing the entry, it returns the same error.
 What else would cause "unable to read 4 bytes"?


"Unable to read 4 bytes" is not the same error as:
 Remote[1]: rsync: change_dir /systems failed: No such file or directory (2)

"Unable to read 4 bytes" almost always means that your ssh keys are not
correct and you aren't starting the remote rsync at all. And since
that was working before, it shouldn't happen unless you changed more
than the target directory.
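A quick way to check is to run, by hand and as the backuppc user, the same ssh invocation BackupPC would use (the host name below is illustrative). If you get anything other than the expected output, a password prompt, a host-key question, an error, that is where the four missing bytes went:

```shell
# Build the ssh command BackupPC would run (host is hypothetical).
HOST=abc.domain.de
CMD="ssh -q -x -l root $HOST whoami"
echo "$CMD"
# Run $CMD interactively on the real server; it should print exactly
# "root" with no prompt of any kind.
```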

-- 
   Les Mikesell
 lesmikes...@gmail.com



[BackupPC-users] rsync out of memory

2014-11-20 Thread Christian Völker
Hi,

I decided to implement BackupPC at a second location as it is really
reliable and nearly perfect! ;)

Unfortunately, I'm getting errors when backing up one host. My debugging
gave the following results:

bash-4.1$ ./BackupPC_dump -v -f pdc.evs-nb.de
cmdSystemOrEval: about to system /bin/ping -c 1 -w 3 abc.domain.de
[...]
full backup started for directory /
started full dump, share=/
Running: /usr/bin/ssh -q -x -l root abc.domain.de /usr/bin/rsync
--server --sender --numeric-ids --perms --owner --group -D --links
--hard-links --times --block-size=2048 --recursive -x -z -v
--no-inc-recursive -v --ignore-times . /
Xfer PIDs are now 9933
xferPids 9933
Got remote protocol 30
Negotiated protocol version 28
Remote[2]: [sender] expand file_list pointer array to 524288 bytes, did move
Remote[2]: [sender] expand file_list pointer array to 1048576 bytes, did
move
Xfer PIDs are now 9933,9934
xferPids 9933,9934
  create d 555   0/04096 .
Out of memory!
Parent read EOF from child: fatal error!
Done: 0 files, 0 bytes
Got fatal error during xfer (Child exited prematurely)
[...]
CheckHostAlive: returning 0.156
Backup aborted (Child exited prematurely)
Not saving this as a partial backup since it has fewer files than the
prior one (got 0 and 0 files versus 0)
dump failed: Child exited prematurely

I started rsync manually with the same options as above and I monitored
the memory consumption on the sender. This time it succeeded:
total: matches=26625  hash_hits=2561895  false_alarms=35 data=5775632621

sent 2144313 bytes  received 3721731504 bytes  5878257.01 bytes/sec
total size is 5845056098  speedup is 1.57
[root@backup srv]#


rsync uses around 112M, which should not be too much, especially as the
host still has approx. 1G of free memory. top says:
 3620 root  20   0  112m 8036  952 R 69.5  0.2   3:17.46 rsync


As you can see, I already added the --no-inc-recursive option (to reduce
memory usage) and -x (do not cross filesystems) to the rsync command.
Still no go; rsync keeps telling me about out of memory.
I can't believe there are too many files on it, as I store all
other data except the OS itself on different partitions.
/dev/mapper/vg1_pdc-root  12G  5.5G  5.5G  50% /


Any other ideas how I can back up this server? I'm still wondering, as
this server is not a very uncommon setup, just a default OS.

Anyone having a clue?


Greetings

Christian








Re: [BackupPC-users] PIPE error

2014-11-20 Thread Holger Parplies
Hi,

Les Mikesell wrote on 2014-11-20 12:26:39 -0600 [Re: [BackupPC-users] PIPE 
error]:
 On Thu, Nov 20, 2014 at 12:08 PM, tschmid4 tschm...@utk.edu wrote:
 
  Correct, It's a question, here's another:
  You are correct, the configuration was backing up a dir that was not there.
  However, after removing the entry, it returns the same error.
  What else would cause 'unable to read 4 bytes?'

An error in your configuration would.

 Unable to read 4 bytes is not the same error as:
  Remote[1]: rsync: change_dir /systems failed: No such file or directory 
 (2)

In particular, "unable to read 4 bytes" is not the error we're talking about.
"Unable to read 4 bytes" was *eventually* resolved despite your (the OP's)
efforts. The same error is "Got fatal error during xfer (aborted by
signal=PIPE)". So why are you interested in further causes of "unable to read
4 bytes"? Doing a survey?

Hope that helps.

Regards,
Holger



Re: [BackupPC-users] rsync out of memory

2014-11-20 Thread Les Mikesell
On Thu, Nov 20, 2014 at 2:07 PM, Christian Völker chrisc...@knebb.de wrote:
  Got remote protocol 30
 Negotiated protocol version 28
 Remote[2]: [sender] expand file_list pointer array to 524288 bytes, did move
 Remote[2]: [sender] expand file_list pointer array to 1048576 bytes, did
 move
 Xfer PIDs are now 9933,9934
 xferPids 9933,9934
   create d 555   0/04096 .
Out of memory!
 Parent read EOF from child: fatal error!



 I started rsync manually with the same options as above and I monitored
 the memory consumption on the sender. This time it succeeded:
 total: matches=26625  hash_hits=2561895  false_alarms=35 data=5775632621


Native rsync 3.x can use protocol 30, which can handle the
directory tree incrementally. BackupPC's version uses protocol 28
(and forces the remote end to use it too), which must transfer the entire
tree first and hold it in RAM at both ends while they walk it doing
the comparisons.
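As a rough sanity check (the per-entry figure below is my guess for illustration, not an rsync-documented number), holding a protocol-28 file list in RAM is cheap until the file count gets very large:

```shell
# Back-of-the-envelope file-list memory for protocol 28.
# ~100 bytes per entry is an assumption, not a documented figure.
files=150000
per_entry=100
echo "$(( files * per_entry / 1024 / 1024 ))MB"   # prints "14MB"
```

So a list of a few hundred thousand files is only tens of megabytes; running out of memory usually points at either a vastly larger file count or at something else entirely.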

 sent 2144313 bytes  received 3721731504 bytes  5878257.01 bytes/sec
 total size is 5845056098  speedup is 1.57
 [root@backup srv]#


 rsync uses around 112M, which should not be too much. Especially as the
 host hast still approx 1G of free memory.  top is saying:
  3620 root  20   0  112m 8036  952 R 69.5  0.2   3:17.46 rsync

I'm not actually sure from your error message which end was out of
memory. How do things look on the target system while the BackupPC
backup runs?

 As you can see, I already added the --no-inc-recursive (reduce memory
 usage) and the -x (do not cross file systems) option to the rsync
 command.  Still no go. Rsync continues telling me about out of memory.
 I can't believe there should be too much files in it as I store all
 other data except the OS itself on different partitions.
 /dev/mapper/vg1_pdc-root  12G  5.5G  5.5G  50% /


 Any other ideas how I can backup this server? I'm still wondering as
 this server is not a very uncommon setup- just default OS.

 Anyone having a clue?

Memory use should relate to the number of files more than the total
size.  Do any directories have a huge number of tiny files?
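One way to hunt for that is a per-directory file count. The sketch below builds a throwaway tree so the numbers are predictable; on a real host you would point the loop at /, /usr, /var and so on, with find's -xdev mirroring rsync's -x:

```shell
# Count regular files per directory; on a real host, target / etc.
base=$(mktemp -d)
mkdir -p "$base/many" "$base/few"
for i in 1 2 3 4 5; do : > "$base/many/f$i"; done
: > "$base/few/f1"
for d in "$base"/*/; do
    printf '%s %s\n' "$(basename "$d")" "$(find "$d" -xdev -type f | wc -l)"
done
```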

-- 
   Les Mikesell
  lesmikes...@gmail.com



Re: [BackupPC-users] rsync out of memory

2014-11-20 Thread Holger Parplies
Hi,

Christian Völker wrote on 2014-11-20 21:07:59 +0100 [[BackupPC-users] rsync out 
of memory]:
 [...]
 Unfortunately, I'm getting errors when backing up one host. My debugging
 gave the following results:
 
 bash-4.1$ ./BackupPC_dump -v -f pdc.evs-nb.de
 cmdSystemOrEval: about to system /bin/ping -c 1 -w 3 abc.domain.de

so ClientNameAlias for pdc.evs-nb.de is abc.domain.de? ;-)

 [...]
 Running: /usr/bin/ssh -q -x -l root abc.domain.de /usr/bin/rsync
 --server --sender --numeric-ids --perms --owner --group -D --links
 --hard-links --times --block-size=2048 --recursive -x -z -v
 --no-inc-recursive -v --ignore-times . /
 [...]
 Anyone having a clue?

Yes. Remove the '-z' option. It won't work with File::RsyncP.

Regards,
Holger



Re: [BackupPC-users] rsync out of memory

2014-11-20 Thread Christian Völker
Hi Les,


 Native rsync 3.x versions can use protocol 30 which can handle the
 directory tree incrementally. Backuppc's version uses protocol 28 (and
 forces the remote to use that also) which must xfer the entire tree
 first and hold it in RAM at both ends while they walk it doing the
 comparisions. 

Is there a way to force protocol version 30 instead? Obviously with v28
the --no-inc-recursive option does not have any effect.
 I'm not actually sure from your error message which end was out of
 memory.  How do things look on the target system while the backuppc
 backup runs?
The host being backed up seems to run out of memory. It has 4GB, and while
rsync runs the page cache grows until only around 100M remains free,
though rsync itself does not use more than 112M.
Swapping appears, but I was not able to catch the exact moment when the
OOM occurs.

 Memory use should relate to the number of files more than the total
 size.  Do any directories have a huge number of tiny files?

Not really. See the output of find DIR -type f | wc -l for various dirs:
bin: 86
dev: 43
lib: 2999
lib64: 281
media/: 0
opt/: 0
root/: 21
selinux/: 0
store/: 0
tmp/: 1
boot/: 22
etc/: 1222
mnt/: 37
proc/: 37101
sbin/: 190
srv/: 44
sys/: 6831
usr/: 54306
var: 7176

The huge directory is not backed up at this stage because it is a
separate mount and rsync got the -x option. But for completeness:
home: 57949

Any further ideas?

Greetings

Christian





Re: [BackupPC-users] rsync out of memory

2014-11-20 Thread Christian Völker
Hi

 so ClientNameAlias for pdc.evs-nb.de is abc.domain.de? ;-)
You got me! Always the same with this security by obscurity thing! ;-)

 [...]
 Running: /usr/bin/ssh -q -x -l root abc.domain.de /usr/bin/rsync
 --server --sender --numeric-ids --perms --owner --group -D --links
 --hard-links --times --block-size=2048 --recursive -x -z -v
 --no-inc-recursive -v --ignore-times . /
 [...]
 Anyone having a clue?
 Yes. Remove the '-z' option. It won't work with File::RsyncP.

Got it! The backup is running fine now. Strange.
As I am backing up over a slow line, is there another option to
improve speed by using compression? I can obviously set the compression
parameter for ssh; is that a good idea?

Thanks for this great hint!

Greetings

Christian




Re: [BackupPC-users] rsync out of memory

2014-11-20 Thread Les Mikesell
On Thu, Nov 20, 2014 at 3:04 PM, Christian Völker chrisc...@knebb.de wrote:

 Got it! Backup is running fine now. Strange.
 As I am backing up through a slow line is there another option to
 improve speed by using compression? I can obviously set the compression
 parameter for ssh- is it a good idea?

I've never actually measured the difference, but I have added the -C
option after $sshPath in the RsyncClientCmd for remote targets and it
at least doesn't break anything.
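For a feel of what -C buys on a slow link (gzip below stands in for the zlib compression ssh uses): highly compressible data like SQL dumps shrinks dramatically, while already-compressed files would gain nothing:

```shell
# 100 kB of repetitive text, compressed roughly the way ssh -C would.
head -c 100000 /dev/zero | tr '\0' 'a' > /tmp/compressible.txt
gzip -c /tmp/compressible.txt | wc -c    # a few hundred bytes vs 100000
```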

-- 
   Les Mikesell
 lesmikes...@gmail.com
