Re: [BackupPC-users] No Files dumped for share C$ - NT_STATUS_INVALID_PARAMETER

2018-08-16 Thread Carl W. Soderstrom
Sorry I don't have any better advice... but is this working for any other
Windows machines?

Will it work if you try to back up a shorter path? Perhaps create a
directory called C:\testbackup and try to back that up?
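If it helps, that test can be expressed in the host's per-PC config. A sketch (the BackupFilesOnly syntax is from the BackupPC docs; the share and path here are just the test values suggested above):

```perl
# Hypothetical test config: back up only C:\testbackup from the C$ share
$Conf{XferMethod}      = 'smb';
$Conf{SmbShareName}    = ['C$'];
$Conf{BackupFilesOnly} = { 'C$' => ['/testbackup'] };
```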

On 08/16 02:04 , Christopher Diekkamp wrote:
> Hello dear backuppc users,
> 
> I'm struggling to set up BackupPC to back up a Windows 10 host via SMB.


-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com

--
Check out the vibrant tech community on one of the world's most
engaging tech sites, Slashdot.org! http://sdm.link/slashdot
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Large directory

2018-07-06 Thread Carl W. Soderstrom
On 07/06 02:38 , Bowie Bailey wrote:
> The problem is that the original backup took 2 weeks to fail with no
> indication of problems that I could see... it was just very slow.  I
> posted a previous question about it on this list while it was running. 
> I could not find any bottlenecks or problems.  I'm reluctant to start it
> again without some idea of what I'm looking for.  How would you suggest
> I go about collecting more info?  Up the log level in BPC?  Make rsync
> more verbose?
> 
> Right now, the only error I've seen is the error that stopped the backup:
> rsync error: error in rsync protocol data stream (code 12) at io.c(1556)
> [generator=3.0.9.12]
> 
> The main annoyance is that I have no way to track progress.  While the
> backup is running, I can't tell if it's about to finish, or if it's
> bogged down and is likely to take a few more hours (or days).


Do you have a preferred tool for tracking how much bandwidth is in use, and
which ports and hosts it's going to?

For this I use 'iftop' (or 'iftop -BP' to show output in bytes and to name
the ports). It's a good way to see if the backup is still moving data or if
it's hung up on something.
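When iftop isn't installed, a cruder sketch of the same check using only /proc and awk (Linux-specific; if an interface's RX byte counter grows between samples, the backup is still moving data):

```shell
# Print each interface's name and RX byte counter, twice, one second apart.
sample() {
    awk -F: '/:/ {gsub(/ /, "", $1); split($2, f, " "); print $1, f[1]}' \
        /proc/net/dev
}
sample
sleep 1
echo '---'
sample
```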

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Slow local backup

2018-06-22 Thread Carl W. Soderstrom
On 06/20 03:57 , Holger Parplies wrote:
> Carl W. Soderstrom wrote on 2018-06-14 16:03:24 -0400 [Re: [BackupPC-users] 
> Slow local backup]:
> > On 06/14 03:38 , Bowie Bailey wrote:
> > > On 6/14/2018 3:27 PM, Michael Stowe wrote:
> > > > Why are you using rsyncd over the loopback instead of … rsync?
> > > 
> > > Mainly because that's the way all of my other clients are being backed
> > > up [...]
> > 
> > I've always used tar for local backups. The advantage of rsync is greater in
> > bandwidth-constrained environments because it saves moving whole files over
> > the network. However, if the file needs to be read anyway to see if anything
> > has changed, then nothing is saved because the local machine is the same as
> > the remote machine.
> 
> well, mostly true. You still save copying large amounts of data from one
> process address space to another and possibly some context switches. While
> that may not make rsync *faster* than tar on local backups, it might mean
> it's not much slower. It probably depends on your setup. And it probably
> has low enough impact not to worry about it.

Thanks for the responses on this, you have some very good knowledge which
I've managed to forget over the years.

I remember once a long time ago, comparing the speed of an initial backup
and finding that tar was 3x-4x faster than rsync for the first data
transfer. That showed me how much overhead there was in rsync, and that it
may be better to go with tar for at least some backup cases.

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Are Files Re-Copied In Pool On Reinstall of Backup Client Operating System

2018-06-19 Thread Carl W. Soderstrom
On 06/19 03:18 , Akibu Flash wrote:
> So, does the BackupPC pool keep the original backed-up files indefinitely, or 
> is there a period of time after which it deletes the files, the theory being 
> it doesn't see that Windows client anymore and thus the pool files aren't 
> needed anymore.  For example, if my Windows machine is offline for a
> couple of weeks while I am reinstalling the operating system, will BackupPC 
> delete the files in the interim?  Thanks.

It keeps old files until the backup containing them gets expired by newer 
backups.

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Slow local backup

2018-06-14 Thread Carl W. Soderstrom
On 06/14 03:38 , Bowie Bailey wrote:
> On 6/14/2018 3:27 PM, Michael Stowe wrote:
> > Why are you using rsyncd over the loopback instead of … rsync?
> >
> 
> Mainly because that's the way all of my other clients are being backed
> up and what I found when I searched for how to back up a local
> filesystem said to do it the same way as the others and just point it to
> localhost.  I use rsyncd rather than rsync to avoid the ssh overhead.  I
> expected a backup done via the loopback interface to be fast since it
> doesn't have the normal networking bandwidth limitations.

I've always used tar for local backups. The advantage of rsync is greater in
bandwidth-constrained environments because it saves moving whole files over
the network. However, if the file needs to be read anyway to see if anything
has changed, then nothing is saved because the local machine is the same as
the remote machine.

I may be incorrect about some of my understanding here; I know rsync does a
few things which tar does not, but they slip my mind at the moment. Also,
some uses of rsync may be more efficient than this by only checking
timestamps.

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Slow local backup

2018-06-14 Thread Carl W. Soderstrom
FWIW, here's my localhost.pl file:

#
# Local server backup of /etc as user backuppc
#
$Conf{XferMethod} = 'tar';

#for some reason pings fail
$Conf{PingCmd} = '/bin/true';


$Conf{BackupFilesExclude} = ['/proc', '/sys', '/var/lib/backuppc', '/var/log',
    '/tmp', '/var/tmp', '/staff', '/mnt'];

#$Conf{TarShareName} = ['/etc', '/var/lib/backuppc/.ssh/', '/root'];
$Conf{TarShareName} = ['/'];

$Conf{TarClientCmd} = '/usr/bin/env LC_ALL=C /usr/bin/sudo $tarPath -c -v -f - -C $shareName'
    . ' --totals';

# turning off compression on these files, so they can be recovered without
# backuppc.
# wouldn't make sense to need your backup server, 
# in order to recover your backup server, now would it?
$Conf{CompressLevel} = 0;

# do backups anytime
$Conf{BlackoutPeriods} = [];

# remove extra shell escapes ($fileList+ etc.) that are
# needed for remote backups but may break local ones
#$Conf{TarFullArgs} = '$fileList';
#$Conf{TarIncrArgs} = '--newer=$incrDate $fileList';
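Before wiring a command like the TarClientCmd above into BackupPC, it can help to dry-run the tar side against a scratch directory. This sketch drops sudo and the real share (which the actual config still needs) and uses plain 'tar' as a stand-in for $tarPath:

```shell
# Exercise the same tar flags on a throwaway directory;
# --totals prints a byte count on stderr, much like the real XferLog.
tmp=$(mktemp -d)
echo test > "$tmp/file"
/usr/bin/env LC_ALL=C tar -c -v -f "$tmp.tar" -C "$tmp" . --totals
rm -r "$tmp" "$tmp.tar"
```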

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Out of inodes, how to recover?

2018-05-24 Thread Carl W. Soderstrom
On 05/24 05:21 , Holger Parplies wrote:
> The BackupPC daemon remembers the correct usage, and tends to ensure that
> BackupPC_nightly is not run concurrently with BackupPC_link (or whatever
> restrictions may apply for your version of BackupPC).
> 
>   BackupPC_serverMesg BackupPC_nightly run
> 
> appears to be the correct invocation to *let the BackupPC server* run
> BackupPC_nightly as soon as it is safe to do so. You will need to run
> that as the backuppc user.

Ah, yes, thank you for the reminder. It's been a few years since I had to do
this.

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Out of inodes, how to recover?

2018-05-15 Thread Carl W. Soderstrom
On 05/15 02:14 , Tapio Lehtonen wrote:
> Question: The host is already out of inodes on the backuppc partition. Can it
> still remove the old backups now obsolete due to the lowered FullKeepCnt, so
> that old files are removed and inodes made available? I assume this happens
> during the nightly cleanup runs, so I have to wait until the next day to find out.


It is possible to run BackupPC_nightly by hand. Here's the help for it:

backuppc@archivist-2:~$ /usr/share/backuppc/bin/BackupPC_nightly --help
/usr/share/backuppc/bin/BackupPC_nightly version [unknown] calling
Getopt::Std::getopts (version 1.07 [paranoid]),
running under Perl version 5.18.2.

Usage: BackupPC_nightly [-OPTIONS [-MORE_OPTIONS]] [--] [PROGRAM_ARG1 ...]

The following single-character options are accepted:
Boolean (without arguments): -m

Options may be merged together.  -- stops processing of options.
  [Now continuing due to backward compatibility and excessive paranoia.
   See 'perldoc Getopt::Std' about $Getopt::Std::STANDARD_HELP_VERSION.]
usage: /usr/share/backuppc/bin/BackupPC_nightly [-m] poolRangeStart poolRangeEnd


and here's an example usage (from memory, so this may be wrong):
/usr/share/backuppc/bin/BackupPC_nightly 0 255


-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] migration of backupc env between machines

2018-05-14 Thread Carl W. Soderstrom
On 05/14 07:21 , daggs wrote:
> Greetings Carl,
> 
> > Sent: Sunday, May 13, 2018 at 11:58 PM
> > From: "Carl W. Soderstrom" <carl.soderst...@real-time.com>
> > To: "General list for user discussion, questions and support" 
> > <backuppc-users@lists.sourceforge.net>
> > Subject: Re: [BackupPC-users] migration of backupc env between machines
> >
> > Are you planning on moving your SSH host keys as well?
> not sure what you mean by "moving my ssh host keys"; can you elaborate a
> bit more on this please?

Are you doing your backups over SSH (rsync over ssh), or some other
mechanism (tar, smb)? If it's over SSH, you'll need to copy (at least) the
private side of the ssh key from ~backuppc/.ssh/. I'm just mentioning it to
be sure you don't forget it.
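After copying, it's also worth confirming the private key kept its restrictive mode, since ssh refuses group- or world-readable private keys. A sketch (the path is the Debian default for the backuppc user's home; adjust for your distro):

```shell
# List any non-public key files under backuppc's .ssh that are readable
# by group or other (-perm /077 is GNU find syntax for "any of these bits").
find /var/lib/backuppc/.ssh -type f ! -name '*.pub' -perm /077 2>/dev/null
```

Anything this prints should get `chmod 600` before the first backup run.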

> > How are you planning on migrating your BackupPC data pool?
> my config.pl has these two entries:
> $Conf{TopDir} = '/mnt/backup';
> $Conf{ConfDir} = '/etc/BackupPC';
> 
> /mnt/backup is a mount of a external drive, e.g. all my data sits on an 
> external drive.
> assuming I'm not missing anything, all the data pool is movable. am I wrong?

These two things (and the SSH keys) should suffice, IIRC. If you can move
the external drive from one computer to another, that will give you a quick
and easy data migration from one computer to the other.

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] migration of backupc env between machines

2018-05-13 Thread Carl W. Soderstrom
Are you planning on moving your SSH host keys as well?
How are you planning on migrating your BackupPC data pool?

On 05/12 07:35 , daggs wrote:
> Greetings, I'm in the process of migrating my backup env from gentoo to 
> debian.
> The version is the same on both sides. I wanted to verify: all I need to do 
> is migrate the actual info and the hosts file?
> I've also migrated the user and password in config.pl.
> Is there any other thing I need to migrate?

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] about migrate to a new server

2018-05-10 Thread Carl W. Soderstrom
I've never worked with v4 yet, so someone else will have to answer this.

On 05/10 09:37 , Luigi Augello wrote:
> It is V3 and the new backuppc server has V4.
> I cannot keep the old server; I will return it because it was leased.
> What is the solution? Can I convert V3 to V4 and then make the
> migration to the new server?

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] about migrate to a new server

2018-05-10 Thread Carl W. Soderstrom
Is this BackupPC v3 or v4?
With v3, the number of hardlinks makes it impractical to copy data from one
machine to another (you can do it if you dd the partition to a partition on
the new machine, but don't try a file-level copy unless you use some
specialized scripts and really know what you're doing).

It's generally easiest to just set up a new server and keep the old one
around as a backup until the data can be considered 'expired'. Much simpler
and more reliable than trying to migrate data.

For BackupPC v4 I have no experience.

On 05/10 08:27 , Luigi Augello wrote:
> As per the subject, I need to migrate user data from an old server to a new
> server. Is it right to copy the data directory from the old server into the
> new one, or will I have problems with the compressed data?


-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Successful Backup ?

2018-04-24 Thread Carl W. Soderstrom
On 04/24 09:03 , Stefan Denzinger wrote:
> tarExtract: Done: 0 errors, 972441 filesExist, 867594590134 sizeExist, 
> 623807047182 sizeExistComp, 1016358 filesTotal, 867740315048 sizeTotal
> 
> Got fatal error during xfer (tar:712  Total bytes received: 867730506892)
> 
> Backup aborted (tar:712  Total bytes received: 867730506892)
> 
> Saving this as a partial backup
> 
> So what does this mean for me? Is the full backup now successful or not?
> When I try to start an incremental backup, it always starts a new full, and
> after 8 hours the same error comes up. But I want to have new increments..
> Does anybody know my mistake?


Is there a reason you're using the 'tar' backup method instead of 'rsync'?
Rsync tends to work better in most circumstances.

To debug this, you can try looking at the error log. On the BackupPC page
for each host you back up, there is a 'XferLog' link for each backup. You
can click on that link and look at the errors generated as the backup runs. 

Alternatively, try running the BackupPC_dump command by hand.
1. become the 'backuppc' user
2. run '/usr/share/backuppc/bin/BackupPC_dump -f -v host.example.com'
3. watch the output scroll by, and see where it stops. This may be where
your problem is localized.

It's likely that the backup is stumbling over a named pipe, socket file,
symlink loop, or some other artifact on the filesystem which needs to be
excluded.
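One way to hunt for such artifacts before the next attempt is with find. A sketch (-xdev keeps it from crossing into other mounted filesystems; run it against the share root you're backing up):

```shell
# List sockets (-type s) and named pipes (-type p) under the share root;
# whatever turns up is a candidate for $Conf{BackupFilesExclude}.
find / -xdev \( -type s -o -type p \) 2>/dev/null
```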


-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Getting data from cpool

2018-04-11 Thread Carl W. Soderstrom
I'm impressed. Good job!
I didn't know you were just looking for a few individual files; I was
expecting you needed to restore an entire machine.
What backup solution are you going to in the future?

On 04/11 11:46 , anonymous1...@cock.li wrote:
> Dear Carl,
> 
> Thank you very much for your reply that clarified the situation.
> 
> Yes, pc and pool folders are missing due to for me unknown reason.
> 
> I've written a small script that unpacks the files through zlib and maps
> them to description that "file" unix command can deliver.
> 
> That way I could sort out Word Documents and PDF files out there and
> required documents were found.

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Getting data from cpool

2018-04-10 Thread Carl W. Soderstrom
On 04/09 05:59 , Michael Huntley wrote:
> Here are the files I got with a message "it's our backup":
> 
> config
> cpool
> lock.roster
> lost+found
> 
> What is the easiest way (if any :-) to extract a browsable file structure
> out of this?


So there's no 'pc' directory with trees of directories, one for each machine
which has been backed up?

The pc/ directory tree stores the mappings of which file goes where on
which machine. The cpool/ directory tree stores the data under hashed
names which are used for deduplication; files are hardlinked from one
tree to the other, and you need both in order to figure out what goes where.

(There may be a pool/ directory as well, which stores the data in the same
way as cpool/, but in uncompressed format).

Without the pc/ directory you're in trouble.


-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Import existing Backups from rsnapshot into BackupPC

2018-03-09 Thread Carl W. Soderstrom
On 03/09 10:56 , Zelislav Rudolf Slamaj wrote:
> We are moving from our rsnapshot Backups to BackupPC4.x. I have a question:
> is there a possibility we can /import/ our existing backups into BackupPC?
> Because we don't want to switch backups and we can't start by backing up all
> our servers at once. There are quite a lot of them and would take too much
> time.


The best thing to do is probably the same as what's recommended when
setting up a replacement BackupPC server: once backups are working on the
new server, stop doing backups on the old one. Then keep the old server
around until its backups would have expired.

Due to the many files involved in a backup, it usually takes a tremendous
amount of time to move them all from one machine to another, even if it
could be done efficiently. So if possible it's better to avoid that.

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Best way to copy backuppc data

2018-02-06 Thread Carl W. Soderstrom
On 02/06 08:51 , Adam Pribyl wrote:
> On Mon, 5 Feb 2018, Carl W. Soderstrom wrote:
> 
> > On 02/04 07:48 , Raoul Bhatia wrote:
> > > What's the issue with using LVM?  Unless you need to reinitialize
> > > the whole fs, i.e. increase EXT4 inode count or switch to another
> > > fs, believe this is a perfect example of where LVM shines.
> > 
> > I'd like to insert a word of caution based on experience. I don't know about
> > using LVM for copying data to a new disk; but I do know that using LVM
> > snapshots to get a quiesced pool for backup, was a poor idea in my case. It
> > took 12h to get a tar dump of a given BackupPC data pool when BPC was
> > fully stopped, but 2 days to try to backup the same data pool when it was
> > quiesced using an LVM snapshot.
> 
> This is the thing - I was thinking about snapshots too; luckily I did not
> use them. To make a 1:1 copy you may use the LVM mirror feature, which is a
> bit new. I used "pvmove" - that _moved_ the data from one disk to another
> fine, but of course it is not a copy. I wanted to separate the backuppc data
> from the /var mount point onto its own disk mounted at /var/lib/backuppc -
> that was the reason to make a copy, but as it failed I changed my approach.

Thanks for the information. I haven't looked at LVM for many years. 

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Best way to copy backuppc data

2018-02-05 Thread Carl W. Soderstrom
On 02/04 07:48 , Raoul Bhatia wrote:
> What's the issue with using LVM?  Unless you need to reinitialize the whole
> fs, i.e. increase the EXT4 inode count or switch to another fs, I believe
> this is a perfect example of where LVM shines.

I'd like to insert a word of caution based on experience. I don't know about
using LVM for copying data to a new disk, but I do know that using LVM
snapshots to get a quiesced pool for backup was a poor idea in my case. It
took 12h to get a tar dump of a given BackupPC data pool when BPC was
fully stopped, but 2 days to back up the same data pool when it was
quiesced using an LVM snapshot.

We decided it was better to have BPC offline for a short period (scheduled
on the weekend), but running full speed the rest of the time, rather than
degraded in performance for 4x as long.

Not fully relevant to the problem at hand, but still worth considering for
future reference.

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Best way to copy backuppc data

2018-02-05 Thread Carl W. Soderstrom
On 02/03 10:30 , Les Mikesell wrote:
> On Sat, Feb 3, 2018 at 5:50 AM, Adam Pribyl  wrote:
> > On Fri, 2 Feb 2018, Iturriaga Woelfel, Markus wrote:
> >
> >
> > I am running short on ideas how to copy this.
> 
> If you are copying to an identical disk you can use dd on the raw
> devices. The system may be confused by the identical UUIDs if you
> have both drives connected when you reboot, though.

Even if it's not an identical disk, as long as the target disk is larger, you
can use dd to copy the data pool partition, then grow your filesystem
afterwards. I've used this successfully in the past on a few occasions. dd is
definitely the fastest way I've found to copy a BackupPC data pool (though I
haven't experimented with ZFS or btrfs).
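For reference, that dd-then-grow sequence might look like the sketch below. The device names are placeholders (triple-check them; dd will happily overwrite the wrong disk), and resize2fs assumes ext2/3/4; the runnable part below demonstrates the copy on scratch files so nothing real gets touched:

```shell
# On real disks this would be (sdX1 = old pool partition, sdY1 = larger target):
#   dd if=/dev/sdX1 of=/dev/sdY1 bs=64M status=progress
#   e2fsck -f /dev/sdY1 && resize2fs /dev/sdY1
# Demonstrated here on scratch files so nothing gets overwritten:
src=$(mktemp); dst=$(mktemp)
head -c 65536 /dev/urandom > "$src"
dd if="$src" of="$dst" bs=4k 2>/dev/null
cmp -s "$src" "$dst" && echo "copy verified"
rm -f "$src" "$dst"
```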

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Backup running and ignoring blackout time.

2018-01-22 Thread Carl W. Soderstrom
On 01/22 09:29 , Zielke, Julian, NLI wrote:
> First of all, the weekDays variable says "1,2,3,4,5,6,7". So why is Sunday 
> cut off here?

Try numbering 0-6, with '0' for Sunday.
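For reference, the stock example from config.pl uses that numbering (a sketch; hours are decimal, and weekDays runs 0=Sunday through 6=Saturday):

```perl
# Skip backups 7:00-19:30 on weekdays; Sunday would be 0, not 7
$Conf{BlackoutPeriods} = [
    { hourBegin => 7.0, hourEnd => 19.5, weekDays => [1, 2, 3, 4, 5] },
];
```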

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] File System containing backup was too full

2017-12-18 Thread Carl W. Soderstrom
On 12/18 03:15 , Adrien Coestesquis wrote:
> But today I have 4.5TB left and this is sufficient to make backups with my
> retention configuration. So why does backuppc complain about this? How is 63%
> (today's disk utilisation) greater than 95%?

What is the output of 'df -i'? Could it be that you have 95% of your inodes
used? BackupPC is very hungry for inodes, so ext[2,3,4]fs are generally not
the best choice.
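For example (the mount point here is the Debian default; the IUse% column is the one to watch):

```shell
# Show inode totals and percentage used for the pool filesystem;
# fall back to / if the BackupPC mount point doesn't exist on this box.
df -i /var/lib/backuppc 2>/dev/null || df -i /
```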

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Create Archive of old backup

2017-12-11 Thread Carl W. Soderstrom
On 12/11 06:49 , Gerald Brandt wrote:
> I have a request from Management to create an archive from a backup I did in
> 2015. Is there a way to archive an old backup?

The easiest way may be to use BackupPC_tarCreate to generate a tar stream of
the backup in question and redirect it to a tarfile.

Something like:
/usr/share/backuppc/bin/BackupPC_tarCreate -n  -h
host.example.tld -s / / > archive_of_host_example_tld-2015-xx-xx.tar

Have you used BackupPC_tarCreate before? Or do you need more help with the
command syntax?

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com
