On Tue, Mar 8, 2022 at 8:18 AM G.W. Haywood via BackupPC-users
wrote:
> It isn't clear to me exactly what you're doing here, but I thought it
> might be worth a mention that if you back up files from Linux box A
> and restore them to Linux box B you need to be a bit careful about the
> owner and group
Hi there,
On Tue, 8 Mar 2022, ralph strebbing wrote:
what we're trying to do is get the ACLs working from the Archive
host. It is correctly exporting the tar.gz files, but when I extract
them onto my PC, the files have lost their ownership/group ...
It isn't clear to me exactly what you're doing
In the past, rather than use the BackupPC archive function, I used rsync from
the command line to back-up my Backuppc server to a removable USB drive on a PC
on my network. Don't know if that will work for you, but it may be worth
exploring.
Paul Herron
p...@october11th.com | +1 (202) 413-2
On Mon, Mar 7, 2022 at 5:02 PM wrote:
>
> I'm not sure that tarCreate or archiveHost respect ACLs.
> I know that rsync restore does work...
Right, and that's what we're using right now to backup the Hosts. But
now we need to make our offline and offsite backups. So the BackupPC
localhost is now se
I'm not sure that tarCreate or archiveHost respect ACLs.
I know that rsync restore does work...
ralph strebbing wrote at about 15:53:24 -0500 on Monday, March 7, 2022:
> Hi again,
>
> So after updating to BackupPC 4.x, ACLs work for rsync (yay). Now what
> we're trying to do is get the ACLs wo
Hi again,
So after updating to BackupPC 4.x, ACLs work for rsync (yay). Now what
we're trying to do is get the ACLs working from the Archive host. It
is correctly exporting the tar.gz files, but when I extract them onto
my PC, the files have lost their ownership/group properties and ACLs.
I attemp
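For what it's worth, plain GNU tar only round-trips this metadata under specific conditions, none of them BackupPC-specific: ownership is restored only when extracting as root, permission bits need `-p` at extract time, and POSIX ACLs are stored only if `--acls` was passed at create time (and the tar build has ACL support). A small sketch of the mode-preserving part, with invented paths:

```shell
# Demo: -p at extract time restores the stored permission bits.
mkdir -p /tmp/acl_src /tmp/acl_dst
echo data > /tmp/acl_src/secret.txt
chmod 640 /tmp/acl_src/secret.txt
# Add --acls on both the create and extract side if your tar supports it
# and you need POSIX ACLs:  tar --acls -cpf ...  /  tar --acls -xpf ...
tar -C /tmp/acl_src -cpf /tmp/acl_demo.tar .
tar -C /tmp/acl_dst -xpf /tmp/acl_demo.tar
stat -c '%a' /tmp/acl_dst/secret.txt      # prints 640
```

Ownership (`--same-owner`) is implied only when tar runs as root, which is likely why files extracted onto a desktop PC as a normal user come out owned by that user.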
Mtime won't be a reliable indication of what's in an incremental backup.
First, other metadata might have changed (e.g., rsync checks all file metadata,
including permissions, size, etc.) that would cause the file to be
transferred during the incremental. You could look at the XferLOG file to
parse w
Hello,
Archive of incremental backups (only the deltas) has not been discussed in a long
time. One of the proposed options was to create a tarball of pc/host/BackupNum;
however, this requires some gymnastics to correctly interpret the paths. Has
anyone come up with an elegant solution to this problem?
No, this isn't supported. There are tools (like BackupPC_tarCreate) in
BackupPC that could be used with external tools or scripts to write/manage
tapes.
Or you could run something like Bacula in parallel with BackupPC if you
really want tapes.
Craig
On Thu, Oct 12, 2017 at 11:08 AM, Gandalf Cor
Hi to all
Is it possible to use the BackupPC archive function to make tar archives on a tape
library?
And will BackupPC be able to change tapes automatically?
Ulrich,
As Kris suggested, you could use a script to create a copy of the latest
backup tree.
For example, you could use a command like this (replace HOST, SHARENAME and
/PATH/TO/USB/DRIVE) to extract the most recent backup of share SHARE on
host HOST:
BackupPC_tarCreate -h HOST -n -1 -s SHARENA
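For the "no tarball on disk" goal, the output of BackupPC_tarCreate can be piped straight into a tar extraction, so the destination drive ends up with a plain directory tree. The producer below is stand-in plain tar, since BackupPC_tarCreate needs a live server; the pipe shape is the same and the paths are invented:

```shell
# Demo of the pipe: stream a tar out of one tree straight into another,
# with no intermediate tarball ever touching the disk.
mkdir -p /tmp/bpc_share /tmp/usb_demo
echo hello > /tmp/bpc_share/file.txt
tar -C /tmp/bpc_share -cf - . | tar -xpf - -C /tmp/usb_demo
cat /tmp/usb_demo/file.txt                # prints hello
```

With a real server the left side would be something like `BackupPC_tarCreate -h HOST -n -1 -s SHARENAME . | tar -xpf - -C /PATH/TO/USB/DRIVE` (same placeholders as above).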
Hi,
Actually, when I made the archive, BackupPC created 9 files with the
extension .tar.gz.
What I want is 9 directories, each containing the archived files
decompressed.
If I ever need a recovery, I would like to be able to plug in the USB drive
and navigate to the different files without having to uncompress
>
> To archive the last backup of each host, i would like to copy the last
> extracted version of each host to a usb drive without using tar.
> I would like on the usb drive (to restore easily a file if necessary)
> that the drive can be used even without backuppc with for each host the
> directory
Hi,
I use BackupPC version 4, and I'm very happy.
It is the best tool I have used to make my backups.
To archive the last backup of each host, I would like to copy the last
extracted version of each host to a USB drive without using tar.
I would like the USB drive (to restore a file easily if necessary
On Fri, 9 Dec 2016 14:01:01 +0100
Peter Viskup wrote:
> Dear all,
Dear alone,
> would like to ask whether it would be possible to use BackupPC to
> store encrypted archives of sensitive directories from the clients
> *only*.
The easiest way I found to do so is to have each client encrypt it
Dear all,
I would like to ask whether it would be possible to use BackupPC to
store encrypted archives of sensitive directories from the clients
*only*.
We have a server with ssh connections to the other clients.
We need to create GPG-encrypted archives of some directories from all clients.
We do not
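One way to get this effect is to do the tar-and-encrypt on the server side and never store plaintext. The sketch below uses symmetric encryption so it is self-contained; in practice a recipient key (`gpg -r KEYID --encrypt`) would presumably be used instead, and the paths are invented:

```shell
# Demo: tar + gzip a directory, encrypting the stream with GPG.
mkdir -p /tmp/sensitive && echo secret > /tmp/sensitive/notes.txt
tar -C /tmp -czf - sensitive \
  | gpg --batch --yes --pinentry-mode loopback --symmetric \
        --passphrase demo -o /tmp/sensitive.tgz.gpg
# Round trip: decrypt and list the archive contents.
gpg --batch --quiet --pinentry-mode loopback --decrypt \
    --passphrase demo /tmp/sensitive.tgz.gpg | tar -tzf -
```

Note `--pinentry-mode loopback` is needed on GnuPG 2.1+ for a non-interactive passphrase; with a recipient key that flag and the passphrase go away entirely.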
Thank you, yes, it's really there...
>Hello. Ubuntu 12.04/Backuppc 3.3.0. Trying to create archive host for
>removable media. Host created OK in "Edit Hosts". Host appeared in "Hosts with
>no Backups" in the "Host Summary". After changing transfer method to "Archive"
>and setting backup destination, host just disappears from Host
Hello. Ubuntu 12.04/Backuppc 3.3.0. Trying to create archive host for removable
media. Host created OK in "Edit Hosts". Host appeared in "Hosts with no
Backups" in the "Host Summary". After changing transfer method to "Archive" and
setting backup destination, host just disappears from Host Summary
Stefan Peter wrote on 11/18/2012 04:55:49 PM:
> On 18.11.2012 20:44, Till Hofmann wrote:
> > But now, when I'm trying to archive more clients at once, the
> process receives ALRM after exactly 20 hours.
> >
> > 2012-11-09 19:54:29 Starting archive
> > 2012-11-10 15:54:29 cleaning up after signal
Thank you for the hint, I totally forgot about the timeout setting! Now it
finished archiving properly.
On Sun, Nov 18, 2012 at 10:55 PM, Stefan Peter wrote:
> On 18.11.2012 20:44, Till Hofmann wrote:
> > But now, when I'm trying to archive more clients at once, the process
> receives ALRM after
On 18.11.2012 20:44, Till Hofmann wrote:
> But now, when I'm trying to archive more clients at once, the process
> receives ALRM after exactly 20 hours.
>
> 2012-11-09 19:54:29 Starting archive
> 2012-11-10 15:54:29 cleaning up after signal ALRM
> 2012-11-11 03:16:38 Archive failed (aborted by si
Hey,
I'm having problems trying to archive. I use archive with gzip and I write to a
locally mounted disk (ext4). The disk has enough free space.
When I set up the archive, I tried to archive a single client, which worked
perfectly.
But now, when I'm trying to archive more clients at once, the p
I'm running an archive job of 4 hosts which is around 180GB. The
archive takes about 150 minutes and that works out to around 20MB/sec.
I'm running to /tmp on the same drive hosting the pool. The machine
is dedicated to backuppc and no other backup jobs are running during
the archive. I can move
On Sat, Aug 25, 2012 at 12:52 PM, Trey Dockendorf wrote:
> I've been using BackupPC for many years now, and am now for the first
> time going to be using the Archive features. For me the archives will
> not be for off-site or remote storage, but a sort of "out-of-band"
> copy of the backups. The
I've been using BackupPC for many years now, and am now for the first
time going to be using the Archive features. For me the archives will
not be for off-site or remote storage, but a sort of "out-of-band"
copy of the backups. The idea being that if BackupPC is down, and all
that is available is
On Tue, Jun 19, 2012 at 9:45 AM, Timothy J Massey wrote:
>
> > I think the problem is that your rsync module name doesn't match the
>
> > mountpoint. I don't know if that is even possible for '/'. But the
> > error message still does not make sense to me.
> >
> >
> > I can not use / as rsyncd m
Steve Kieu wrote on 06/19/2012 01:14:46 AM:
> I think the problem is that your rsync module name doesn't match the
> mountpoint. I don't know if that is even possible for '/'. But the
> error message still does not make sense to me.
>
>
> I can not use / as rsyncd module name. So I must use
I think the problem is that your rsync module name doesn't match the
> mountpoint. I don't know if that is even possible for '/'. But the
> error message still does not make sense to me.
>
>
I can not use / as rsyncd module name. So I must use something else.
I would like to emphasize that usin
On Mon, Jun 18, 2012 at 6:40 PM, Steve Kieu wrote:
>
> Directory name (rootfs) is generated by backuppc - I think it is based on
> the rsyncd module name (if I named it differently then the foldername is
> changed as well).
>
> When doing restore and select download as Tar archive, all is fine, t
> III days, but really, what else does your CPU have to do during the
> backup time? You are mostly disk bound anyway - unless maybe you are
> using IDE or USB drives that need CPU for the I/O work.
>
>
We have a server (nagios) that constantly has a load of around 15 to 20 -
and only two cores. Before
On Mon, Jun 18, 2012 at 7:50 PM, Shawn Carroll wrote:
>
>> Because I do not want to spend cpu time on ssh - And why not rsyncd
>> for local LAN. It should be done that way when encryption is not
>> needed.
>
> Does anyone choose to deal with this by simply specifying no encryption as an
> ssh opt
> Does anyone choose to deal with this by simply specifying no encryption as
> an ssh option?
>
I have heard that ssh no longer supports the cipher=none option, but I will
recheck - last time it did not work for me
>
> Shawn
>
>
> --
> > When the target is a linux system, why not just use rsync over ssh?
>
>
>
>
> Because I do not want to spend cpu time on ssh - And why not rsyncd
> for local LAN. It should be done that way when encryption is not
> needed.
Does anyone choose to deal with this by simply specifying no encry
> under frootfs folder.
>
> When the target is a linux system, why not just use rsync over ssh?
>
>
Because I do not want to spend cpu time on ssh - And why not rsyncd for
local LAN. It should be done that way when encryption is not needed.
Backuppc hardlinks all identical files for its own s
On Sat, Jun 16, 2012 at 6:52 AM, Steve Kieu wrote:
>
>> What is the ./rootfs/ directory, and why is that not the place it is
>> trying to write? Do you actually have a hardlinked structure like
>> that on the backup target?
>>
>
> I use the rsyncd and modules name for path = / is rootfs . Then b
> What is the ./rootfs/ directory, and why is that not the place it is
> trying to write? Do you actually have a hardlinked structure like
> that on the backup target?
>
>
I use rsyncd, and the module name for path = / is rootfs. Then I back up the
whole root (with some excludes, of course) using BackupPC
On Thu, Jun 14, 2012 at 8:35 PM, Steve Kieu wrote:
>
> tar: ./rootfs/sbin/tune2fs: Cannot hard link to `sbin/e2label': No such file
> or directory
>
>
> Obviously it is reproducible here.
What is the ./rootfs/ directory, and why is that not the place it is
trying to write? Do you actually have a
Hello
That has always been filled for me.
>
> > For now I have to use the restore options and download as tar archive,
> which
> > works.
>
> I think both approaches should do the same thing. And the same for
> running Backuppc_tarCreate from the command line. Not sure where to
> start to debug
On Tue, Jun 12, 2012 at 6:05 PM, Steve Kieu wrote:
>
>> I don't think I've ever seen that. Is there some simple way to reproduce
>> it?
>>
>
>
> Yes just choose archive host and select a host and archive it, choose tar gz
> as format, and type the path to the file. Then move the file to some othe
> I don't think I've ever seen that. Is there some simple way to reproduce
> it?
>
>
Yes just choose archive host and select a host and archive it, choose tar
gz as format, and type the path to the file. Then move the file to some
other boxes probably different OS than the archived host like in
On Mon, Jun 11, 2012 at 11:10 PM, Steve Kieu wrote:
>
> I used the archive host and archive one host to a file. When extracting the
> file I saw many error like:
>
> tar: ./rootfs/usr/lib/locale/fr_FR/LC_TELEPHONE: Cannot hard link to
> `usr/lib/locale/br_FR/LC_TELEPHONE': No such file or director
Hello everyone,
I used the archive host and archive one host to a file. When extracting the
file I saw many error like:
tar: ./rootfs/usr/lib/locale/fr_FR/LC_TELEPHONE: Cannot hard link to
`usr/lib/locale/br_FR/LC_TELEPHONE': No such file or directory
I guess when BackupPC is doing the tar command
Den 2012-03-12 02:51, Les Mikesell skrev:
> On Sun, Mar 11, 2012 at 6:01 PM, Brad Morgan wrote:
>> I'm trying to setup offsite backups using the archive feature. My testing
>> shows that the backup number is part of the archive file name and that
>> generates a couple of questions I couldn't find
2012/3/27 Philip Kimgård :
> Hi again! Sorry I couldn't answer you any sooner. This solved the problem,
> although another one showed up, maybe this is more of a problem with the
> BackupPC_archive(start?) function, the files don't get compressed. I did
> some searches online and tested to edit th
em? Or know how to get BackupPC_archiveStart to
compress all files?
/Philip
From: r...@hasselbaum.net
Date: Thu, 15 Mar 2012 12:04:55 -0400
To: backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] Archive function
2012/3/15 Philip Kimgård
Thank you! The list approved the atta
On Fri, Mar 16, 2012 at 3:47 PM, Timothy J Massey wrote:
>
> Having said that, I've *never* had a RAID controller fail (IBM ServeRAID).
> Some have been pretty poor (such as the ServeRAID 8k), and drives fail, of
> course, but never an array. And I've never lost an entire IBM RAID array for
>
Arnold Krille wrote on 03/16/2012 04:22:04 PM:
> even if you intended for this question to be off-list, I think my answer
> could be interesting to others as well;-)
No problem. It's kinda off-topic, but if it helps others, then great.
> And it works rather nice. (Some other parts of these f
Hi,
even if you intended for this question to be off-list, I think my answer
could be interesting to others as well;-)
On 16.03.2012 18:36, Timothy J Massey wrote:
> Arnold Krille wrote on 03/15/2012 04:29:45 PM:
>> The scripts are in /var/lib/backuppc/bin, so they are present on both
>> the cl
Hi,
On 15.03.2012 15:39, Rob Hasselbaum wrote:
> Attached is a script I execute from cron that deletes old archives in a
> configured directory and then starts new archives for all hosts except
> localhost. Feel free to use it as a starting point. Should run with minimal
> modifications on Ubuntu
2012/3/15 Philip Kimgård
> Thank you! The list approved the attachment and I tested it on the
> server. There seems to be a problem with the part that returns a
> list of hosts from the hosts file: only the part of the hostname before the
> first dot (.) gets returned, not the complete h
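The dot-truncation described here usually comes from splitting the hostname on `.` somewhere in the script. A sketch of a parse that keeps the full FQDN, using a stand-in hosts file shaped like BackupPC's `/etc/backuppc/hosts` (the path and host names are invented):

```shell
# Build a stand-in hosts file: header row, comment, then host entries.
cat > /tmp/bpc_hosts <<'EOF'
host                dhcp    user
# a comment line
web01.example.com   0       backuppc
db02.example.com    0       backuppc
EOF

# Take the entire first whitespace-separated field, so dots survive;
# skip the header row, comment lines, and blank lines.
hosts=$(awk '!/^#/ && NF && $1 != "host" {print $1}' /tmp/bpc_hosts)
echo "$hosts"
```

The awk field split is on whitespace only, so `web01.example.com` comes through intact rather than as `web01`.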
backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] Archive function
Attached is a script I execute from cron that deletes old archives in a
configured directory and then starts new archives for all hosts except
localhost. Feel free to use it as a starting point. Should run with minimal
m
Attached is a script I execute from cron that deletes old archives in a
configured directory and then starts new archives for all hosts except
localhost. Feel free to use it as a starting point. Should run with minimal
modifications on Ubuntu Server.
Not sure if the mailing list allows attachments
On Wednesday 14 March 2012 13:33:32 Philip Kimgård wrote:
> Hi,
> How can I choose to create an archive of all hosts like I can in the browser
> interface? I want to schedule an archive job using cron (or even better,
> in the CGI), but when I try to run BackupPC_archiveStart I can only choose
> o
Philip Kimgård wrote on 03/14/2012 09:33:32 AM:
> Hi,
> How can I choose to create an archive of all hosts like I can in the
> browser interface? I want to schedule an archive job using cron (or
> even better, in the CGI), but when I try to run
> BackupPC_archiveStart I can only choose one host
Hi,
How can I choose to create an archive of all hosts like I can in the browser
interface? I want to schedule an archive job using cron (or even better, in
the CGI), but when I try to run BackupPC_archiveStart I can only choose one
host, right?
Another (not as important though) problem is t
>> I had to create that directory because I got an error that the archive
>> parameters file could not be created. The actual archive location is
>> specified in the config file (which defaults to /tmp).
> Are you using a distribution-packaged install (.rpm or .deb)? I thought
> the per-host pa
On Mon, Mar 12, 2012 at 1:13 AM, Brad Morgan wrote:
>>> Why do I have to manually create the /var/lib/backuppc/pc/<host> directory?
>
>> That's not where you would normally put the archive copies. More likely
>> you would want them to go on another machine via an NFS mount, or to
>> separate sp
>> Why do I have to manually create the /var/lib/backuppc/pc/<host> directory?
> That's not where you would normally put the archive copies. More likely
> you would want them to go on another machine via an NFS mount, or to
> separate space where you would move to different media later.
I had to
On Sun, Mar 11, 2012 at 6:01 PM, Brad Morgan wrote:
> I'm trying to setup offsite backups using the archive feature. My testing
> shows that the backup number is part of the archive file name and that
> generates a couple of questions I couldn't find answers to.
>
>
>
> Why do I have to manually c
I'm trying to setup offsite backups using the archive feature. My testing
shows that the backup number is part of the archive file name and that
generates a couple of questions I couldn't find answers to.
Why do I have to manually create the /var/lib/backuppc/pc/<host> directory?
If I setup a wee
On 6/10/2011 10:55 AM, Les Mikesell wrote:
> On 6/10/2011 9:06 AM, Joe Konecny wrote:
>> I have backuppc running and tested and it works great but our company
>> requires offsite storage of backups. Someone used to take a tape
>> home each night when we used Amanda. I've read the docs on the arch
On 6/10/2011 9:06 AM, Joe Konecny wrote:
> I have backuppc running and tested and it works great but our company
> requires offsite storage of backups. Someone used to take a tape
> home each night when we used Amanda. I've read the docs on the archive
> function and it says "BackupPC supports ar
I have backuppc running and tested and it works great but our company
requires offsite storage of backups. Someone used to take a tape
home each night when we used Amanda. I've read the docs on the archive
function and it says "BackupPC supports archiving to removable media.
For users that requir
It's not really easy to back up a BackupPC server; I read on the mailing list that
rsync is not a good idea because there are a lot of hardlinks.
What's the best way to create offline backups?
-Original-Nachricht-
Subject: Re: [BackupPC-users] Archive without tarball - directly to the file
system
Date
Mikesell [mailto:lesmikes...@gmail.com]
Sent: Monday, May 23, 2011 12:05 PM
To: backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] Archive without tarball - directly to the file
system
On 5/23/2011 4:44 AM, samuel_w...@t-online.de wrote:
> Is it possible to create an archive o
On 5/23/2011 4:44 AM, samuel_w...@t-online.de wrote:
> Is it possible to create an archive of a host without the tarball
> directly to the file system as it is on the server?
>
> My tarballs are now larger than 300 GB :-/
I don't think there is a way within the web interface to add components
to
Holger Parplies wrote at about 17:22:27 +0200 on Monday, May 23, 2011:
> I erroneously wrote to the list
> Holger Parplies wrote on 2011-05-23 16:30:57 +0200 [Re: [BackupPC-users]
> Archive without tarball - directly to the file system]:
> > Hallo,
> > [...]
>
I erroneously wrote to the list
Holger Parplies wrote on 2011-05-23 16:30:57 +0200 [Re: [BackupPC-users]
Archive without tarball - directly to the file system]:
> Hallo,
> [...]
sorry, that was meant to be off-list. *kick my MUA*
Regards,
Hallo,
samuel_w...@t-online.de wrote on 2011-05-23 11:44:33 +0200 [[BackupPC-users]
Archive without tarball - directly to the file system]:
On Mon, May 23, 2011 at 4:44 AM, samuel_w...@t-online.de
wrote:
> Is it possible to create an archive of a host without the tarball directly to
> the file system as it is on the server?
>
>
>
> My tarballs are now larger than 300 GB :-/
There was some discussion on this subject previously and the
Is it possible to create an archive of a host without the tarball directly to the file system as it is on the server?
My tarballs are now larger than 300 GB :-/
At present I (occasionally) archive my BackupPC onto another system
through an NFS mount.
Is this necessary?
Could I just specify the archive destination as machineX:/archive/ ?
--
Timothy Murphy
e-mail: gayleard /at/ eircom.net
tel: +353-86-2336090, +353-1-2842366
s-mail: School of Mathematic
Whoops, this email has been sitting in my Drafts folder for about a week.
:-) Better late than never, I guess.
@Les: Ahhh, now I see what you mean by, "especially if you intend to
schedule the runs with cron eventually." I had everything working perfectly
via the web GUI, but then nothing worke
On 11/11/10 07:32 PM, Frank J. Gómez wrote:
> That's basically what I've done... the command that should run is:
> BackupPC_tarCreate -t -h $host -n $bkupNum -s $share . | /bin/gzip
> | /usr/bin/gpg -r $gpgUser --encrypt | /usr/bin/split -b 65 -
> $outLoc/0.$host.tar$fileExt.
>
You might want
That's basically what I've done... the command that should run is:
BackupPC_tarCreate -t -h $host -n $bkupNum -s $share . | /bin/gzip
| /usr/bin/gpg -r $gpgUser --encrypt | /usr/bin/split -b 65 -
$outLoc/0.$host.tar$fileExt.
For some reason I'm getting stuck on that split error, though. I'll
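On the restore side, archives produced by a split pipeline like Frank's just get concatenated back in order and fed through the same filters reversed (`cat parts | gpg -d | gunzip | tar -x`). A sketch of the split/reassemble round trip, minus the GPG stage, with invented paths:

```shell
# Demo: split a tarball into pieces, then reassemble and list it.
mkdir -p /tmp/split_demo && echo payload > /tmp/split_demo/f
tar -C /tmp -czf /tmp/demo.tgz split_demo
split -b 200 /tmp/demo.tgz /tmp/demo.tgz.part.
# The shell glob expands in lexical order (aa, ab, ...), which is
# exactly the order split wrote the pieces in.
cat /tmp/demo.tgz.part.* | tar -tzf -
```

The same glob-ordering property is what makes `cat 0.$host.tar.gz.gpg.* | gpg -d | ...` safe on the restore box.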
On 11/11/2010 5:21 PM, Frank J. Gómez wrote:
> Sorry to be banging on the list so much lately, but I've got another
> issue I don't understand...
>
> I've modified a copy of BackupPC_archiveHost, with the changes being:
>
> * I'm piping output to gpg after compression and before splitting
>
Sorry to be banging on the list so much lately, but I've got another issue I
don't understand...
I've modified a copy of BackupPC_archiveHost, with the changes being:
- I'm piping output to gpg after compression and before splitting
- I want my archive filenames in this format: 0.$host.tar.
Guys
Thank you very much for your help to do this. In the end I have used:
edit /etc/backuppc/config.pl and set:
$Conf{BackupsDisable} = 2;
Many Thanks
Chris
On Tuesday 06 July 2010 16:34:58 Jim Kyle wrote:
> On Tuesday, July 6, 2010, at 10:22:12 AM, Chris Owen wrote:
> > Was hoping there might be a way within BackupPC to stop backups from
> > running. But this will do the trick.
>
> Check the docs for the "BackupsDisable" setting in config.pl; it appea
Hi Chris,
When I saw your note, I first thought about just stopping the service
like in Richard's suggestion below. If you want to keep the system
semi-active in case you need something from the old archive, I'm
wondering if just setting the black-out period to 24/7 would work.
-- ken
On Tue, 2
On Tuesday, July 6, 2010, at 10:22:12 AM, Chris Owen wrote:
> Was hoping there might be a way within BackupPC to stop backups from
> running. But this will do the trick.
Check the docs for the "BackupsDisable" setting in config.pl; it appears to
be exactly what you are looking for!
--
Jim Kyle
m
edit /etc/backuppc/config.pl and set:
$Conf{BackupsDisable} = 2;
The comments above that directive in the config file will explain it.
--
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com
--
Richard
Thanks for this, I didn't think about doing it this way. Was hoping
there might be a way within BackupPC to stop backups from running. But
this will do the trick.
Many Thanks
Chris
> On Tue, Jul 6, 2010 at 8:14 AM, Chris Owen
> wrote:
>
>> Hey.
>>
>> I have installed a new backupp
On Tue, Jul 6, 2010 at 8:14 AM, Chris Owen
wrote:
> Hey.
>
> I have installed a new backuppc server and have tested the backups on
> the new server and I am now happy to switch off the old server. The
> server is running other services so I wanted to know if there is a way to
> stop the backups fro
Hey.
I have installed a new backuppc server and have tested the backups on
the new server and I am now happy to switch off the old server. The
server is running other services so I wanted to know if there is a way to
stop the backups from running so that I can keep the old backups on this
serve
Why do I get this reported during an archive process using backuppc_tarcreate?
I've spot-checked some of the files, and the file size it reports them being
truncated to is the actual size of the file on the filesystem.
BackupPC_tarCreate -t -b 60 -w 4194304 -h $HN -n -1 -s \* .
is the command I
On 4/19/2010 5:22 PM, Eddie Gonzales wrote:
> ok, i was able to archive but it archive to the backuppc server. i was under
> the assumption that when you create a host(a different machine) and set that
> for archive, it should push the archives to that machine. am i missing
> something?
Yes, t
OK, I was able to archive, but it archived to the BackupPC server. I was under
the assumption that when you create a host (a different machine) and set it
for archive, it would push the archives to that machine. Am I missing
something?
“Those who actually solve problems spend very little time
On 03/07 05:42 , Gerald Brandt wrote:
> I've been trying to automate archive (from iSCSI to USB drive) for offsite
> backups. Last Fridays ran fine (but took almost 20 hours). This Fridays
> failed at exactly 1200 minutes in, by a SIGALRM. Is there a time limit on
> archives?
Check this value in
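The setting being pointed at here is presumably `$Conf{ClientTimeout}`: its stock default is 72000 seconds, which is exactly the 20-hour / 1200-minute mark at which both archive runs in this thread were killed by SIGALRM. A sketch of the change (the new value is an arbitrary example):

```perl
# In config.pl (or the archive host's per-host config):
# the default is 72000 seconds = 20 hours; raise it for long archive jobs.
$Conf{ClientTimeout} = 144000;   # 40 hours, an arbitrary example value
```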
On Sun, 7 Mar 2010 17:42:10 -0600 (CST), Gerald Brandt
wrote:
> Hi,
>
> I've been trying to automate archive (from iSCSI to USB drive) for
> offsite backups. Last Fridays ran fine (but took almost 20 hours). This
> Fridays failed at exactly 1200 minutes in, by a SIGALRM. Is there a time
> limit o
Hi,
I've been trying to automate archiving (from iSCSI to USB drive) for offsite
backups. Last Friday's run went fine (but took almost 20 hours). This Friday's
failed at exactly 1200 minutes in, killed by a SIGALRM. Is there a time limit on archives?
The backups are only 282 GB total (written to USB drive) o
Hi,
Skip Guenter wrote on 2009-05-23 17:31:06 -0500 [[BackupPC-users] Archive/Pool
target drive]:
>
> I think a quick simple question...
... of which I simply don't understand, in what way it is related to Jeffrey's
question about exponential full backups. Please stop h
I think a quick simple question...
Does BackupPC care about the 'atime' stamp on EXT3 file systems?
In other words, can its target partition/array be mounted 'noatime'
in /etc/fstab?
Skip
Nils Breunese (Lemonbit) wrote:
> Bharat Mistry wrote:
>
>> I have a spare LTO2 as a result of a Windows Server upgrade
>>
>> I'd like to install it on my BackupPC box and Archive to Tape once a
>> week
>> (BackupPC installed on SME Server)
>>
>> I use the following to backup to DAT tape on othe
Bharat Mistry wrote:
> I have a spare LTO2 as a result of a Windows Server upgrade
>
> I'd like to install it on my BackupPC box and Archive to Tape once a
> week
> (BackupPC installed on SME Server)
>
> I use the following to backup to DAT tape on other SME servers:
>
> mt -f /dev/nst0 rewind
>
I have a spare LTO2 as a result of a Windows Server upgrade
I'd like to install it on my BackupPC box and Archive to Tape once a week
(BackupPC installed on SME Server)
I use the following to backup to DAT tape on other SME servers:
mt -f /dev/nst0 rewind
tar -cvf /dev/st0 /home > /home/e-smith/
Brad C wrote:
> But I've hit a few snags
>
> I have quite a few linux servers that I use rsyncd to back up
> (security advantage of read only)
> Because they are servers and perform different functions, the file pooling
> feature doesn't have much impact on disk utilisation.
If you are backing up t
On Thu, Jan 15, 2009 at 8:43 AM, Brad C wrote:
> Drive A - Onsite Backup for a group of linux servers for instant restores.
BackupPC instance #1 (use vmware/vbox) w/ topdir on Drive A
> Drive B - Onsite Backup dedicated for one specific server that needs to be
> kept onsite but data separate fr
Hi Brad,
[...]
> Eg: what happens if the building burns down and the backuppc host you were
> using doesn't exist anymore, how would you restore?
> The archive function allows you to tar... but the limit on a tar file is
> 2GB... which would mean it would split into 100 tar files... but then whe