Michael Stowe wrote on 12/04/2015 06:17:01 PM:
> I'm *pretty* sure both these things are fixed -- I took a quick look at
> the code on github and it looks like it'll handle 7 drives in the
> scripts and at least it thinks it detects 2012 (I'm pretty sure I tested
Kris Lou wrote on 12/09/2015 01:23:05 PM:
> I've had to build it myself for winexe-waf, but unfortunately the
> lfarkas repo doesn't contain the necessary libraries anymore to do
> that with CentOS 6.
Ouch! You wouldn't happen to still have that binary, would you? :)
Timothy J Massey/OBSCorp wrote on 12/08/2015 01:46:31 PM:
> 2) I looked for a github site for this software and couldn't find
> one, including searching for various permutations of "backuppc
> github". I probably don't need it with the updated items linked
> above, but
Michael Stowe wrote on 12/04/2015 04:07:34 PM:
> I do use rsync, and winexe to handle shadow copies so I don't have to
> worry about open files and such. And I put together an installer
> package with all the pieces.
Funny you should mention that...
I really
martin f krafft wrote on 12/02/2015 05:20:31 PM:
> I am looking for a way to automate sending an archive (tarball) of
> the latest backup of each of my hosts to an offsite machine, using
> scp and GnuPG for encryption. Can this be done within BackupPC and
> scheduled
Les Mikesell wrote on 11/24/2015 01:29:24 PM:
> > Also, can anyone explain why, when our updates are set to every 6.97 days, they are still going out after the completion of every backup?
>
> A stock backuppc system never sends emails about success - only about
> backups
Holger Parplies wb...@parplies.de wrote on 05/23/2015 09:29:25 AM:
for the archives: you don't strictly *need* the free space. You can pipe the output of BackupPC_tarCreate directly into a 'tar x' and tell tar to only extract files named '*.pdf', something like
BackupPC_tarCreate -h
I will experiment with it if there is a need to use it in the future.
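Holger's pipe trick can be sketched like this. Plain tar stands in for BackupPC_tarCreate so the example is self-contained; in real use the producer would be something like `BackupPC_tarCreate -h server -n 918 -s E /Shares/Shared`. The --wildcards/--no-anchored flags are GNU tar:

```shell
# Self-contained sketch: the producer ("tar -cf - .") stands in for
# BackupPC_tarCreate; the consumer extracts only *.pdf members without
# ever materializing the full archive on disk.
set -e
src=$(mktemp -d); dst=$(mktemp -d)
mkdir -p "$src/docs"
echo pdf  > "$src/docs/a.pdf"
echo text > "$src/docs/a.txt"

# Stream the tree as a tar archive and extract only PDF members.
tar -C "$src" -cf - . \
  | tar -C "$dst" -x -f - --wildcards --no-anchored '*.pdf'

ls "$dst/docs"
```

Swapping the left side of the pipe for the real BackupPC_tarCreate invocation gives the restore-only-PDFs behavior without needing scratch space for a full tar file.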
Timothy J. Massey
Sent from my iPad
On May 23, 2015, at 7:43 AM, Holger Parplies wb...@parplies.de wrote:
Hi,
Timothy J Massey wrote on 2015-05-22 20:40:52 -0400 [Re: [BackupPC-users] BackupPC_tarCreate with only certain types
Les Mikesell lesmikes...@gmail.com wrote on 05/22/2015 04:24:56 PM:
What am I missing? How do I get BackupPC_tarCreate to create a
tar file that contains all PDF's stored in that path?
Thank you very much for any support you can give me. I've tried
different escapings/not-escapings
Hello!
I need to restore all PDF files from a particular backup share. It's
30,000 files scattered around thousands of locations. So, I was hoping to
use BackupPC_tarCreate to do it. But I'm striking out.
This works:
./BackupPC_tarCreate -l -h server -n 918 -s E /Shares/Shared
But this
Michael Stowe mst...@chicago.us.mensa.org wrote on 03/13/2015 09:47:06 AM:
I find that backing up open files is particularly useful:
http://www.michaelstowe.com/backuppc/
I have found his tool quite helpful in every regard except one: the
installer will fail to install on Windows Server
Holger Parplies wb...@parplies.de wrote on 12/10/2014 10:59:17 AM:
Colin Shorts wrote on 2014-12-10 11:45:41 + [Re: [BackupPC-users] How to delete backups]:
You might want to press Enter before typing `/usr/share/BackupPC/bin/BackupPC_nightly 0 255', otherwise it will get deleted
something
I'll actually have to play with. However, I have seen this on servers with very large image files, usually of things like very large ISOs.
Timothy J. Massey
Sent from my iPad
On Nov 30, 2014, at 10:29 AM, Christian Völker chrisc...@knebb.de wrote:
Hi all,
I am trying
before they die? You might be able to alter the timeout in smaller increments and see if the delay changes along with your changes in timeout.
Or it might be something else completely.
Timothy J. Massey
Sent from my iPad
On Nov 30, 2014, at 1:43 PM, Christian Völker chrisc...@knebb.de wrote
xpac backuppc-fo...@backupcentral.com wrote on 10/14/2014 03:59:26 PM:
Ok I found this little tidbit in some documentation:
As mentioned, the BackupPC user created on the system when
installing the RPM has to run Apache in order for everything to work
properly with the CGIs and mod_perl.
Michael Stowe mst...@chicago.us.mensa.org wrote on 08/25/2014 09:16:09 AM:
Given that, a VSS/rsync combination is more or less required. There are two methods for coordinating the shadow copy service with rsync -- ssh and winexe. I use the winexe method here, and put together an installer
Marco Nicolayevsky ma...@specialtyvalvegroup.com wrote on 05/14/2014 08:47:24 PM:
Hello all,
I am using a pretty vanilla installation of BackupPC and love the
simplicity and fact that it just works as it’s supposed to.
My problem arises when trying to back up windows clients over
Holger Parplies wb...@parplies.de wrote on 03/19/2014 07:26:02 PM:
Les Mikesell wrote on 2014-03-19 11:25:38 -0500 [Re: [BackupPC-
users] Centralized storage with multiple hard drives]:
Throwing RAM at a disk performance problem usually helps.
You've used BackupPC before, Les, right? ;-)
Jost Schenck jost.sche...@gmx.de wrote on 03/14/2014 07:37:00 AM:
I guess you're right that backing up the backup servers system
config with backuppc itself may not be such a brilliant idea :)
You can *never* back up something with itself.
What I thought was, that in case my home server
Les Mikesell lesmikes...@gmail.com wrote on 03/20/2014 01:59:47 PM:
On Thu, Mar 20, 2014 at 12:21 PM, Timothy J Massey
tmas...@obscorp.com wrote:
You've used BackupPC before, Les, right? ;-)
BackupPC prefers pool reads over writes when possible, and it typically accesses large
Les Mikesell lesmikes...@gmail.com wrote on 03/20/2014 03:48:51 PM:
It's all about statistics and the odds of having to move the disk head
to get a directory entry or inode vs already having it in cache for
instant access (and sometimes even some data...).
In theory, theory and practice are
thorvald backuppc-fo...@backupcentral.com wrote on 03/19/2014 06:53:19 AM:
Let's say that the storage is not a problem for me and I can have as many TB or PB as I need. However the main assumption is that every box has got a separate disk to be backed up to. So now I faced the problem with
Søren Brøndsted s...@ufds.dk wrote on 03/17/2014 04:43:16 AM:
Hi
On 14-03-2014 16:23, Timothy J Massey wrote:
Have you tried performing the same FTP manually from the command line of the BackupPC box?
Yes. I have tried as backuppc user and it works.
Obviously, you need to use
Timothy J Massey tmas...@obscorp.com wrote on 03/18/2014 01:09:16 PM:
Søren Brøndsted s...@ufds.dk wrote on 03/17/2014 04:43:16 AM:
On 14-03-2014 16:23, Timothy J Massey wrote:
Are you using shared folders on i? You could also try to get the
folders using SMB, which is what I have
Søren Brøndsted s...@ufds.dk wrote on 03/14/2014 10:46:37 AM:
Hi
I am trying to do an FTP backup from an iSeries 5 (AS400), but I get the
following result:
Unneeded log info
remotels: adding name QIMGCLG, type f, size 272496, mode 33152
remotels: adding name VOL001, type f, size
Dr. Boris Neubert om...@online.de wrote on 03/04/2014 12:23:15 PM:
What was irritating to me was the rsyncd log entry Building file
list... with nothing else afterwards and the empty BackupPC host
XferLog during backup. In fact the backup was already transferring all
the files all the time.
Hello!
I tried to install Michael Stowe's BackupPC client (with VSS support) on a
Windows Server 2012 R2 server today. Unfortunately, the installer program popped up an error: "Cannot determine version of vshadow to use". There is only one choice: OK. When you click on it, the installer
Russell R Poyner rpoy...@engr.wisc.edu wrote on 12/17/2013 11:12:07 AM:
This is a poor comparison since we have different data sets, but it
would appear that BackupPC's internal dedupe and compression is
comparable to, or only slightly worse than what zfs achieves. This in
spite of the
Sorin Srbu sorin.s...@orgfarm.uu.se wrote on 12/16/2013 09:40:56 AM:
Anyway, Anaconda objected at my choosing ext4 for the 40 TB raid-array
when I
recently set up a new system, and defaulted to xfs instead.
EXT4 won't support file systems over 16TB at all with 4k blocks, and depending on OS and
Mark Rosedale mrosed...@vivox.com wrote on 12/16/2013 09:06:07 AM:
I'm working on bringing back a backuppc instance. It is very large, 3+ TB. The issue I'm having is e2fsck is taking an extremely long time to finish. It is stuck on checking directory structure. We are going on 48 hours.
Timothy J Massey tmas...@obscorp.com wrote on 12/16/2013 02:35:44 PM:
Mark Rosedale mrosed...@vivox.com wrote on 12/16/2013 09:06:07 AM:
I'm working on bringing back a backuppc instance. It is very large, 3+ TB. The issue I'm having is e2fsck is taking an extremely long time to finish
Craig O'Brien cobr...@fishman.com wrote on 11/01/2013 09:48:23 AM:
This error shows a BackupPC_dump segfault pointing to libperl.so
How do you install your BackupPC ? From source or from RPM?
I did a yum install backuppc, which got it from epel
That's how I do it.
That tells you it
Craig O'Brien cobr...@fishman.com wrote on 10/31/2013 08:49:15 AM:
The du -hs /backup/pool /backup/cpool /backup/pc/* has finished.
Basically I had 1 host that was taking up 6.9 TB of data with 2.8 TB
in the cpool directory and most of the other hosts averaging a GB each.
Well, there's your
Holger Parplies wb...@parplies.de wrote on 10/30/2013 10:24:05 PM:
as I understand it, the backups from before the change from smb to
rsyncd are
linked into the pool. Since the change, some or all are not. Whether the
change of XferMethod has anything to do with the problem or whether it
Craig O'Brien cobr...@fishman.com wrote on 10/31/2013 01:33:30 PM:
Just out of curiosity, why hadn't you already done that?!?
I didn't know which host was the problem and didn't think of it.
Although I'll readily admit it seems painfully obvious to me now. :)
Just so you're sufficiently
Les Mikesell lesmikes...@gmail.com wrote on 10/31/2013 01:54:24 PM:
On Thu, Oct 31, 2013 at 12:33 PM, Craig O'Brien cobr...@fishman.com
wrote:
fsck the filesystem.
bash-4.1$ fsck /dev/sda1
fsck from util-linux-ng 2.17.2
e2fsck 1.41.12 (17-May-2010)
/dev/sda1: clean,
Sharuzzaman Ahmat Raslan sharuzza...@gmail.com wrote on 10/30/2013 10:06:18 PM:
Hi Holger,
Based on a short session of troubleshooting, I believe the machine actually suffers from low I/O speed to the disk. Average read is about 3 MB/s, which I consider slow for a SATA disk in IDE
Sharuzzaman Ahmat Raslan sharuzza...@gmail.com wrote on 10/31/2013 02:38:01 PM:
Hi Timothy,
I got the number by observing the output of iotop while file
transfer is running. Also, on BackupPC host summary page, average
transfer rate for full backup is also around 3MB/s
It could be a
Craig O'Brien cobr...@fishman.com wrote on 10/29/2013 08:21:11 PM:
I'm not sure how I can go about determining if a particular backup
is using the pool or just storing the files in the PC folder. What's
the best way to check if a given backup set is represented in the
pool or not? Would
Adam Goryachev mailingli...@websitemanagers.com.au wrote on 10/30/2013 09:18:59 AM:
Not really relevant to this thread, but I have in the past added an empty file to each of the removable drives, then test if the file exists before creating the archives. If the drive isn't mounted, the file
Dan Johansson dan.johans...@dmj.nu wrote on 10/27/2013 02:26:06 PM:
Have you checked the Event Viewer? It usually shows you what's going on with rsync...
This seemed to have gotten missed. Is there anything in there? Rsync is
usually pretty expressive in the Event Viewer...
Also,
Craig O'Brien cobr...@fishman.com wrote on 10/29/2013 01:53:31 PM:
On the General Server Information page, it says Pool is 2922.42GB
comprising 6061942 files and 4369 directories, but our pool file
system which contains nothing but backuppc and is 11 TB in size is 100%
full.
My strong
Craig O'Brien cobr...@fishman.com wrote on 10/29/2013 03:30:46 PM:
The topdir is /var/lib/BackupPC which is a link to /backup
I missed that in your previous e-mail. Stupid proportional fonts...
(And you might want to add a -h for commands like du and df: the -h is for human-readable... When
Dan Johansson dan.johans...@dmj.nu wrote on 10/26/2013 08:37:48 AM:
Any suggestions on
a) how to find out why rsyncd dies in the first place
Not really: I have never run rsync on Windows 8. I *have* done it on
Windows Server 2012 (based on Win8) with zero crashes across 3-4 servers
on
Hans Kraus h...@hanswkraus.com wrote on 10/23/2013 11:35:47 AM:
Hi,
that's the strange thing: initial backups worked, only the following
backups showed the timeout.
I'd get that. With the initial backup, there are no checksums performed:
what would you checksum on the backup server? So
Hans Kraus h...@hanswkraus.com wrote on 10/22/2013 01:25:30 PM:
It was very simple: I copied the 'rsyncd.conf' file on the clients (all Debian at the moment) from an example file, as I read in http://howden.net.au/thowden/2012/11/rsync-on-debian/. Namely I did 'cp
Phil Reynolds phil-backu...@tinsleyviaduct.com wrote on 10/15/2013 08:05:14 AM:
I haven't seen BackupPC write an archive during my testing - maybe I'm
missing some settings?
More likely, you're just expecting too much: BackupPC does not write an
archive on a schedule at all. It only writes
Charles Belarmino charles.belarm...@maximintegrated.com wrote on 10/13/2013 10:54:20 PM:
Hello Everyone,
Do I still first need to do “Connect to Network”….
[image removed]
before I can get my BackupPC Server running like this one?
First, the positive. You *did* ask a question.
Bowie Bailey bowie_bai...@buc.com wrote on 10/11/2013 03:38:29 PM:
The first entry in the BackupFilesOnly (the key) should be the
sharename, Users. After you add that, use the buttons to the
right to add the directories you want to backup within that share.
Or, if there's only one share,
Charles Belarmino charles.belarm...@maximintegrated.com wrote on 10/10/2013 01:52:26 AM:
I cannot run my backuppc server installed in Virtual Box using
CentOS6.4. Attached is the error.
That error message told you exactly what was wrong (the CGI script can't
talk to the BackupPC server)
Tyler J. Wagner ty...@tolaris.com wrote on 10/09/2013 10:15:57 AM:
On 2013-10-09 14:09, Holger Parplies wrote:
Hi,
vano wrote on 2013-10-09 05:17:53 -0700 [[BackupPC-users] no rrd
graphs in version 3.3.0]:
Found that after upgrade to version 3.3.0, rrd graphs is missing
in web
µicroMEGAS microme...@mail333.com wrote on 10/04/2013 09:29:22 AM:
I have read the manual many times and of course I tried many settings, still without luck. I have 30 hosts which have about 1400 megabytes of data in total. As I am using a 2TB harddisk for my backuppc pool, I would like
Les Mikesell lesmikes...@gmail.com wrote on 10/04/2013 02:20:19 PM:
Are all of your backup runs completing every day?
That is also a great question. If your incrementals span more than a
week, there will be more than one full that they depend on. All of the
assumptions you've put in place
µicroMEGAS microme...@mail333.com wrote on 10/04/2013 02:42:15 PM:
I think here's my problem: let's say one or two hosts were not able to be backed up (incrementally). Then I miss one or more incremental backups, so BackupPC doesn't delete the last full after 7 days (when the new full backup
Erik Hjertén erik.hjer...@companion.se wrote on 05/24/2013 12:15:22 PM:
Hi all
I have invested in a used HP Proliant ML150 G5 server as a new
backup server. I have about 500 GB of data in 40 000 files spread
over 8 clients to backup. Data doesn't grow fast so I'm aiming at
two 1TB disks
Erik Hjertén erik.hjer...@companion.se wrote on 05/24/2013 03:40:36 PM:
Thanks for your thorough reply Timothy.
No problem. BackupPC (well, all system backup) is a *lot* more complex
than people think!
About 8000 files of photos between 5 and 15 MB each, in total around 100 GB. This will
Changing the ping command to a different command can be done on a per host
basis.
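In BackupPC's per-host config that override looks like the fragment below; the host name and path are examples, and `$Conf{PingCmd}` is the stock setting being replaced:

```perl
# /etc/BackupPC/pc/laptop1.pl -- applies to this host only; every other
# host keeps the global $Conf{PingCmd}.
$Conf{PingCmd} = '/bin/true';
```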
Timothy J. Massey
Sent from my iPhone
On May 22, 2013, at 4:52 PM, Zach lace...@roboticresearch.com wrote:
This is true only for one host. This is why I wouldn't want to change ping
to /bin/true...I want
Arnold Krille arn...@arnoldarts.de wrote on 04/26/2013 04:27:44 PM:
On Thu, 25 Apr 2013 14:45:50 -0700 Lord Sporkton
lordspork...@gmail.com wrote:
I'm currently backing up mysql by way of dumping the DB to a flat
file then backing up the flat file. Which works well in most cases
except
backu...@kosowsky.org wrote on 04/26/2013 05:04:07 PM:
If you are indeed talking about files in the 50-200GB range, you are
not going to fit more than a handful of files per TB disk... even if
you have a RAID array of multiple disks, you are still probably
talking about only a small number of
backu...@kosowsky.org wrote on 04/26/2013 06:27:32 PM:
My point is that even with o(100) files/copies which assuming you are
backing up multiple versions means you have far fewer distinct files
-- you may be better off just writing a script...
I get your point, though I would ask you to
Koen Vermeer k...@vermeer.tv wrote on 03/13/2013 08:13:45 PM:
On 2013-03-13 21:20, Brad Alexander wrote:
So I'm wondering, is there a way to better force a better
distribution of backup jobs during the day?
What about setting MaxBackups to a smaller number than 4?
That's what I'd do: set
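If MaxBackups is the knob, the change is a one-liner in config.pl; the value 2 here is just an example:

```perl
# Allow at most two simultaneous backups instead of the default 4;
# BackupPC queues the rest, spreading jobs out over the night.
$Conf{MaxBackups} = 2;
```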
Mark Campbell mcampb...@emediatrade.com wrote on 03/06/2013 11:01:28 AM:
I don't mean to bring up another RTFM moment, but I've searched
around, and I haven't found the location for enabling/disabling the
pooling. The compression option I've found, but not pooling.
There is no way of doing
Andrew Mark andr...@aimsystems.ca wrote on 12/11/2012 10:20:15 AM:
Hi all,
We use MS Outlook for its calendar and contact functions; our email is
web-based.
Is there a way to disable BackupPC from checking and warning that
Outlook is not backed up?
ie. stop checking the value of
Stefan Peter s_pe...@swissonline.ch wrote on 11/18/2012 04:55:49 PM:
On 18.11.2012 20:44, Till Hofmann wrote:
But now, when I'm trying to archive more clients at once, the
process receives ALRM after exactly 20 hours.
2012-11-09 19:54:29 Starting archive
2012-11-10 15:54:29 cleaning
Markus unive...@truemetal.org wrote on 11/19/2012 04:03:03 PM:
For fun, here's the output of find / | wc -l:
24478753
real 490m35.602s
user 0m21.013s
sys 1m23.305s
25 million files! OMG. find took 8 hours to complete. Nice, hm? :-)
Wow. If a simple find took 8 hours to complete,
Cassiano Surek c...@surek.co.uk wrote on 11/22/2012 05:43:30 AM:
Dear all,
For reference on the matter I was trying to resolve or improve, I
have increased RAM from 2Gb to 4Gb (the max for that machine) and
backups reduced by 50% in completion time.
I had wrongly assessed in the past
Les Mikesell lesmikes...@gmail.com wrote on 11/08/2012 10:51:53 AM:
On Thu, Nov 8, 2012 at 9:16 AM, Jimmy Thrasibule
thrasibule.ji...@gmail.com wrote:
Hi,
I wonder if it is possible to wake BackupPC at midnight. In the
documentation or on the Internet, they all start at 1 to 23.
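Assuming hour 0 is accepted by `$Conf{WakeupSchedule}` (it denotes midnight), the fix would be:

```perl
# Wake BackupPC every hour including midnight; the default is [1..23].
$Conf{WakeupSchedule} = [0..23];
```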
Cassiano Surek c...@surek.co.uk wrote on 11/06/2012 05:03:44 AM:
Of course, how could I have missed that! I did find it now, thanks
Michał.
Last full backup (of 100 odd Gb) took slightly north of 10 days to
complete. Incremental, just over 5 days.
I did not see if you mentioned how *many*
Tyler J. Wagner ty...@tolaris.com wrote on 10/25/2012 09:14:47 AM:
On 2012-10-25 14:10, Bowie Bailey wrote:
On 10/24/2012 12:41 PM, dixieadmin wrote:
I am currently using BackupPC 3.2.1 on SME Server 8.0. I wanted
to know if there is a correct procedure for using 2 different
external
, and I think you will find very, very little interest in your small change.
Timothy J. Massey
Sent from my iPhone
On Sep 24, 2012, at 3:46 PM, Serge SIMON serge.si...@gmail.com wrote:
The point is that very little is missing to have a browsable uncompressed rsynced backup folder for people
Les Mikesell lesmikes...@gmail.com wrote on 09/18/2012 07:04:21 PM:
The guest servers are not hurting for resources. They are not part of
the
problem. The problem seems to be contained completely inside of the
BackupPC server.
If you aren't seeing big speed differences among clients
backu...@kosowsky.org wrote on 09/18/2012 09:51:11 PM:
Timothy J Massey wrote at about 12:54:35 -0400 on Monday, September 17, 2012:
I have several very similar configurations. Here's an example:
Atom D510 (1.66GHz x 2 Cores)
4GB RAM
CentOS 6 64-bit
4 x 2TB Seagate SATA
Les Mikesell lesmikes...@gmail.com wrote on 09/17/2012 01:34:33 PM:
On Mon, Sep 17, 2012 at 11:05 AM, Timothy J Massey
tmas...@obscorp.com wrote:
I'm writing a longer reply, but here's a quick in-thread reply:
I know exactly what you mean by waiting until after the first full.
Often
that, too. It's just such a big pain to use that I would rather do nearly anything else than depend on it! :)
Unfortunately, none of this gets us closer to the source of the terrible
performance we're seeing... :)
Timothy J. Massey
Out of the Box Solutions, Inc.
Creative IT Solutions Made Simple!
John Rouillard rouilj-backu...@renesys.com wrote on 09/17/2012 02:05:28 PM:
On Mon, Sep 17, 2012 at 12:54:35PM -0400, Timothy J Massey wrote:
No matter the size of the system, I seem to top out at about 50GB/hour for full backups. Here is a perfectly typical example:
Full Backup
Tim Fletcher t...@night-shade.org.uk wrote on 09/17/2012 08:50:39 AM:
You are being hit by disk I/O speeds; check you don't have atime turned on on the fs.
I agree that noatime is a net win for *very* little pain. I found a system where I had not mounted the datastore noatime and switched it.
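The noatime change is a single mount option in /etc/fstab; the device, mount point, and filesystem below are examples:

```
/dev/sdb1  /var/lib/BackupPC  ext4  noatime  0  2
```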
bigger, is at least *somewhat* comparable. And
you're getting four times the performance, which is what I would have
estimated that my box was capable of doing, but is not.
Timothy J. Massey
Les Mikesell lesmikes...@gmail.com wrote on 09/17/2012 02:44:20 PM:
On Mon, Sep 17, 2012 at 11:54 AM, Timothy J Massey
tmas...@obscorp.com wrote:
However, I have recently inherited a server that is 3TB big, and 97%
full, too! Backups of that system take 3.5 *days* to complete. I
-based rsync is
*terrible*. But it makes the magic of BackupPC work--if you feed it
enough resources, it seems.
Timothy J. Massey
Out of the Box Solutions, Inc.
Creative IT Solutions Made Simple!
http://www.OutOfTheBoxSolutions.com
tmas...@obscorp.com
22108 Harper Ave.
St. Clair Shores, MI
involved
with the list. His participation is certainly missed.
Timothy J. Massey
Timothy J Massey tmas...@obscorp.com wrote on 09/18/2012 11:07:18 AM:
John Rouillard rouilj-backu...@renesys.com wrote on 09/17/2012 02:33:34 PM:
I have another system that is lower power:
2632652.8 MB at 662.5 minutes or 66MB/s
That's 2.6TB in 11 hours. That is perfectly
Les Mikesell lesmikes...@gmail.com wrote on 09/18/2012 12:42:26 PM:
On Tue, Sep 18, 2012 at 10:24 AM, Timothy J Massey
tmas...@obscorp.com wrote:
Fortunately, BackupPC is a backup of the backup right now, and is not expected to be used for real. Yet. That's why I can take the time
Les Mikesell lesmikes...@gmail.com wrote on 09/18/2012 03:34:56 PM:
On Tue, Sep 18, 2012 at 1:18 PM, Timothy J Massey tmas...@obscorp.com
wrote:
That is a good point, but if I ever have to do a full 3TB restore from
BackupPC, the 12 hours a (properly performing) BackupPC will take
Les Mikesell lesmikes...@gmail.com wrote on 09/17/2012 11:51:09 AM:
On Mon, Sep 17, 2012 at 10:16 AM, Mark Coetser m...@tux-edo.co.za
wrote:
It's the first full run but it's taking forever to complete; it was running for nearly 3 days!
As long is it makes it through, don't make any
they compare to see at what performance cost
this protection is coming.
Thank you very much for your help!
Timothy J. Massey
the user tells me so when I restore it... :(
Timothy J. Massey
doubt that
this is it.
Timothy J. Massey
of the connection--even if it's a Gigabit connection.
For subsequent runs, see my other (very long) e-mail. Examine the CPU
usage (and I/O usage!) of your BackupPC server and see what is limiting
you.
Timothy J. Massey
At a high level, VSS works the same way as LVM snapshots. At an implementation level, it's completely different. You do not need a separate partition; you just need enough free space on the volume.
Timothy J. Massey
Sent from my iPhone
On Sep 12, 2012, at 2:27 PM, Kenneth Porter sh
Michael Stowe mst...@chicago.us.mensa.org wrote on 09/03/2012 07:33:39 PM:
I'd recommend rsync+vshadow to get all the files, of course -- for a bare metal restore, if you don't recover the registry, you won't have anything to map to, anyway, so I'm assuming you're going to do that as well,
Ray Frush ray.fr...@avagotech.com wrote on 08/21/2012 11:30:22 AM:
You need to exclude /proc from your backups. It's a virtual file
system maintained by the kernel, and does not need to be backed up.
Here's the excludes we use for Linux hosts:
$Conf{BackupFilesExclude} = {
'*' => [
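Filled out, that structure looks like the sketch below. Only /proc comes from Ray's message; the other entries are common additions for Linux hosts, not his actual list:

```perl
$Conf{BackupFilesExclude} = {
    # Applies to every share ('*'); these kernel-maintained
    # pseudo-filesystems have nothing worth backing up.
    '*' => [
        '/proc',
        '/sys',
        '/dev',
        '/run',
    ],
};
```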
Of course, in the original request, inside the office was already working
perfectly. Therefore, there's no need to do anything.
Of course, if that's what the original person wanted, they wouldn't have
written the message in the first place.
Timothy J. Massey
Sent from my iPhone
On Jul 27
bubolski backuppc-fo...@backupcentral.com wrote on 07/24/2012 05:51:47 AM:
I got a problem with this (topic). When I'm connected to the same wireless network via cable I can start a backup on BackupPC. When I get internet from wifi and I'm connected to my work wireless via VPN I can ping my
Arthur Darcet arthur.darcet+l...@m4x.org wrote on 07/26/2012 03:20:15 PM:
You can easily configure the VPN to give static IP to your clients,
and then just map a dummy name to the VPN IP using /etc/hosts on the
BackupPC server.
Static IP addresses that are on the same local broadcast domain
, a VM of BackupPC as a supported solution does not seem like a good idea to me.
Timothy J. Massey
Sent from my iPhone
On Jul 20, 2012, at 6:03 PM, Bryan Keadle (.net) bkea...@keadle.net wrote:
Sorry for the late reply.
anything/everything in a VM is an attraction. :-)
I had hoped
ones, it is still very much a small NAS box.
BackupPC really wants to be set up on a standalone PC with directly attached
disks.
Timothy J. Massey
Sent from my iPhone
On Jul 13, 2012, at 12:39 PM, Bryan Keadle (.net) bkea...@keadle.net wrote:
Thanks for your reply. Yeah, we're using a NAS
and unique from the
parts used for your production data. Otherwise, it can't act as a backup for
those parts.
A self-contained box that includes the processing and storage is by far the
simplest way to achieve this.
Timothy J. Massey
Sent from my iPhone
On Jul 13, 2012, at 1:05 PM, Mike ispbuil
Michael Stowe mst...@chicago.us.mensa.org wrote on 07/10/2012 11:14:34 AM:
How are you backing the junction points up? AFAIK backuppc treats
those as actual directories and not junction points (i.e. the concept
of a junction point doesn't exist in backuppc's universe like a
symbolic link
Depends on what you mean by secure. If you mean "is the connection encrypted?", then no. You want to use rsync over SSH, which is perfectly possible on Windows as well.
Timothy J. Massey
Sent from my iPhone
On Jul 4, 2012, at 8:34 AM, galemberti greg galembe...@hotmail.com wrote:
Hi,
I have