Re: [BackupPC-users] NAS / SAN and other storage devices

2009-03-18 Thread Jack Coats
If you are using a SAN or NAS, it should look like 'just another disk'
to your BackupPC server.
So, yes, you can use them.

The problem I have found is needing the BackupPC backup storage to
appear as a single file system.
I suppose you could use software striping across just about any kind of
disk (real disks, external RAID, SAN- or NAS-based drives), but it would
be up to you to ensure redundancy in case one of the
components failed (or just became temporarily unavailable).
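
For what it's worth, a minimal sketch of the LVM approach (device names and
the mount point are assumptions; note this gives a plain concatenated volume
with no redundancy, which is exactly the caveat above):

pvcreate /dev/sdb1 /dev/sdc1                   # a local disk plus a SAN/NAS-backed block device
vgcreate backuppc_vg /dev/sdb1 /dev/sdc1
lvcreate -l 100%FREE -n pool backuppc_vg       # add -i 2 to stripe instead of concatenate
mkfs.ext3 /dev/backuppc_vg/pool
mount /dev/backuppc_vg/pool /var/lib/backuppc  # assumed TopDir location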

IHS ... Jack



On Wed, Mar 18, 2009 at 7:37 AM, yodo64
backuppc-fo...@backupcentral.com wrote:

 Hi all

 I am wondering if it is possible to store the backup data on non-resident
 disk devices with BackupPC?
 Can I have a NAS or a disk on the network which is not inside the BackupPC server?
 If yes, where can I find information about those possibilities?

 If not, what do I do when all the internal disks are full? Do
 I have to add a second BackupPC server?

 Thanks for your help



Re: [BackupPC-users] Server reboots nightly

2009-02-06 Thread Jack Coats
Check any log files you find in /var/log too!  messages is a good one to
start with, but depending on the exact configuration you might find other
useful log files there as well.
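
A few quick things to look at (paths assume a typical syslog setup; adjust
to taste):

grep -iE 'terminating|restart|shutdown' /var/log/messages | tail -n 20
dmesg | head -n 40                  # kernel messages from the current boot
ls -l /var/log/                     # any other logs worth a peek
cat /etc/crontab ; ls /etc/cron.*/  # jobs that might fire around the reboot time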

Chris Robertson wrote:
 Chris Baker wrote:
   
  Which logs should I check?

 /var/log/messages

   
  And what should I look for in these logs?

 Run the command dmesg (first, man dmesg so you know what this command 
 does) and take a look at the output.  This is the bootup message and 
 should be replicated in /var/log/messages.  Take a look at the entries 
 above for clues as to why your server is restarting.  The last log entry 
 before the dmesg output should be something along the lines of "Kernel 
 log daemon terminating." and should be preceded by evidence of exiting 
 processes.  If not, you probably have power or hardware problems.  If 
 you do see evidence of a graceful restart, at least you know about when 
 it's happening.  Start looking through /etc/cron.* and /etc/crontab for 
 jobs that run around that time.

   
  I do know where the logs are on the system.

 Chris



Re: [BackupPC-users] Junctions on WinXP

2008-11-26 Thread Jack Coats
It has the same issue with UNIX mount points; such is the nature of
rsync.  I have managed to avoid Vista so far, but I guess I may need to
build a system sometime.

Jeffrey J. Kosowsky wrote:
 Does BackupPC know how to treat NTFS junction points?
 They are analogous to *nix symbolic links but only work on
 directories.

 Based on a little test, it seems like BackupPC does not know about
 them since it seems to have copied over all the data -- i.e. it
 treated the junction as a real directory rather than as a
 symbolic-type link. The attribute file also seems to treat it like a
 normal directory. Now to be fair, rsync does the same, so some of this
 may be an rsync artifact.

 Also, my understanding is that Vista introduced a notion of Unix-like
 symbolic links. Does anyone know whether they are treated properly by
 BackupPC? 
 (I don't have Vista around to test but it would even be interesting to
 know how rsync treats them).

 Thanks



Re: [BackupPC-users] BackupPC Working Well - Except Archiving

2008-11-17 Thread Jack Coats
Yes, the Ubuntu install is different from the 'source' install, so the
documentation doesn't fit exactly.
After trying the Ubuntu package, I had to remove it completely and install
from the SourceForge files (not the .deb either).
The install went nicely; even though it isn't the 'Ubuntu way', it
worked well and the documentation was completely in sync.
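
For anyone else who hits this: once the binaries are findable, an archive can
be kicked off by hand.  A rough sketch only: the paths, the archive host name
'archive', and the client name 'host1' below are assumptions, so adjust them
to your install (the Ubuntu package puts the binaries in
/usr/share/backuppc/bin, the source install in /usr/local/BackupPC/bin):

# start an archive of host1 onto the archive host, as the backuppc user
sudo -u backuppc /usr/local/BackupPC/bin/BackupPC_archiveStart archive backuppc host1
# then check the archive host's log to confirm it actually ran
less /var/lib/backuppc/pc/archive/LOG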

[EMAIL PROTECTED] wrote:
 InstallDir on my system = /usr/share/backuppc
 In /usr/share/backuppc/bin are various files including:
 BackupPC_archive  (Note not BackupPC_archiveStart)
 BackupPC_archiveHost

 The on-line docs at Sourceforge do not correspond with my Ubuntu 8.04-Server 
 LTS distribution.

 At the command line, using the full path, both can be started.  From the CGI 
 it appears that BackupPC_archiveHost is started with the following command:
 $Installdir/bin/BackupPC_archiveHost $tarCreatePath $splitpath $parpath $host 
 $backupnumber $compression $compext $splitsize $archiveloc $parfile *

 Still, it doesn't explain why it fails to run when scheduled or fails to 
 create a log entry, but works when started manually using the button in the 
 CGI and creates the log entries.

 SamK

   
 From: Nils Breunese (Lemonbit) [EMAIL PROTECTED]
 Date: 2008/11/17 Mon AM 09:45:41 GMT
 To: General list for user discussion,
  questions and support backuppc-users@lists.sourceforge.net
 Subject: Re: [BackupPC-users] BackupPC Working Well - Except Archiving

 SamK wrote:

 
 The on-line documentation at Sourceforge mentions creating an  
 archive at the command line using BackupPC_archiveStart.  This  
 method fails as the system cannot find BackupPC_archiveStart.
   
 The BackupPC_archiveStart binary is probably not in your path. Try
 calling it using the full path, e.g.
 /usr/local/BackupPC/bin/BackupPC_archiveStart on our systems.



Re: [BackupPC-users] Different TopDir for different clients.

2008-11-15 Thread Jack Coats
I don't know what happened to it, but at one time there was development
being done on a 'distributed file system', where the data was 'raided'
across many systems, so if some of the systems 'went away' the data was
still there and up to date.  And when they came back, they were
automatically put back in and 'synced'.  It was supposed to use much of
the 'unused space' on client desktops, but the people on the desktop
would have to go in through the 'front door' to see the data, not just
what was on their desk.

Oz Dror wrote:
 Thanks for responding:

 Regarding the first issue.

 I have limited backup space on my server. On the other hand I have disk
 space on some clients that is wasted.
 Thus I was hoping to back up one client to another client's disk, rather
 than to my main storage.

 Regarding the second issue: the client is a Windows client. My
 understanding is that when you use rsyncd on Windows you can either
 exclude dirs or include dirs, not both. I only need to back up a couple
 of user accounts on that client. I was hoping to do it with two backup
 schedules, because of the limitation above.
 These accounts have different file exclusions.

 -Oz

 Martin Leben wrote:
   
 Hi Oz,

 Read on...

 Oz Dror wrote:
   
 
 I am sure that it was asked before, but I was not able to find a
 satisfying answer on the net.

 1. How can I have different TopDir assigned to different client computers.
 
   
 No, you can't. Tell us more about what problem you are trying to solve 
 instead.


   
 
 2. How can I have different backup schedules for the same PC client.
 
   
 Create an extra host in backuppc that points to the same client machine. But 
 why 
 would you want to do that? Just as with the question above, tell us more 
 about 
 the actual problem you are trying to solve.

 I am not really an expert on BackupPC, so if I misunderstood anything please
 correct me. And Oz, don't hesitate to follow up if you have more questions;
 I hope I am not jumping the gun.

 BR
 /Martin Leben




Re: [BackupPC-users] Different TopDir for different clients.

2008-11-14 Thread Jack Coats
Oz,

Try multiple BackupPC server instances!  Even one 'big backup server' and
several 'virtual servers' to handle special needs.

For most clients, I agree, multiple schedules are not needed, but I too
have run into that kind of issue (a directory had lots of changes, so we
needed to back it up hourly, even though the data was never there more
than 24 hours before it was merged into a big database that was backed
up daily).  I had a different 'client name' and a different 'backup
schedule' for that client alone.
This was NOT using BackupPC (it was IBM's TSM).  It might be possible to
do that in BackupPC, but we need someone who knows more about BackupPC
than I do to answer that.
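
That said, a sketch of what I think the extra-host approach would look like
(untested; 'pcname-hourly', 'pcname' and the paths are made-up examples):

cat > /etc/backuppc/pcname-hourly.pl <<'EOF'
$Conf{ClientNameAlias} = 'pcname';            # resolve to the real machine
$Conf{IncrPeriod}      = 0.04;                # roughly hourly incrementals
$Conf{FullPeriod}      = 6.97;                # weekly fulls
$Conf{BackupFilesOnly} = ['/busy/directory']; # only the fast-changing data
EOF
# ...then add 'pcname-hourly' to the hosts file and make sure WakeupSchedule
# wakes the server often enough to honor the hourly period.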

But I agree, please give us a bit more detail of the problem / issue you
are trying to address, then we can brainstorm together!

... Jack





[BackupPC-users] Removing a client backups?

2007-10-15 Thread Jack Coats
What is the right way to remove a client computer and its backups?

Is there a clean way without letting them 'age off'?




Re: [BackupPC-users] wiki and forums

2007-10-11 Thread Jack Coats
I think only registered users can edit anyway, if I remember right.
So there is some 'social protection' against vandalism.

Yes, keeping it all in one place would be great!

On Thu, 2007-10-11 at 13:23 -0700, Craig Barratt wrote:
 Nils writes:
 
   http://www.wiki.sourceforge.net/
  
   (i'm not quite sure how membership/authorization work.)
  
  I think that might be the option for continuity and everything will
  be kept in the backuppc.sourceforge.net space. I think Craig may have
  to 'enable' this somehow?
 
 I like the idea of everything still being hosted by SF.  I just
 turned it on.  It's in public mode - anyone can view and edit.
 
 If there is vandalism then I'll switch it to protected mode - that
 allows viewing by anyone and I'll give users permission to edit.
 
 Craig
 


Re: [BackupPC-users] BackupPC: Antivirus suggestion.

2007-10-09 Thread Jack Coats
I think it would be great to have ClamAV or AVG scanning as an OPTION on
files coming in from clients.  If anything is found, it should be
quarantined (and the admin alerted at the end of the backup period, once
a day), keeping the information the AV software normally records plus the
client information.

We might also want this to be on a client-by-client basis (I don't
usually want to scan my servers, but I normally do want to scan material
coming from desktops).
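
One rough way to approximate this today, without touching BackupPC itself,
might be to stream a finished backup back out as a tar and hand it to ClamAV.
A sketch only (untested; 'host1' and the share name '/' are assumptions, the
paths are the Debian/Ubuntu ones, and whether clamscan looks inside the tar
stream depends on its archive-scanning support):

sudo -u backuppc /usr/share/backuppc/bin/BackupPC_tarCreate \
    -h host1 -n -1 -s / . | clamscan --stdout -
# -n -1 means "the most recent backup"; something like this could be wired
# into DumpPostUserCmd or a nightly cron job if it proves worth the CPU.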

On Tue, 2007-10-09 at 13:31 +0100, Simon Avery wrote:
 Hi.
 
 Debian Etch and 2.1.2.6
 
 I'm not convinced this is a sensible thing to ask, but worth a try...
 
 I'd like to scan for viruses on our company LAN periodically, in 
 addition to a resident guard program. This has historically found bad 
 things in caches before they've been run.
 
 I have been using AVG on each Windows client but the free version 
 doesn't allow useful scheduling, only once per day every day, and as
 it's very intrusive it causes much unrest among my users, who suddenly find
 their computers crawling.
 
 So - would it be possible to scan the backups as they are made using 
 BackupPC? Possibly during the link phase after a successful backup.
 
 I know I could run clam(d)scan on the backup directories but this would 
 cause huge duplication and would take twice forever so is there scope 
 for a clever addition to BackupPC which calls a program for each new 
 file that's added to the pool, such as clam(d)scan?
 
 If it returns an error code, then that would be emailed to the admin 
 along with the filename, path and computer it came from so I could run a 
 full scan.
 
 Benefits: (To me at least!)
 Clients don't get any additional disk access or cpu usage beyond a 
 normal backup.
 No virus re-scanning of files that haven't changed.
 No increase of backup duration if done afterward during link.
 
 Drawbacks:
 Increase of cpu usage on server.
 
 
 Is this a good idea, has it been done already, is it beyond the scope of 
 BackupPC's remit or am I barking?
 
 
 


Re: [BackupPC-users] New Hardware [was: Troubleshooting a slow backup]

2007-10-03 Thread Jack Coats
From when I used to be pre-sales and tech support for a VAR that sold
different backup solutions:

Run your numbers before you decide on hardware.

It is all about bandwidth.

There is bandwidth you need to consider everywhere.
 1. client read speed during the backup session (what else is going on)
 2. client network communication speed under that CPU load
 3. network issues ... over the internet, in a simple example, you often go:

client -> local switch -> router -> [internet cloud] -> router ->
 switch -> backup host

and there can be more layers in there.  At one bank I worked for that
was EXACTLY what we saw, only the 'internet cloud' was a WAN, with a DS3
at the central site but a T1 at each client end (37 remote client
sites, some with multiple servers).  Over a T1 at one client site
(1.544 Mbit/s) the effective throughput was considerably less after you
take off protocols, encryption, etc. (something like 0.6 of the total
bandwidth was all we could really transport data on).
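
To make "run your numbers" concrete, a back-of-the-envelope sketch (all the
figures below are made up; plug in your own):

awk 'BEGIN {
  data_gb   = 40        # changed data to move per backup window
  link_mbps = 1.544     # raw T1 line rate
  eff       = 0.6       # usable fraction after protocol/encryption overhead
  hours = (data_gb * 8 * 1024) / (link_mbps * eff) / 3600
  printf "estimated transfer time: %.1f hours\n", hours
}'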

On your server, again: network NIC bandwidth.
How much are you expecting to receive at one time?
Do you have enough CPU to drive the NIC flat out and do other tasks?
CPU, memory, and backplane bandwidth to move the data (if you can do
DMA and not have the data go through the CPU, that is a good thing).
Do you have enough memory on your backup server (if you swap or page at
all, you do not have enough)?
Are your disk controllers / RAID controllers / drives fast enough to
deal with it?

RAID other than RAID 1 has CPU costs to calculate parity or do
comparisons.  Good RAID controllers present already-RAIDed drives to the
processor, so the CPU does not know or care that there is RAID back there.
But the RAID controller's CPU/memory/I/O speed must be fast enough to keep
up.

On all these rates, you will also find 'burst rates' versus 'sustained
rates'.  Burst rates are usually a lot faster and are often quoted in
the sales literature.  Snag what information you can from 'white papers'
and from 'product comparisons' in trade rags/journals.  Even Maximum
PC does interesting non-server RAID hardware comparisons sometimes.

This is the BackupPC list, so I won't bore you with the same diatribe
about multi-tier disks and tapes.  In short, the scenario is the same.

If you would like to send me some information offlist, I would be glad
to help you work through some of it.  I am not up on the latest
hardware, but if you have a Dell VAR you use, have THEIR pre-sales SE
(systems engineer) do it and get you the gory details.  Tell them you
need to make it 'run very well' and show you at least 3 different
options.  If they are a good VAR, they will be happy to do it if they
think you are really interested.  Talk to their pre-sales SE first, kind
of an interview to see if you believe that 'he knows from where he
speaks'.

I hope this helps. ... Jack
 





[BackupPC-users] Migration? 2.x to 3.x

2007-10-03 Thread Jack Coats
I have a 2.x install I am using that came in via apt to our ubuntu
backup server.  I would like to go to 3.x.

Is there some migration information I have missed?  Or does it need to
be a 'wipe and start over' situation?

TIA, Jack




Re: [BackupPC-users] multiple IP for the same host

2007-10-01 Thread Jack Coats
If you have your own DHCP server (versus an appliance), you could
configure it to map both of the laptop's MAC addresses to a single IP
address.  Just make sure NOT to use BOTH interfaces at the same time on
your laptop (unless you figure out how to do the old 'shotgun' or bonding
kind of network setup to give you the bandwidth of both ... ugh).
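
A sketch of what that looks like with ISC dhcpd (the MAC addresses are made
up and the config path varies by distro; the point is that both interfaces
get the same fixed address):

cat >> /etc/dhcp3/dhcpd.conf <<'EOF'
host laptop-wired { hardware ethernet 00:11:22:33:44:55; fixed-address 192.168.15.5; }
host laptop-wifi  { hardware ethernet 66:77:88:99:aa:bb; fixed-address 192.168.15.5; }
EOF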


On Sat, 2007-09-29 at 17:39 +0200, Sébastien Barthélemy wrote:
 On Friday 28 September 2007 at 20:23 -0400, Rod Dickerson wrote:
  One possible solution is just to hard code your machines’ IP
  addresses. You only have 4, so set them all with static ip addresses
  and then use /etc/hosts. 
  Set the wireless and wired IP addresses to be the same; you only use
  one at a time, right? 
 
 Nothing guarantees that! Normally this is the case, but...
 
  I would hate to see you use Samba just for some sort of name
  resolution just for 4 machines. 
 
 I agree, I'm afraid of samba and have no time to read the huge full doc.
 
  Yes, it is a little inconvenient when traveling with the laptop. My
  Mac has profiles for different network locations, not sure if Windows
  has that or not. 
 
 There is no windows laptop involved here
 
  Otherwise you will just have to remember to change it back to DHCP
  when mobile.
  
  Another possible solution is to set up static arp entries on the
  backuppc server, and then still use /etc/hosts.
 
 Ok, you suggest two ways of giving the same IP to the two network
 interfaces of the laptop. Can't we give them different IPs but the same
 hostname? (Jack: if yes, how?)

 I also heard of avahi, zeroconf and Bonjour as systems useful for
 discovering services on a network. I'm not sure exactly what this means, but
 it is used by Linux and Mac. Maybe this is the solution? (Jack: if
 yes, how?)
 
 Thanks a lot again
 
 cheers
 
 
 


Re: [BackupPC-users] multiple IP for the same host

2007-09-28 Thread Jack Coats
You are on the right track.  I would suggest enabling Samba network
sharing, but first go in and edit your smb.conf (/etc/samba/smb.conf on
my machine) to disable all of the open network shares (share only
what you want, if anything).  The important thing is that Samba is
running so the machine can be detected by your BackupPC server.

Other folks, please speak up in case I am spouting information 'from
where I do not know' :)

On Fri, 2007-09-28 at 19:15 +0200, Sébastien Barthélemy wrote:
 On Friday 28 September 2007 at 09:55 -0500, Jack Coats wrote:
  yes, expounded upon :) --- my wife gets on my case for answering the
  question she asked rather than what she meant!
  
  the facility to detect a machine by name rather than just IP works well.
  I would just try it first.  Don't put the laptop in your /etc/hosts
  files, and see it you can get to it from the server using nmblookup. (I
  forget the full syntax right now, sorry)
 
 Thank you for your suggestion,
 
 nmblookup does work for a Windows computer, but not for my Ubuntu laptop
 (nor for a Mac OS X laptop I tried a few months ago).

 Is there a simple and secure way to enable such replies? (If yes, what is it?)
 Will it work on a Mac?

 (I'm rather suspicious on that topic: last time I enabled Samba sharing on
 Ubuntu, people were able to see all the computer user names.)
 
 
 cheers
 
 
 




Re: [BackupPC-users] multiple IP for the same host

2007-09-28 Thread Jack Coats
yes, expounded upon :) --- my wife gets on my case for answering the
question she asked rather than what she meant!

The facility to detect a machine by name rather than just IP works well.
I would just try it first.  Don't put the laptop in your /etc/hosts
files, and see if you can get to it from the server using nmblookup. (I
forget the full syntax right now, sorry.)
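
A minimal sketch of the usual invocations ('laptop' and the address are
assumed values; check the man page for the details I'm forgetting):

nmblookup laptop              # query by NetBIOS name from the server
nmblookup -A 192.168.15.5     # node-status lookup against a known address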

On Fri, 2007-09-28 at 11:30 +0200, Sébastien Barthélemy wrote:
 On Thursday 27 September 2007 at 09:54 -0500, Jack Coats wrote:
 On Thu, 2007-09-27 at 16:41 +0200, Sébastien Barthélemy wrote:
   Hello everybody.
   
   I use backuppc at home to backup 4 computers, including a laptop. All of
   them are on a private network behind a wireless router (provided by my
   ISP). The laptop can be connected by wire (with IP 192.168.15.5) or by
   wifi (with IP 192.168.15.6).
   
   I would like backuppc to backup the laptop when it is on the network,
   whatever it is by wire or wifi.
   
   Is it possible ?
   
   Let's also say that I have no DNS on the private network, I resolve
   names through /etc/hosts
 
  The short answer is, yes.
 
 ok. Thanks Jack !
 
 The next (short) question is how ?
 
 cheers
 
 Sebastien
 
 
 


Re: [BackupPC-users] multiple IP for the same host

2007-09-27 Thread Jack Coats
The short answer is, yes.

On Thu, 2007-09-27 at 16:41 +0200, Sébastien Barthélemy wrote:
 Hello everybody.
 
 I use backuppc at home to backup 4 computers, including a laptop. All of
 them are on a private network behind a wireless router (provided by my
 ISP). The laptop can be connected by wire (with IP 192.168.15.5) or by
 wifi (with IP 192.168.15.6).
 
 I would like backuppc to backup the laptop when it is on the network,
 whatever it is by wire or wifi.
 
 Is it possible ?
 
 Let's also say that I have no DNS on the private network, I resolve
 names through /etc/hosts
 
 Thanks a lot for any help
 
 Cheers
 




[BackupPC-users] XP Backup of system state

2007-09-26 Thread Jack Coats
Does someone have a 'system state' backup procedure for a Windows XP
system (I need to do it for a Win2003 server also, but one issue at a
time)?

I used to do IBM TSM backups, and TSM generated a directory in the root
of the Windows system drive; it put all the needed system state backup
data there and then backed that up as flat files.  It is what made doing
a bare metal restore possible.




Re: [BackupPC-users] restricting cgi users restore to their own files, or how to handle many users.

2007-09-26 Thread Jack Coats
I have only ever seen one backup package that allowed restores that
fine-grained: SyBack by SyncSort, Inc., on the IBM VM operating system
for CMS users.

Any user could restore any file they owned or could read to wherever
they had write permission.

That would be a 'killer' enhancement to BackupPC!

On Wed, 2007-09-26 at 04:31 -0700, Craig Barratt wrote:
 Ronny writes:
 
  I am taking backup of a directory /home, containing ~1000 users.
  And i want to allow each of the users access to restore his own files.
  But NOT to read/restore files that he normaly would not.
  
  Example: user1 have a file in /home/user1/private.txt that have 600
  permissions. I dont want user2 to be able to read this thru the backuppc
  cgi.
  
   I have tested this with a line in hosts that says
   server  0   root    user1,user2
  
  and it seams to me that user2 can read all files of the backup, even
  files he normaly would have no access to.
  
  So how others solve this problem ?
   must you have 1000 lines in hosts, one line for each homedir?  Or is
   there a different way where I can have backuppc check the original
   permissions and deny the restore if the user in question doesn't have
   the right access?
 
 BackupPC doesn't provide a mechanism to have fine-grained
 per-user permissions when browsing backups.  The host file
 users have permissions for the entire host: browsing, editing
 the configuration, starting and canceling backups, etc.
 
 Enforcing permissions is a bit difficult since apache doesn't
 provide the uid and gid - just the username - and the backups
 just contain the client uid/gid.  There is no guarantee that
 user names and uid/gids are common between the server and
 client.
 
 Perhaps we could have a new config variable which forces the
 browse path for non-admin users, eg:
 
  $Conf{CgiUserBrowseChroot} = {
      'user1' => '/home:/user1',
      'user2' => '/home:/user2',
  };
 
 (/home is the share, and /user1 is the path relative to
 that share)
 
 There could also be a wildcard form that allows any user to
 browse their folder:
 
  $Conf{CgiUserBrowseChroot} = {
      '*' => '/home:/*',
  };
 
 One drawback is this host won't appear in the pulldown in
 the navigation bar, since that is based on the hosts file.
 So the user has to navigate to their host by knowing the
 correct URL.
 
 Craig
 


Re: [BackupPC-users] backup Linux/Mac clients that are DHCP

2007-09-20 Thread Jack Coats
The company I work for sells Asterisk-based machines, and DHCP is a big
deal for VoIP phones.

We find that it helps sometimes to TURN OFF the Linksys DHCP and let our
server do it.  You might try that with your backup server.  The Linksys
can still be the default gateway to the 'world', but you can manage the
DHCP yourself and make it 'work right' for you!

Another thought: set the DHCP lease time to be VERY LONG, so you don't
hand out new IPs except to really NEW machines!  Think of how long a
machine is normally off your network, and double it.  That is a good
starting point!
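
In ISC dhcpd terms that is just something like this (lease values are
assumptions, and the config path varies by distro):

cat >> /etc/dhcp3/dhcpd.conf <<'EOF'
default-lease-time 1209600;   # 14 days
max-lease-time     2419200;   # 28 days
EOF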

On Thu, 2007-09-20 at 11:54 -0700, Jon Saints wrote:
 My only aversion to assigning fixed IPs via dhcp is that my clients generally 
 use cheap linksys routers that do not all support this feature.
 
 But i agree, it does sound like the way to go.
 Thanks
 Jon
 
 - Original Message 
 From: Carl Wilhelm Soderstrom [EMAIL PROTECTED]
 To: backuppc-users@lists.sourceforge.net
 Sent: Thursday, September 20, 2007 11:00:54 AM
 Subject: Re: [BackupPC-users] backup Linux/Mac clients that are DHCP
 
 On 09/20 08:37 , Jon Saints wrote:
  The problem occurs when the DHCP server changes the IP address of one of
  the clients. Once this happens, the security key exchange that i did
  originally with the client is no longer valid. As a result, subsequent
  backups fail.
  
  What is the best way around this? I prefer to keep my clients using DHCP.
  Would using a local Certificate Authority solve this problem? or is my
  best option to have the DHCP server assign fixed IPs to the clients that
  need to be backed up (i would prefer to avoid this if possible)?
 
 I find it simplifies administration to statically assign IP addresses to
 hosts, via the DHCP server. This way you can also set up DNS and be able to
 find hosts via DNS names, which makes it much easier when sharing the load
 with other administrators.
 
 Is there a reason you are opposed to static entries for each host?
 
 If you're willing to learn its idiosyncrasies, Dynamic DNS is an option as
 well. Be warned that it's a bit fragile though.
 




[BackupPC-users] Newbie issue

2007-09-17 Thread Jack Coats
Ok, I assume this is a dumb user problem. ... I am testing getting
backuppc to work with this Linux client, but I cannot get it to work for
me.

I do have Dirvish working with the same client.

My log says:


2007-09-17 14:51:09 vcch164 is dhcp 192.168.98.164, user is vcch164

2007-09-17 14:51:09 full backup started for directory /
2007-09-17 14:51:13 Got fatal error during xfer (fileListReceive failed)
2007-09-17 14:51:18 Backup aborted (fileListReceive failed)



client config:


Contents of file /etc/backuppc/vcch164.pl, modified 2007-08-23 12:55:59 



$Conf{XferMethod} = 'rsync';
$Conf{RsyncClientPath} = '/usr/local/bin/rsync-new';
#Conf{RsyncClientCmd} = '$sshPath -q -x -l vcchtech $host $rsyncPath $argList+';
$Conf{RsyncClientCmd} = '$sshPath-x [EMAIL PROTECTED] $rsyncPath $argList+';
#$Conf{RsyncClientRestoreCmd} = '$sshPath -q -x -l vcchtech $host $rsyncPath $argList+';
$Conf{RsyncClientRestoreCmd} = '$sshPath -q -x -l [EMAIL PROTECTED] $rsyncPath $argList+';

$Conf{TarShareName} = ['/'];

#Conf{TarClientCmd} = '/usr/bin/sudo /usr/local/bin/tarCreate -v -f - -C $shareName --totals';
$Conf{TarClientCmd} = '/usr/local/bin/tarCreate -v -f - -C $shareName --totals';

$Conf{BackupFilesExclude} = ['/proc', '/dev', '/cdrom', '/media', '/floppy', '/mnt', '/var/lib/backuppc', '/backup', '/lost+found', '/tmp', '/temp'];



And the rsync-new is a script in /usr/local/bin:


#!/bin/sh
sudo /usr/bin/rsync $*


And the user vcchtech has permission to issue sudo /usr/bin/rsync
without being prompted for a password.


I really don't want to clutter up the list, but I will gladly send any config 
information that is needed for debugging.

I have been beating my head on this for quite a while, so your assistance is 
appreciated. ... Jack


Re: [BackupPC-users] Newbie issue

2007-09-17 Thread Jack Coats
Thanks Steven. ... to me it looks like it is getting the list compiled.
Here is what I see in the log:



File /var/lib/backuppc/pc/vcch164/XferLOG.bad.z (Extracting only Errors)
Contents of file /var/lib/backuppc/pc/vcch164/XferLOG.bad.z, modified
2007-09-17 14:51:18 (Extracting only Errors)


Running: /usr/bin/ssh -x [EMAIL PROTECTED] /usr/local/bin/rsync-new --server 
--sender --numeric-ids --perms --owner --group --devices --links --times 
--block-size=2048 --recursive --exclude=/proc --exclude=/dev --exclude=/cdrom 
--exclude=/media --exclude=/floppy --exclude=/mnt --exclude=/var/lib/backuppc 
--exclude=/backup --exclude=/lost+found --exclude=/tmp --exclude=/temp 
--ignore-times . /
Xfer PIDs are now 18687
Got remote protocol 29
Negotiated protocol version 26
Sent exclude: /proc
Sent exclude: /dev
Sent exclude: /cdrom
Sent exclude: /media
Sent exclude: /floppy
Sent exclude: /mnt
Sent exclude: /var/lib/backuppc
Sent exclude: /backup
Sent exclude: /lost+found
Sent exclude: /tmp
Sent exclude: /temp
fileListReceive() failed
Done: 0 files, 0 bytes
Got fatal error during xfer (fileListReceive failed)
Backup aborted (fileListReceive failed)




On Mon, 2007-09-17 at 14:15 -0600, Steven Whaley wrote:

  That error message is unfortunately pretty general.  It means that
  something hung rsync before it could gather the file list, likely before it
  even attempted to.  If you go to the host summary and view the error log
  it might give you more detail on what the problem was.  If that doesn't help,
  try running the RsyncClientCmd by hand on the CLI and see what kind of
  error message you get.
 
 Jack Coats wrote:
  Ok, I assume this is a dumb user problem. ... I am testing getting
  backuppc to work with this Linux client, but I cannot get it to work
  for me.
 
  I do have Dirvish working with the same client.
 
  My log says:
 
  2007-09-17 14:51:09 vcch164 is dhcp 192.168.98.164, user is vcch164
  2007-09-17 14:51:09 full backup started for directory /
  2007-09-17 14:51:13 Got fatal error during xfer (fileListReceive failed)
  2007-09-17 14:51:18 Backup aborted (fileListReceive failed)
 
   
 
 
  client config:
 
   
 
  Contents of file /etc/backuppc/vcch164.pl, modified 2007-08-23 12:55:59 

 
  $Conf{XferMethod} = 'rsync';
  $Conf{RsyncClientPath} = '/usr/local/bin/rsync-new';
  #Conf{RsyncClientCmd} = '$sshPath -q -x -l vcchtech $host $rsyncPath $argList+';
  $Conf{RsyncClientCmd} = '$sshPath-x [EMAIL PROTECTED] $rsyncPath $argList+';
  #$Conf{RsyncClientRestoreCmd} = '$sshPath -q -x -l vcchtech $host $rsyncPath $argList+';
  $Conf{RsyncClientRestoreCmd} = '$sshPath -q -x -l [EMAIL PROTECTED] $rsyncPath $argList+';

  $Conf{TarShareName} = ['/'];

  #Conf{TarClientCmd} = '/usr/bin/sudo /usr/local/bin/tarCreate -v -f - -C $shareName --totals';
  $Conf{TarClientCmd} = '/usr/local/bin/tarCreate -v -f - -C $shareName --totals';

  $Conf{BackupFilesExclude} = ['/proc', '/dev', '/cdrom', '/media', '/floppy', '/mnt', '/var/lib/backuppc', '/backup', '/lost+found', '/tmp', '/temp'];
   
 

 
  And the rsync-new is a script in /usr/local/bin:
 
  #!/bin/sh
  sudo /usr/bin/rsync $*
 
  And the user vcchtech has permission to issue sudo /usr/bin/rsync
  without being prompted for a password.
 
  I really don't want to clutter up the list, but I will gladly send any 
  config information that is needed for debugging.
 
  I have been beating my head on this for quite a while, so your assistance 
  is appreciated. ... Jack

  
 

Re: [BackupPC-users] Idea: disk usage graphs

2007-07-31 Thread Jack Coats
I can see a desire for small installations to want a 'one size fits
all' backup and monitoring product.  BackupPC does most of what is needed
pretty well.

I would like just a simple 'red light'/'yellow light'/'green light':
backup did not start (red), backup had problems but backed up some files
(yellow), and everything completed without incident (green). ... But
that is MY perspective.  And yes, then I would complain that I couldn't
click on the 'light' and see the log files :)

But if something like that were put in, I think it should be done as a
small 'module' with hooks to another appropriate monitoring package, so
it can be EASILY extended to whatever someone else might want, at least
from the BackupPC side.
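
For the pool-size graph idea itself, a rough sketch of the rrdtool approach
(file locations and the pool mount point are assumptions):

# one-time setup: a daily gauge of used space, kept for ten years
rrdtool create /var/lib/backuppc/pool.rrd --step 86400 \
    DS:used:GAUGE:172800:0:U RRA:AVERAGE:0.5:1:3650
# run daily from cron: record used kilobytes on the pool filesystem
used=$(df -Pk /var/lib/backuppc | awk 'NR==2 {print $3}')
rrdtool update /var/lib/backuppc/pool.rrd N:$used
# render a graph for the CGI page or a wiki
rrdtool graph /var/www/pool.png --start -1y \
    DEF:used=/var/lib/backuppc/pool.rrd:used:AVERAGE \
    LINE2:used#0000ff:"pool used (KB)"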

On Tue, July 31, 2007 8:22 am, Michael Mansour wrote:
 Hi Ludovic,

 Hi !

 I think that the following feature would be nice:
 - having a graph of the pool size, and remaining disk usage.

 It would allow us to predict future disk usage and anticipate a disk
 upgrade. This could be easily added using the rrdtool binary, but I
 may have not time to do this, so, if someone has more free time :-)

 There's plenty of tools to already do this for you. For the simplest you can
 use Webminstats as a module installed into Webmin. Install it, turn it on.
 It doesn't get harder than that.

 For the more complex you can go with Nagios or Cacti. I see no reason why
 BackupPC should go the route of graphing disk usage when there's so many
 other tools that specialise in it.

 Regards,

 Michael.

 Cheers,

 --
 Ludovic Drolez.

 http://zaurus.palmopensource.com   - The Zaurus Open Source
 Portal http://www.drolez.com  - Personal site - Linux, Zaurus
 and PalmOS stuff




-- 
Jack Coats [EMAIL PROTECTED] 615-382-4758
