On Jan 31, 2012, at 5:10 PM, Les Mikesell wrote:

> On Tue, Jan 31, 2012 at 5:25 PM, Kimball Larsen <quang...@gmail.com> wrote:
>> We are a small office (6 employees) with a mixture of Windows and Mac
>> machines sitting on desks. I have set up a server (Ubuntu Linux) that has
>> been happily running BackupPC for several years, handling backups for all
>> the machines in the office with grace AND style. We love it.
>>
>> However, in the last few months some of the users have noticed that when
>> BackupPC is running a backup (incremental or full - it does not seem to
>> matter which) it can have a serious impact on the performance of their
>> local machine. Everything slows to a crawl and they are nearly unable to
>> work, because simple things like switching from one application to another
>> start to take several seconds. The machine behaves as if it is hammering
>> swap space and thrashing for memory. At least one user reports this goes
>> on for several hours (and I confirmed that his latest incremental took
>> 119 minutes to complete).
>>
>> All the machines affected in this way are wired to the gigabit network
>> (not wireless), and I'm using rsync as the transfer method. The users with
>> the complaints are all using OS X on late-model, high-end MacBook Pro
>> laptops.
>>
>> Is there anything I can do to have the backups run in a more transparent
>> manner? We are not all that concerned with the speed of the backup
>> process - we're all here all day anyway, so as long as everyone gets a
>> backup at least once a day we're happy.
>
> Are you using any 'scan on access' type of virus protection? That
> would be odd for a Mac, but I think there are such things.

Nope, no realtime scanning stuff at all.

> Do any have local time machine backups that might be included?
No, Time Machine backs up to external drives, which are specifically
excluded from the backups.

> Or directories with very large numbers of files?

This I can check on. What is considered a "very large" number of files?
More than 1024? More than 102400?

> I think the rsync at each end will keep a copy of the whole directory
> tree in memory while both ends walk and compare contents. Normally this
> would be very fast on incrementals, where it doesn't do more than the
> directory check for files that match, but the list might be big enough
> to swap to disk.

Hmm. Does it produce a copy of the whole directory tree for each backup
location? If so, would it be beneficial to split up the backups so that
instead of telling it to back up /Users/myusername/ I explicitly list each
of the directories in my home:

/Users/myusername/Documents/
/Users/myusername/Library
...etc?

Thanks!

-- Kimball

_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
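To answer the "very large numbers of files" question empirically, one quick check is to count files per top-level folder under each backup root before restructuring anything. A rough sketch (the home-directory path is illustrative; point it at the actual roots being backed up):

```shell
# List the top-level folders under a home directory, sorted by how many
# files each contains -- the biggest trees are the likeliest rsync pain
# points. /Users/myusername is a placeholder path.
for d in /Users/myusername/*/; do
    printf '%10d  %s\n' "$(find "$d" -type f 2>/dev/null | wc -l)" "$d"
done | sort -rn | head
```

If one folder turns out to hold hundreds of thousands of files, splitting the backup into several smaller shares (for example, listing individual subdirectories in the host's share configuration, such as BackupPC's $Conf{RsyncShareName} array) should mean each rsync run walks, and holds in memory, a much smaller file list.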