Re: [BackupPC-users] Backups -- too big
Rob Poe wrote: I'm running out of disk space on my BackupPC server. We'll just say that there's a lot of unnecessary PST duplication that I cannot control (i.e. back up a 1.5 GB PST, then compress, leave the backup there, forget it's there, it gets backed up). I notice that a lot of the PCs have multiple FULL backups, and many incrementals. Is there a way to keep 1 FULL and maybe 3-4 days of INCR? Even if that means running a full every 3-4 days (as long as the current full expires quickly). This is temporary until more disk space is acquired. But for now it's necessary (pool is 98%). Thanks!

Hi! When I have a host with PST files, I create a new host just for copying the PST files. For example: you have a host pc1 where you are copying everything (here you must exclude the PST files). Now create another host in BackupPC, pc1PST, which copies only the PST files. In pc1PST you have to change the schedule configuration: set FullKeepCnt = 1 and IncrKeepCnt = 2; with this you keep only 1 full and 2 incrementals.

-- Omar Llorens Crespo Domínguez. JPL TSOLUCIO, SL o...@tsolucio.com www.tsolucio.com www.bearnas.com 902 88 69 38
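The schedule change described above, sketched as a per-host override file for the PST-only host (the file name pc1PST.pl follows BackupPC's per-host convention; the values are the ones named in the message):

```perl
# pc1PST.pl - per-host overrides for the PST-only host (sketch).
$Conf{FullKeepCnt} = 1;   # keep only one full backup
$Conf{IncrKeepCnt} = 2;   # keep two incrementals
```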
___ BackupPC-users mailing list BackupPC-users@lists.sourceforge.net List: https://lists.sourceforge.net/lists/listinfo/backuppc-users Wiki: http://backuppc.wiki.sourceforge.net Project: http://backuppc.sourceforge.net/
Re: [BackupPC-users] send emails to customers AND admin?
error403 wrote: Hi, I'm trying to find a way to send an email to the personal address of the people whose backups I'm doing. I tried to search, but the terms "email" and "message" are so general that it matches almost every post on the forum! :?

Hi, on each host you can configure the email with just EMailAdminUserName = ho...@email.com, and BackupPC sends warnings or errors for that host to this address (EMailAdminUserName is how BackupPC knows whom to send the email to).

This was sent by krunchyf...@videotron.ca via Backup Central.

-- Omar Llorens Crespo Domínguez. JPL TSOLUCIO, SL o...@tsolucio.com www.tsolucio.com www.bearnas.com 902 88 69 38
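As a per-host config fragment (the address is the placeholder from the message; substitute the customer's real address):

```perl
# Per-host override: send this host's warning/error mail to its owner.
$Conf{EMailAdminUserName} = 'ho...@email.com';
```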
Re: [BackupPC-users] Backup only new file(s)
Mirco Piccin wrote: Hi all, I have to back up a folder (using SMB). Every day (except Sunday) a procedure stores a 120 GB file in this folder. The name of the file is the day name, so in a week 6 different files are generated (about 720 GB). Every week the files are overwritten by the procedure. I'd like to back up only the newest file, not the whole folder. The problem is that I suppose I must have a full backup of the folder (720 GB), because $Conf{FullKeepCnt} must be >= 1, plus incremental backups. So, configuring $Conf{FullPeriod} = 6.97; $Conf{IncrKeepCnt} = 6; I'll have: on Sunday the full backup - 720 GB; on Monday the incremental backup - 720 GB (the full backup) plus 120 GB (the new Monday file); on Tuesday the incremental backup - 840 GB (the full backup plus incremental) plus 120 GB (the new Tuesday file); and so on, for a total of 1440 GB (double the disk space actually needed). And again, on Sunday BackupPC will move 720 GB of files, and so on. Is there a way to back up only the new file (maybe playing with $Conf{IncrLevels}), without a full? Or a way to optimize it? Thanks. Regards, M

Hi, I think it is better to change your XferMethod to rsyncd; rsyncd transfers only new or changed files. You can also set $Conf{FullKeepCnt} = 1, because you only need the last copy, and $Conf{IncrKeepCnt} = 1 or 2.

-- Omar Llorens Crespo Domínguez. JPL TSOLUCIO, SL o...@tsolucio.com www.tsolucio.com www.bearnas.com 902 88 69 38
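The suggestion above, sketched as a per-host config fragment (the rsyncd module name is purely illustrative; the keep counts are the ones proposed in the reply):

```perl
# Switch the host to rsyncd so only new/changed files move over the wire.
$Conf{XferMethod}     = 'rsyncd';
$Conf{RsyncShareName} = ['dailyfiles'];   # illustrative rsyncd module name
$Conf{FullKeepCnt}    = 1;                # only the last full is needed
$Conf{IncrKeepCnt}    = 2;
```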
Re: [BackupPC-users] Anyone using BackupPC on OpenSUSE 11.1?
Bharat Mistry wrote: any simple-to-follow HOWTOs around, or an RPM?

Hi all! You can download it with:

wget http://downloads.sourceforge.net/backuppc/BackupPC-3.1.0.tar.gz?modtime=1196037667&big_mirror=0

After downloading, uncompress the tar and run perl configure.pl; you only have to answer some questions about paths, the backuppc user, etc. Now copy the following file to /etc/init.d:

1. cp /xx/xx/xx/BackupPC-3.1.0/init.d/linux-backuppc /etc/init.d/backuppc
2. cd /etc/init.d
3. chmod +x backuppc
4. cd /etc/rc3.d
5. ln -s ../init.d/backuppc S99backuppc

PS: before installing BackupPC you need the following Perl modules on your machine, which you can install with YaST:

Compress::Zlib - To enable compression, you will need to install Compress::Zlib from http://www.cpan.org. You can run ``perldoc Compress::Zlib'' to see if this module is installed.
Archive::Zip - To support restore via Zip archives you will need to install Archive::Zip, also from http://www.cpan.org. You can run ``perldoc Archive::Zip'' to see if this module is installed.
XML::RSS - To support the RSS feature you will need to install XML::RSS, also from http://www.cpan.org. There is no need to install this module if you don't plan on using RSS. You can run ``perldoc XML::RSS'' to see if this module is installed.
File::RsyncP - To use rsync and rsyncd with BackupPC you will need to install File::RsyncP, available from http://perlrsync.sourceforge.net. Version 0.52 or later is required. You can run ``perldoc File::RsyncP'' to see if this module is installed.

Regards, Omar Llorens Crespo Domínguez JPL TSOLUCIO S.L. www.tsolucio.com http://bearnas.com 902 886 938 Spain
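The module checks above can be scripted instead of running perldoc four times; a small sketch (module list taken from the message; assumes perl is on PATH):

```shell
# Report which of the Perl modules BackupPC wants are already installed.
for m in Compress::Zlib Archive::Zip XML::RSS File::RsyncP; do
  if perl -M"$m" -e 1 2>/dev/null; then
    echo "found:   $m"
  else
    echo "missing: $m"
  fi
done
```

Each missing module can then be installed from YaST or CPAN before running configure.pl.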
Re: [BackupPC-users] help-automatic BackupPC
mohan infant wrote: Need help... Automatic incremental backup is not working. BackupPC_dump incremental is not working. Please help.

Hi, can you give more information about the problem? Any error messages?

Omar Llorens JPL TSOLUCIO S.L www.tsolucio.com http://bearnas.com 902 886 938 Spain
Re: [BackupPC-users] Adding an Audit Interface
Mike Loseke wrote: Hello all, I wanted to drop a note here about a short blog post I put up on adding an audit interface to a BackupPC install. This is basically just a limited read-only interface to allow a non-admin user to look at backup status, backup set contents, etc., for all hosts being backed up on a server. It's fairly simple, doesn't get into the code much, and is far from perfect, but it works. http://www.tummy.com/journals/entries/mike_20090305_152711 This is something that will help us shift some workload around when a large number of backup servers need auditing, which is part of a set of business requirements. Mike

Hi, thank you for this article; it's very interesting.

Omar Llorens o...@tsolucio.com JPL TSOLUCIO S.L www.tsolucio.com 902 886 938 Spain
Re: [BackupPC-users] Issue with BackupPC
Rob Poe wrote: One of my clients backs up about 15 PCs with BackupPC. One of the PCs is backing up (nightly incrementals, full weekly), but the incremental files don't show up when I click on the incremental backup in the interface. They exist on disk; they just don't show up in the interface. Anyone else seen this behaviour?

Hi, check the file permissions of your incremental copies; maybe BackupPC can't read those files because the owner of the directories was changed.

Omar Llorens JPL TSOLUCIO S.L o...@tsolucio.com www.tsolucio.com 902 886 938 Spain
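A quick way to look for such files is to list anything under the host's backup tree that the BackupPC user does not own. The path and user name in a real run are install-specific assumptions (you would point this at $TopDir/pc/&lt;host&gt; and the backuppc user); the example below runs against a scratch directory so it is safe to try:

```shell
# Sketch: list entries under a backup tree not owned by the given user.
check_owner() {
  find "$1" ! -user "$2" -print   # prints anything $2 does not own
}

# Safe demonstration on a scratch directory owned by the current user:
d=$(mktemp -d)
touch "$d/attrib" "$d/f1234"
check_owner "$d" "$(id -un)"      # prints nothing: we own everything here
rm -rf "$d"
```

On a real server, any path this prints is one the backuppc user (and therefore the CGI interface) may be unable to read.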
Re: [BackupPC-users] BackupPC possible on Amazon cloud and bare metal recovery?
Sanjay Arora wrote: Hello all, I would like to know if it is possible to implement BackupPC on the Amazon cloud or any similar virtual hosting, so that local storage/risk issues may be avoided?

Hi, I think it is not a problem to implement BackupPC on Amazon, but you should use EC2 for the virtual machine and S3 to store your data. If something happens to your virtual machine (and, I'm not sure, possibly even when restarting it), you would lose your data; with your files on S3, that won't be a problem. Also think about the transfer price: you pay per GB in and out of your virtual machine. Regards.

Omar Llorens Crespo Domínguez JPL TSOLUCIO S.L o...@tsolucuo.com 902 886 938 Spain

Also, how is it possible to store the hard-disk partition information and to restore the partitions, followed by restoring all files from BackupPC? What are the issues involved? Is some project doing it? What kind of extension would be required to do it under BackupPC? With best regards. Sanjay.
Re: [BackupPC-users] I received the error No files dumped for share
Holger Parplies wrote: Hi, Craig Barratt wrote on 2009-01-07 22:46:09 -0800 [Re: [BackupPC-users] I received the error No files dumped for share]: Omar writes:

$Conf{TarClientCmd} = ' env LC_ALL=C /usr/bin/sudo $tarPath -c -v -f - -C $shareName+' . ' --totals';
$Conf{TarClientRestoreCmd} = ' env LC_ALL=C /usr/bin/sudo $tarPath -x -p --numeric-owner --same-owner' . ' -v -f - -C $shareName+';

Both of these are wrong - they start with a space. BackupPC doesn't know what program to exec. You need something like:

$Conf{TarClientCmd} = '/usr/bin/sudo env LC_ALL=C $tarPath -c -v -f - -C $shareName+ --totals';

though I would like to point out that configuring sudo to allow execution of 'env' is effectively allowing arbitrary commands as root. You need to have something like

backuppc ALL=NOPASSWD: /usr/bin/env LC_ALL=C /usr/bin/tar -c *

in /etc/sudoers to be fairly secure (you might want to specify more or less of the command line, depending on what you want; more to limit it to a specific directory, less to allow restores with the same sudoers entry). If you use something like '/usr/bin/env LC_ALL=C /usr/bin/sudo ...' that probably won't work, depending on which environment variables your sudo propagates and which it doesn't.

Regards, Holger
Re: [BackupPC-users] I received the error No files dumped for share
Holger Parplies wrote: [the full message is quoted in the previous post: the $Conf{TarClientCmd} command must not start with a space, and sudo should be restricted with a NOPASSWD entry in /etc/sudoers]

Hi. Yes, I had to edit /etc/sudoers to add my backuppc user with NOPASSWD, and in config.pl I put what Craig told me:

$Conf{TarClientCmd} = '/usr/bin/sudo env LC_ALL=C $tarPath -c -v -f - -C $shareName+ --totals';

Now I don't have any problems. Thank you for your help!

Regards, Omar Llorens JPL TSOLUCIO, S.L www.tsolucio.com 902 886 938 Spain
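Putting the thread's resolution together as one config fragment (the command is the one Craig gave; the sudoers line is the one Holger sketched, and can be widened or narrowed to taste):

```perl
# config.pl: the command must start with the program to exec (no leading space).
$Conf{TarClientCmd} = '/usr/bin/sudo env LC_ALL=C $tarPath'
                    . ' -c -v -f - -C $shareName+ --totals';

# Matching /etc/sudoers entry (Holger's sketch; tighten as needed):
#   backuppc ALL=NOPASSWD: /usr/bin/env LC_ALL=C /usr/bin/tar -c *
```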
Re: [BackupPC-users] I received the error No files dumped for share
Hi, I have the same problem, but not with Windows XP: I am trying to back up the same server where I have installed BackupPC. My configuration is the following:

$Conf{FullKeepCnt} = [ 4, 0, 6 ];
$Conf{IncrKeepCnt} = 28;
$Conf{TarShareName} = [ '/etc' ];
$Conf{XferMethod} = 'tar';
$Conf{TarClientCmd} = ' env LC_ALL=C /usr/bin/sudo $tarPath -c -v -f - -C $shareName+' . ' --totals';
$Conf{TarClientRestoreCmd} = ' env LC_ALL=C /usr/bin/sudo $tarPath -x -p --numeric-owner --same-owner' . ' -v -f - -C $shareName+';

But when I try to do a full backup I get the following error:

Executing DumpPreUserCmd: /var/lib/raa/scripts/precopy /etc
Exec of /var/lib/raa/scripts/precopy /etc failed
Running: /usr/bin/sudo /bin/tar -c -v -f - -C /etc --totals .
full backup started for directory /etc
Xfer PIDs are now 32571,32570
Exec failed for /usr/bin/sudo /bin/tar -c -v -f - -C /etc --totals .
tarExtract: Done: 0 errors, 0 filesExist, 0 sizeExist, 0 sizeExistComp, 0 filesTotal, 0 sizeTotal
Executing DumpPostUserCmd: /var/lib/raa/scripts/postcopy /etc
Exec of /var/lib/raa/scripts/postcopy /etc failed
Got fatal error during xfer (No files dumped for share /etc)
Backup aborted (No files dumped for share /etc)
Not saving this as a partial backup since it has fewer files than the prior one (got 0 and 0 files versus 0)

Thank you for your answer. JPL TSOLUCIO S.L www.tsolucio.com 902 886 938

Craig Barratt wrote: Sean writes: "I have tried to do a full backup of a Windows XP PC. The backup is successful, although I get the error 'No files dumped for share'. What is wrong?" The backup isn't successful (since no files were dumped for one (or more) shares). Please look at the XferLOG.bad file (which should be quite short) and, if the answer isn't apparent, email the contents of the file (or at least the first few lines) to this thread. You should also explain which XferMethod you are using and the corresponding Share and Include/Exclude settings. Craig
Re: [BackupPC-users] upgrading backuppc 3.0 to 3.1
Leandro Tracchia wrote: can anyone give me or direct me to good instructions on upgrading BackupPC from 3.0 to 3.1 on Ubuntu 8.04? Has anyone done this successfully?

I upgraded BackupPC to 3.1, though not on Ubuntu. I downloaded the new BackupPC from SourceForge, ran ./configure.pl, and answered all the configuration questions again; but before that, save all the per-PC .pl config files you have. On Ubuntu with apt, I think you only have to save your pc .pl files and then upgrade the backuppc package with apt.

Omar Llorens Crespo Domínguez JPL TSOLUCIO, S.L. [EMAIL PROTECTED] www.tsolucio.com 902 886 938 Spain
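The "save your pc .pl files first" step can be rehearsed on a scratch copy. The real config location is an assumption here (Debian-family packages often keep these under /etc/backuppc; source installs keep them under the conf/ directory chosen at configure.pl time):

```shell
# Rehearse the config snapshot on a scratch directory standing in for
# the real BackupPC config dir; nothing here touches a real install.
set -eu
confdir=$(mktemp -d)                       # stands in for /etc/backuppc
printf '%s\n' '$Conf{FullKeepCnt} = 1;' > "$confdir/pc1.pl"
cp -a "$confdir" "$confdir.bak"            # snapshot before configure.pl runs
diff -r "$confdir" "$confdir.bak" && echo "configs saved"
rm -rf "$confdir" "$confdir.bak"
```

After the upgrade, diff the snapshot against the new config directory and carry the per-host .pl files back over.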
Re: [BackupPC-users] incremental backups taking longer than full
Mark Maciolek wrote: Hi, using rsync between two Linux servers, the full took 2.5 hours, but the incremental backups are taking longer each day:

2008-10-21 23:00:01 full backup started for directory /
2008-10-22 02:32:55 full backup 0 complete, 294903 files, 203401100538 bytes, 28 xferErrs (0 bad files, 0 bad shares, 28 other)
2008-10-22 23:00:00 incr backup started back to 2008-10-21 23:00:01 (backup #0) for directory /
2008-10-23 04:59:40 incr backup 1 complete, 541 files, 170443233887 bytes, 11 xferErrs (0 bad files, 0 bad shares, 11 other)
2008-10-23 23:00:00 incr backup started back to 2008-10-21 23:00:01 (backup #0) for directory /
2008-10-24 06:41:00 incr backup 2 complete, 889 files, 169752997145 bytes, 10 xferErrs (0 bad files, 0 bad shares, 10 other)
2008-10-24 23:00:00 incr backup started back to 2008-10-21 23:00:01 (backup #0) for directory /
2008-10-25 06:41:42 incr backup 3 complete, 1190 files, 170136700321 bytes, 1 xferErrs (0 bad files, 0 bad shares, 1 other)
2008-10-25 23:00:01 incr backup started back to 2008-10-21 23:00:01 (backup #0) for directory /
2008-10-26 06:28:34 incr backup 4 complete, 1491 files, 170214964434 bytes, 13 xferErrs (0 bad files, 0 bad shares, 13 other)
2008-10-26 23:00:01 incr backup started back to 2008-10-21 23:00:01 (backup #0) for directory /
2008-10-27 06:54:31 incr backup 5 complete, 1826 files, 170294261665 bytes, 2 xferErrs (0 bad files, 0 bad shares, 2 other)
2008-10-27 23:00:01 incr backup started back to 2008-10-21 23:00:01 (backup #0) for directory /
2008-10-28 07:01:38 incr backup 6 complete, 2174 files, 170711300863 bytes, 14 xferErrs (0 bad files, 0 bad shares, 14 other)
2008-10-28 23:12:21 incr backup started back to 2008-10-21 23:00:01 (backup #0) for directory /
2008-10-29 08:27:44 incr backup 7 complete, 2481 files, 171012691775 bytes, 2 xferErrs (0 bad files, 0 bad shares, 2 other)
2008-10-29 23:18:49 incr backup started back to 2008-10-21 23:00:01 (backup #0) for directory /

The command that is running:
incr backup started back to 2008-10-21 23:00:01 (backup #0) for directory /
Running: /usr/bin/ssh -q -x -l root stella /usr/bin/rsync --server --sender --numeric-ids --perms --owner --group -D --links --hard-links --times --block-size=2048 --recursive . /
Xfer PIDs are now 1783
Got remote protocol 28
Negotiated protocol version 28
Sent exclude: /tmp
Sent exclude: /net
Sent exclude: /sys
Sent exclude: /proc
Sent exclude: /dev
Xfer PIDs are now 1783,1792

Any suggestions on what to look for as to why the incremental backups take so much longer than the full? Mark

Do you have any virtual machines? A VM image may occupy 4 GB, for example, and if you modify anything inside the VM, the next incremental backup copies the whole image again. This is not only about VM images: any sufficiently big file that changes is treated as new and copied in full.

Omar Llorens Crespo Domínguez JPL TSOLUCIO, S.L Informatic solutions www.tsolucio.com 902 886 938 Spain
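One way to act on that advice is to exclude big, frequently-churning images from the regular backup run; a hedged config fragment (the paths and patterns below are purely illustrative, not from the thread):

```perl
# Skip large, frequently-changing VM images so incrementals stay small;
# back those up separately if needed. Paths/patterns are examples only.
$Conf{BackupFilesExclude} = {
    '/' => [ '/var/lib/libvirt/images', '*.vmdk', '*.qcow2' ],
};
```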
Re: [BackupPC-users] Incremental dumps hanging with 'Can't get rsync digests' 'Can't call method isCached'
If anyone is interested: at my company we have a new solution for NAS, SAN, backups, etc. We make a new product, bearNAS; it's a pendrive that contains BackupPC and Openfiler, all integrated. You only need a simple machine with as many hard disks as you want and a USB port. This solution is cheap and easy to work with, and it lets you do a lot of things: clusters, bonding, LVM, iSCSI. You can speak with me, [EMAIL PROTECTED], or my boss, Joe Bordes, [EMAIL PROTECTED]. We want to collaborate with this community and with others, so don't hesitate to ask us. JPL TSOLUCIO S.L www.tsolucio.com 902 886 938 [EMAIL PROTECTED] Spain

Jeffrey J. Kosowsky wrote: Rob Owens wrote at about 11:57:01 -0400 on Monday, October 27, 2008: Jeffrey J. Kosowsky wrote: What is the alternative if you don't have room on your server and if you can't afford something fancier than a SAN? For me, using NAS is very economical given the cost of drives and the existence of cheap embedded Linux NAS devices. Maybe I am missing an easy better alternative. I'm not sure what a NAS costs these days, but my BackupPC server is a white-box desktop-class machine with SATA drives in software RAID 1. It cost me $600. It runs the server software and stores the backups locally. You can keep it cheap by using a mini/micro ATX motherboard -- they've usually got onboard video and onboard LAN, and you're not likely to need much in the way of PCI slots. Just make sure the motherboard has plenty of room for expansion in terms of RAM and hard disks. Well, I bought a DNS-323 for about $130 and got two 1 TB Seagate drives for $149 each, so under $450 for 1 TB of RAID-1 backup. It uses only a few watts of power (and even less when the disks power down, since I mount root off a small surplus USB stick). Also, the DNS-323 is extremely well built (metal, solid, not plastic) and small - not much bigger than the two drives themselves side by side. But I agree that one could do well also by going the white-box way...
[BackupPC-users] How to find the physical files to remove?
Hi, I have a PC to make backups with BackupPC, but the problem is that the hard disk is full and BackupPC can't do more backups. I was checking the rsyncd configuration on my clients and I saw that BackupPC was copying files it shouldn't (mp3, iso, ...). When I saw this, I excluded these files in winxp.pl. Now I need to remove these files (mp3, iso, ...) from the existing backups. I used BackupPC_deleteBackup.sh, but I can't remove only these files; it removes a whole numbered backup, and if I do ./BackupPC_deleteBackup.sh -c pcWinxp -d 411 -f, I don't get back the free space I should have. Is there any way to remove only the files I want - not only the links, but the physical files too? Thanks for your help, and sorry for my English. JPL TSOLUCIO S.L Omar Ll. Crespo Domínguez [EMAIL PROTECTED] www.tsolucio.com 902 886 938 Spain
Re: [BackupPC-users] How to find the physical files to remove?
Thank you for your help. I used rm, not BackupPC_deleteBackup.sh, and ran BackupPC_nightly afterwards. I have free space :) JPL TSOLUCIO S.L Omar Ll. Crespo Domínguez [EMAIL PROTECTED] www.tsolucio.com 902 886 938 Spain

Adam Goryachev wrote: Omar Llorens Crespo Domínguez wrote: [the question quoted in the previous message] After you remove data from the backuppc/pc/hostname/num/ directory, you also need to run BackupPC_nightly, which will remove the files from the pool/cpool directory as needed. You can browse the directories in the backup folders and delete files from the backup, or do something like:

cd backuppc/pc/host
find . -iname '*.iso' -exec rm -v {} +

which will delete all the .iso files for that host. Then you can run the nightly script, and it will actually free up the space.
Regards, Adam
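The whole pattern (delete from the pc tree, then let the nightly run reclaim pool space) can be rehearsed on a scratch directory. Everything below is a mock layout with plain file names, not a real BackupPC tree (real trees mangle names and live under $TopDir):

```shell
# Mock rehearsal of the cleanup; nothing here touches a real install.
set -eu
tree=$(mktemp -d)                         # stands in for $TopDir
mkdir -p "$tree/pc/pcWinxp/411"
touch "$tree/pc/pcWinxp/411/song.mp3" \
      "$tree/pc/pcWinxp/411/disc.iso" \
      "$tree/pc/pcWinxp/411/report.txt"
# Remove the unwanted types from the host's backups:
find "$tree/pc/pcWinxp" \( -iname '*.mp3' -o -iname '*.iso' \) -exec rm -v {} +
ls "$tree/pc/pcWinxp/411"                 # only report.txt remains
# On the real server: run BackupPC_nightly next, so pool files whose
# last link was just removed are actually deleted and space is freed.
rm -rf "$tree"
```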