Re: [BackupPC-users] How to really/immediately exclude stuff from next backup
It should be more or less as slow as doing the new backup without removing them, as it's doing essentially the same thing. The difference would be that you can do it outside backup hours, or when the server has more free resources.

On Thu, Oct 19, 2023 at 11:49 AM marki wrote:
> Alright. In any case removing files from the current backup (because they
> don't exist anymore / have been excluded / ...) is painfully slow.
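As a concrete sketch of the workaround discussed earlier in the thread (removing the excluded data from the last valid backup with BackupPC_backupDelete, so the next backup doesn't start with it): the host name and backup number below are placeholders from the thread's example, and the script only prints the command rather than running it, since the operation is destructive. Check the usage output of BackupPC_backupDelete on your install first; newer versions can also restrict the deletion to a share or path, as described in the manual linked above.

```shell
# Dry-run sketch only -- BackupPC_backupDelete permanently removes backup data.
# Verify the exact flags against your version's usage output before running.
HOST=myclient   # hypothetical client name
NUM=2845        # the last valid backup, from the thread's example
echo "Would run (as the backuppc user): BackupPC_backupDelete -h $HOST -n $NUM"
```

Running the real command should be followed by BackupPC's normal nightly pool cleanup before the freed space becomes visible.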
Regards,
Guillermo

___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: https://github.com/backuppc/backuppc/wiki
Project: https://backuppc.github.io/backuppc/
Re: [BackupPC-users] How to really/immediately exclude stuff from next backup
Alright. In any case removing files from the current backup (because they don't exist anymore / have been excluded / ...) is painfully slow.

On 2023-10-19 16:04, Guillermo Rozas wrote:
> It's by design: in V4 the backup is stored as reverse deltas, and the latest
> backup starts as an identical copy of the previous one and is modified to
> match the current state. Check the manual for the reasons:
> https://backuppc.github.io/backuppc/BackupPC.html#BackupPC-4.0,
> especially the "more detailed discussion".
>
> Perhaps a solution for you would be to delete the ignored folder from the
> last valid backup, so the new backup doesn't have it at the start. Check the
> BackupPC_backupDelete command
> (https://backuppc.github.io/backuppc/BackupPC.html#Other-Command-Line-Utilities),
> but be careful as it could be pretty destructive.
Re: [BackupPC-users] How to really/immediately exclude stuff from next backup
> It's not a capacity problem.
> It's a performance problem.

Sorry, you implied that the problem was the capacity when you said:

>> Also in the original example the disk isn't large enough, so we're not
>> even making it to that stage.

> But again, that was not the question.
> The question is, why is that new backup set being populated with stuff
> just to be deleted again after that (the exclusion list).

It's by design: in V4 the backup is stored as reverse deltas, and the latest backup starts as an identical copy of the previous one and is modified to match the current state of the client. Check the manual for the reasons: https://backuppc.github.io/backuppc/BackupPC.html#BackupPC-4.0, especially the "more detailed discussion".

Perhaps a solution for you would be to delete the ignored folder from the last valid backup, so the new backup doesn't have it at the start. Check the BackupPC_backupDelete command (https://backuppc.github.io/backuppc/BackupPC.html#Other-Command-Line-Utilities), but be careful as it could be pretty destructive.

Regards,
Guillermo
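The reverse-delta behavior described above can be illustrated with a toy filesystem sketch (this is not BackupPC's internal code): the new backup's view starts as a copy of the previous backup and is only afterwards edited to match the client, so an excluded tree is created first and then deleted again entry by entry.

```shell
# Toy illustration only -- not BackupPC internals.
prev=$(mktemp -d)   # stands in for backup 2845
new=$(mktemp -d)    # stands in for backup 2846
mkdir -p "$prev/excluded"
touch "$prev/keep.txt" "$prev/excluded/f1" "$prev/excluded/f2"

cp -a "$prev/." "$new/"   # step 1: 2846 starts identical to 2845
rm -rf "$new/excluded"    # step 2: the excluded path is removed again, entry by entry

ls "$new"                 # prints only: keep.txt
rm -rf "$prev" "$new"     # clean up the toy directories
```

With millions of entries under the excluded path, step 2 is exactly the slow deletion the thread complains about, which is why removing the path from the previous backup first avoids the problem.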
Re: [BackupPC-users] How to really/immediately exclude stuff from next backup
It's all V4.

It's not a capacity problem. It's a performance problem. Trying to figure out why it is so slow, I looked at what it is actually doing, which brought me to what I was initially asking: I'm not sure where the stuff in 2846 is coming from before being deleted again (because of the exclusion).

Anyway, "new entries in metadata" (represented by files) means handling millions of files and directories as well.

But again, that was not the question. The question is: why is that new backup set being populated with stuff, just to be deleted again after that (per the exclusion list)?

On 2023-10-19 13:17, Guillermo Rozas wrote:
> Are you using V3 or V4? According to my understanding, this step
>
>> However 2846 seems to be populated with the content from 2845 (including
>> the stuff I have excluded). [...]
>
> should require barely any extra space. In V3 the "copy" is hard-linking and
> in V4 it's just new entries in metadata files; in neither case is an actual
> copy of the files in the pool made (because of deduplication). Maybe you're
> running out of inodes?
>
> Best regards,
> Guillermo
Re: [BackupPC-users] How to really/immediately exclude stuff from next backup
Are you using V3 or V4? According to my understanding, this step

> However 2846 seems to be populated with the content from 2845 (including
> the stuff I have excluded).
> It looks like it's first copying all the stuff from 2845 (even the excluded
> path) and then later tries to remove it again from 2846.
> Which is also taking forever in the original example, as it's a directory
> tree with millions of files.
> Also in the original example the disk isn't large enough, so we're not even
> making it to that stage.

should require barely any extra space. In V3 the "copy" is hard-linking, and in V4 it's just new entries in metadata files; in neither case is an actual copy of the files in the pool made (because of deduplication). Maybe you're running out of inodes?

Best regards,
Guillermo
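The inode suggestion above is quick to check. Point df at your pool filesystem (the path below is a common default, not necessarily yours; "/" is only a fallback so the command works anywhere): millions of metadata files can exhaust inodes long before the disk itself is full.

```shell
# Show inode totals, usage, and free count for the pool filesystem.
# Adjust /var/lib/backuppc to wherever your BackupPC pool lives.
df -i /var/lib/backuppc 2>/dev/null || df -i /
```

If IUse% is at or near 100%, backups can fail with "no space" errors even though `df -h` shows free blocks.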
Re: [BackupPC-users] How to really/immediately exclude stuff from next backup
Sure we have, and that's actually what we are doing. But until that is in place we need a workaround, which is excluding stuff. So I would like to understand why the system behaves this way when I exclude a directory.

On October 19, 2023 12:37:05 PM GMT+02:00, Paul Leyland wrote:
> This may sound fatuous but I am serious. Have you considered installing more
> disk(s)?
>
> Sometimes throwing hardware rather than liveware at a problem is more
> cost-effective. The former gets ever cheaper whereas the latter is ever more
> expensive.
>
> I doubled the size of my pool over the last year by replacing each element
> of the RAID array in turn, thereby maintaining continuous service.
>
> Paul
Re: [BackupPC-users] How to really/immediately exclude stuff from next backup
This may sound fatuous but I am serious. Have you considered installing more disk(s)?

Sometimes throwing hardware rather than liveware at a problem is more cost-effective. The former gets ever cheaper whereas the latter is ever more expensive.

I doubled the size of my pool over the last year by replacing each element of the RAID array in turn, thereby maintaining continuous service.

Paul

On 14/10/2023 21:43, marki wrote:
> Hello,
>
> I'm having a hard time here excluding stuff starting with the next backup.
> [...]
[BackupPC-users] How to really/immediately exclude stuff from next backup
Hello,

I'm having a hard time excluding stuff starting with the next backup.

The problem is that the disk containing the pool is not large enough to host more full backups. But we don't want to touch our retention policy for now, so we are trying to exclude some things starting now. It's not working; details below.

An example/lab: let's say we add /path/to/exclusion to the exclusion list. In this example 2845 is the last full, and I'm trying to do another full, which is going to be 2846. However, 2846 seems to be populated with the content from 2845 (including the stuff I have excluded). It looks like it first copies all the stuff from 2845 (even the excluded path) and then later tries to remove it again from 2846. That is also taking forever in the original example, as it's a directory tree with millions of files. Also, in the original example the disk isn't large enough, so we're not even making it to that stage.

How do I actually exclude a path from the next backup, such that it is not even temporarily created? I.e. such that the disk doesn't fill up AND such that it is fast.

Thanks,
Marki
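For completeness, the exclusion list itself is set with $Conf{BackupFilesExclude}, either globally in config.pl or in the per-host config file; the hash keys are share names, and the special key '*' applies the list to every share. A minimal entry matching the thread's example path:

```perl
# In config.pl or the per-host config file.
# Keys are share names; '*' applies the exclude list to all shares.
$Conf{BackupFilesExclude} = {
    '*' => ['/path/to/exclusion'],
};
```

How the excludes are passed to the client depends on the XferMethod (with rsync they become --exclude arguments). As the rest of the thread explains, this stops the data from being transferred, but in V4 it does not by itself stop the new backup from starting as a copy of the previous one, nor does it remove already-backed-up copies.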