On 7/15/24 9:26 AM, Marco Gaiarin wrote:

We have found that a directory (containing mostly home directories) with roughly
one and a half million files takes too much time to be backed up; it is not
a problem of backup media: even with spooling, it took hours to prepare the
spool.

Is there some strategy I can adopt to reduce the backup time (on the Bacula
side; clearly we also have to work on the filesystem side...)?

For example, currently I have:

     Options {
       Signature = MD5
       accurate = sm
     }

If I remove the signature and check only the size, can I gain some performance?
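
That is, something like this (just a sketch of what I mean; if I understand
the accurate options correctly, "accurate = s" compares on size only, with no
checksum computed):

     Options {
       accurate = s
     }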


Thanks.

Hello Marco,

The typical way to help with this type of situation is to create several FileSet/Job pairs and then run them all concurrently, each Job reading a different set of directories.

For example, back up the user home directories that begin with [a-g], [h-m], [n-s], and [t-z] in four or more different concurrent Jobs.


A couple of FileSet examples that should work the way I described:
----8<----
FileSet {
  Name = Homes_A-G
  Include {
    # Options are applied in order, and the first match wins: directories
    # under /home beginning with a-g are backed up with SHA1 signatures
    # and ZSTD compression...
    Options {
      signature = sha1
      compression = zstd
      regexdir = "/home/[a-g]"
    }
    # ...and everything else under /home is excluded.
    Options {
      exclude = yes
      regexdir = "/home/.*"
    }
    File = /home
  }
}

FileSet {
  Name = Homes_H-M
  Include {
    Options {
      signature = sha1
      compression = zstd
      regexdir = "/home/[h-m]"
    }
    Options {
      exclude = yes
      regexdir = "/home/.*"
    }
    File = /home
  }
}

...and so on...
----8<----
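
To pair with those FileSets, the Job definitions could look something like the
sketch below. The Client, Schedule, Storage, and Pool names here are
placeholders for resources you already have defined:

----8<----
Job {
  Name = Homes_A-G
  Type = Backup
  Client = fileserver-fd      # placeholder for your File Daemon's Client name
  FileSet = Homes_A-G
  Schedule = NightlyCycle     # placeholder for your existing Schedule
  Storage = File1             # placeholder for your existing Storage
  Pool = Default
  Messages = Standard
}

Job {
  Name = Homes_H-M
  Type = Backup
  Client = fileserver-fd
  FileSet = Homes_H-M
  Schedule = NightlyCycle
  Storage = File1
  Pool = Default
  Messages = Standard
}

# ...and so on for N-S and T-Z.
----8<----

One thing to watch: for the Jobs to actually overlap, Maximum Concurrent Jobs
has to be high enough in your Director, Client, and Storage resources (several
of these default to 1), otherwise the Jobs will simply queue up and run one
after another.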


Hope this helps!
Bill

--
Bill Arlofski
w...@protonmail.com
