> We have an SGI client with 65/512. Trying to find out if these are
> kilobytes or megabytes. The backup fails because it runs out of memory.
> MemoryEfficientBackup does not help. -dirsonly does not help.
>
> What are people using on large UNIX filesystem clients for these numbers?
> There are probably 7 million files in this file system of about 1TB.
We just had this kind of problem on HP-UX. Tivoli support quoted an estimate of 300 bytes per active backup file; our experience suggests that estimate is a bit conservative. With a 536 MB limit on the data segment size, we started running out of memory somewhere between 2.9 and 3.1 million files, which works out to between 173 and 185 bytes per file.
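For what it's worth, the arithmetic behind those numbers is easy to reproduce. A minimal sketch, assuming the 536 MB data-segment limit is really the common HP-UX maxdsiz value of 0x20000000 bytes (an assumption on my part, but it matches the figures above):

```python
# Estimate TSM client memory use per active backup file.
# Assumption: "536 megabytes" is the HP-UX maxdsiz setting 0x20000000
# (536,870,912 bytes, i.e. 512 MiB), not exactly 536,000,000 bytes.
DATA_SEGMENT_LIMIT = 0x20000000  # bytes

# Failures started somewhere between these two file counts.
for nfiles in (2_900_000, 3_100_000):
    per_file = DATA_SEGMENT_LIMIT / nfiles
    print(f"{nfiles:,} files -> {per_file:.0f} bytes/file")
# -> roughly 185 and 173 bytes/file, bracketing the range quoted above.

# By the same logic, the original poster's 7 million files would need
# on the order of 7e6 * 185 bytes ~= 1.2 GB of data segment even at
# our observed rate, and ~2.1 GB at Tivoli's 300-byte estimate.
print(f"7M files at 185 B/file: {7_000_000 * 185 / 2**20:.0f} MiB")
```

So even the optimistic end of the range suggests the SGI client will need well over a gigabyte of data segment for that filesystem.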