So one of my machines has a few zillion tiny little files.
My full backup took 44 hours. I can deal with that if I have to.
My incremental backup has been running for 10 hours now.
Files=71,560 Bytes=273,397,510 Bytes/sec=7,666 Errors=0
Files Examined=14,675,372
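For what it's worth, those status numbers are internally consistent; a quick back-of-the-envelope check (all values copied from the status line above):

```python
# Sanity-check the incremental job's status line.
bytes_total = 273_397_510      # Bytes=
rate = 7_666                   # Bytes/sec=
files_examined = 14_675_372    # Files Examined=

elapsed_s = bytes_total / rate
print(f"elapsed: {elapsed_s / 3600:.1f} h")    # ~9.9 h, matching "10 hours"
print(f"scan rate: {files_examined / elapsed_s:.0f} files examined/s")
```

At roughly 400 files examined per second, nearly all of those 10 hours is metadata traversal, not data transfer.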
I know that bacula has
On 19/04/2011 15:37, hymie! wrote:
> So one of my machines has a few zillion tiny little files.
> My full backup took 44 hours. I can deal with that if I have to.
> My incremental backup has been running for 10 hours now.
> Files=71,560 Bytes=273,397,510 Bytes/sec=7,666 Errors=0
On 19.04.2011 15:37, hymie! wrote:
> So one of my machines has a few zillion tiny little files.
> My full backup took 44 hours. I can deal with that if I have to.
> My incremental backup has been running for 10 hours now.
> Files=71,560 Bytes=273,397,510 Bytes/sec=7,666 Errors=0
> Files Examined=14,675,372
Maybe I can answer the follow-ups all at once. Easy one first:

On 19/04/2011 15:37, hymie! wrote:
> So one of my machines has a few zillion tiny little files.
> My full backup took 44 hours. I can deal with that if I have to.
> My incremental backup has been running for 10 hours now.
> Files=71,560 Bytes=273,397,510 Bytes/sec=7,666 Errors=0
> Files Examined=14,675,372
> I know that bacula
On 4/19/2011 10:21 AM, hymie! wrote:
> Marcello Romani writes:
>> Maybe it's not relevant to your case, but have you tried to enable
>> spooling?
>
> I don't think spooling will solve my problem. First off, I'm using disks
> as my storage, not tapes; spooling is not recommended. Second, the
> bottleneck
hymie! wrote:
> So one of my machines has a few zillion tiny little files.
Here's your problem right there. Reading all the metadata for those
files is the killer.
If the client is beefy enough, you can try splitting it up so there
are multiple readers all hitting the disk at once. This will
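One way to get multiple readers is to split the tree across several FileSets and run the resulting jobs concurrently. The sketch below is hypothetical (the /data paths, the split points, and the FileSet names are invented for illustration); only the directives themselves are standard Bacula ones:

```
# Hypothetical split of one large client tree into two concurrent jobs.
# Needs Maximum Concurrent Jobs raised on the Director, Client, and
# Storage resources so the jobs actually overlap.
FileSet {
  Name = "bigtree-first-half"
  Include {
    Options { signature = MD5 }
    File = "/data/projects"
    File = "/data/mail"
  }
}
FileSet {
  Name = "bigtree-second-half"
  Include {
    Options { signature = MD5 }
    File = "/data/home"
    File = "/data/scratch"
  }
}
```

Each FileSet would get its own Job resource pointing at the same Client; whether this helps depends on whether the client's disk can keep two metadata scans fed at once.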