I'm still new to Bacula and am doing some testing of Windows backup performance.
I have a client that I know has a good disk subsystem and good Ethernet
connectivity to the Director/Storage Daemon, with a mixture of file types
ranging from folders containing a few huge files to folders containing
thousands of small files. When I monitor a job using various Windows tools,
the backup flies on the large files (large sequential reads, I'm guessing),
but throughput drops sharply on the folders full of thousands of small files.
I'm still getting my head around all the options in Bacula. Does anyone have
any feedback on how to optimise Windows backups where there is a mixture of
file sizes?
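For reference, the only knobs I've spotted in the manual so far that look
relevant are the ones sketched below. This is just my reading of the docs,
not tested advice; the resource names are made up for illustration and the
buffer value is a guess:

  # bacula-fd.conf on the Windows client (illustrative only)
  FileDaemon {
    Name = winclient-fd                   # hypothetical client name
    Maximum Network Buffer Size = 65536   # larger network buffer (value is a guess)
  }

  # bacula-dir.conf, Job resource for this client (illustrative only)
  Job {
    Name = "WinClientBackup"   # hypothetical job name
    ...
    Spool Attributes = yes     # batch file-attribute inserts into the catalog
    Spool Data = yes           # spool data at the SD before writing (mainly helps tape)
  }

Am I on the right track with these, or are there other directives that matter
more for the many-small-files case?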
Thanks.