Angel Mieres wrote:
> Hi all,
> 
> I'm testing Bacula with two jobs. One of them backs up over 70,000 files
> totaling about 2 GB. The second one has about 100 files totaling 1 GB.
> Why does the first job get a speed of 3,000 KB/sec while the second one
> gets 25,000 KB/sec? (The backup goes to a file in both cases.)
> Does Bacula perform worse with small files?
> 
> Thx in advance.
> 
> 
> 
Hi Angel,

Let's set aside filesystem performance and assume that all the files reside on
the same type of filesystem.

First question: did you use compression? If yes, different compression ratios 
for the files could explain the difference.
If not, I suspect a bottleneck in the database inserts:
a job that does 70,000 inserts takes quite a bit longer than one that does 100 :-)
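
As a rough illustration (the per-insert cost is purely a guess and depends
entirely on your DB setup), if each attribute insert costs around 5 ms:

    70,000 inserts x ~5 ms  ~= 350 s of catalog time
       100 inserts x ~5 ms  ~=   0.5 s

That alone is enough to drag the first job's average rate down.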

Perhaps you could check your DB configuration and tune a few values so the 
inserts run faster.
Here, even my-huge.cnf wasn't sufficient to do the job nicely (saving about 
700,000 files).
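
For example, assuming a MySQL catalog, something along these lines in my.cnf
(the values are only a starting point, size them to your RAM and test):

    [mysqld]
    innodb_buffer_pool_size        = 512M   # keep more of the catalog cached in memory
    innodb_flush_log_at_trx_commit = 2      # flush the log ~once per second instead of at every commit
    innodb_log_file_size           = 128M   # bigger redo log, less checkpoint activity during mass inserts
    key_buffer_size                = 256M   # only relevant if your catalog tables are still MyISAM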

If you use the latest Bacula version, building with ./configure 
--enable-batch-insert can also help.
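
Roughly like this when rebuilding (keep whatever other configure options you
normally use, the flag below is the only addition):

    ./configure --enable-batch-insert [your usual options]
    make
    make install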

If I'm not mistaken, the rate is calculated as the total KB backed up divided 
by the whole job time, catalog inserts included.
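
So, very roughly, with your figures:

    Job 1: ~2,000,000 KB / 3,000 KB/s   ~= 670 s elapsed
    Job 2: ~1,000,000 KB / 25,000 KB/s  ~=  40 s elapsed

At 25,000 KB/s the 2 GB of data alone would only need about 80 s, so most of 
the extra ~590 s is probably spent inserting the 70,000 attribute records 
rather than moving data.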



-- 

     Bruno Friedmann  [EMAIL PROTECTED]

Ioda-Net Sàrl   - www.ioda-net.ch
  2830 Vellerat - Switzerland

  Tél : ++41 32 435 7171
  Fax : ++41 32 435 7172
  gsm : ++41 78 802 6760

C'est Facile et Cool d'Évoluer en ligne : www.cfcel.com

