All,
     I am running Bacula 5.0.1 on Solaris 10 x86, with MySQL 4.1.22 as the
database server. I plan to upgrade to a compatible version of MySQL 5, but
migrating to PostgreSQL isn't an option at this time.

     I am trying to back up to tape a very large number of files for a
client. While the data size is manageable at around 2TB, the number of files
is incredibly large.
The first of the jobs had 27 million files and initially failed because the
batch table became "Full". I changed myisam_data_pointer_size to a value
of 6 in the config.
After that change the job ran successfully and did not take too long.
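For reference, this is a sketch of the change I made. The exact path to my.cnf
on your system may differ, and the variable can also be set at runtime for
tables created afterward:

```
# In my.cnf (server section), raises the MyISAM row-pointer width so
# temporary/batch tables can grow past the default ~4GB limit:
[mysqld]
myisam_data_pointer_size = 6

# Or at runtime (affects tables created after the change):
#   mysql> SET GLOBAL myisam_data_pointer_size = 6;
```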

    I have another job which has 42 million files. I'm not sure what that
equates to in rows that need to be inserted, but I can say that I've not
been able to run the job successfully: it hangs for over 30 hours in a
"Dir inserting attributes" status. This causes other jobs to back up in the
queue, and once the job is canceled I have to restart Bacula.

    I'm looking for a way to boost the performance of MySQL or Bacula (or
both) to get this job completed.

Thanks,
Shon
_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users