Re: [Bacula-users] Bacula-Web project - web site update

2011-09-24 Thread Bacula-Dev
Hello,

Thanks for your feedback, Dan, and sorry for the mistake... ;)

Bacula-Web project: http://bacula-web.dflc.ch

Regards

Davide

On Thu, Sep 22, 2011 at 5:02 PM, Bacula-Dev bacula-...@dflc.ch wrote:

 Dear all,

 I'm proud to announce that the Bacula-Web project's web site has been
 updated with more content and a better design:

- Documentation page and content
- RSS feeds subscriptions
- Newsletter subscription (coming soon)
- and so on...

 Hope you'll appreciate it and, as usual, any feedback is welcome ;)

 Kind regards

 Davide




-- 
Davide

Bacula-Web project site: http://bacula-web.dflc.ch


[Bacula-users] Problems doing concurrent jobs, and having lousy performance

2011-09-24 Thread Boudewijn Ector
Hi guys,


For some time, I've been trying to get concurrent jobs in Bacula to work.
To that end, I've created a pool for each client and made sure every part
of the setup has Maximum Concurrent Jobs set higher than 1.
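
To make that concrete: the directive also has to appear on the Director
side. I haven't quoted my full bacula-dir.conf below, so treat this as a
sketch of where the setting lives (the values are examples), not my
literal config:

# bacula-dir.conf: the Director and Storage resources also take
# the Maximum Concurrent Jobs directive.
Director {
   Name = leiden-dir
   ...
   Maximum Concurrent Jobs = 20
}

Storage {
   Name = leiden-filestorage
   ...
   Maximum Concurrent Jobs = 20
}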

Please allow me to elaborate on my configuration.

This is part of my bacula-dir.conf (well, this is a file for the client
'www'; it's included in bacula-dir.conf along with several other files
that are identical except for passwords/hostnames):

JobDefs {
   Name = www-weekly
   Type = Backup
   Level = Incremental
   Client = www
   FileSet = "Full Set"
   Schedule = WeeklyCycle
   Storage = leiden-filestorage
   Messages = Standard
   Pool = wwwPool
   Priority = 10
}



Job {
   Name = wwwjob
   JobDefs = www-weekly
   Write Bootstrap = /var/lib/bacula/www.bsr
}

Client {
   Name = www
   Address = www.KNIP
   FDPort = 9102
   Catalog = MyCatalog
   Password = KNIP  # password for FileDaemon
   File Retention = 30 days    # 30 days
   Job Retention = 6 months    # six months
   AutoPrune = yes # Prune expired Jobs/Files
}


Pool {
   Name = wwwPool
   LabelFormat = wwwVol
   Pool Type = Backup
   Recycle = yes   # Bacula can automatically recycle Volumes
   AutoPrune = yes # Prune expired volumes
   Volume Retention = 365 days # one year
   Volume Use Duration = 23h
}



As you can see, I've removed some sensitive information. A clone of this
config is also used for 'mail' and some other machines; each has its own
pool (because of the concurrency), as shown below.
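
For illustration, the corresponding Pool for 'mail' follows the same
pattern (the resource names here are representative; only the names
differ between the clones):

Pool {
   Name = mailPool
   LabelFormat = mailVol
   Pool Type = Backup
   Recycle = yes               # Bacula can automatically recycle Volumes
   AutoPrune = yes             # Prune expired volumes
   Volume Retention = 365 days # one year
   Volume Use Duration = 23h
}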


And here is bacula-sd.conf:

Storage { # definition of myself
   Name = leiden-filestorage
   WorkingDirectory = /var/lib/bacula
   Pid Directory = /var/run/bacula
   Maximum Concurrent Jobs = 50
   SDAddresses = {
 ip = { addr = 192.168.1.44; port = 9103 }
  ip = { addr = 127.0.0.1; port = 9103 }
   }
}
Director {
   Name = leiden-dir
   Password = *
}
Director {
   Name = leiden-mon
   Password = *
   Monitor = yes
}
Device {
   Name = leiden-filestorage
   Media Type = File
   Archive Device = /bacula
   LabelMedia = yes;   # lets Bacula label unlabeled media
   Random Access = Yes;
   AutomaticMount = yes;   # when device opened, read it
   RemovableMedia = no;
}

Messages {
   Name = Standard
   director = leiden-dir = all
}





Pretty standard; should I change anything here?



And my bacula-fd.conf:

Director {
   Name = leiden-dir
   Password = *
}

Director {
   Name = www.*-mon
   Password = *
   Monitor = yes
}

FileDaemon {  # this is me
   Name = www.*-fd
   FDport = 9102  # where we listen for the director
   WorkingDirectory = /var/lib/bacula
   Pid Directory = /var/run/bacula
   HeartBeat Interval = 15
   Maximum Concurrent Jobs = 20
   FDAddress = *
}

Messages {
   Name = Standard
   director = www.*-dir = all, !skipped, !restored
}
Also quite boring.




Can someone please explain why Bacula is still unable to run concurrent
jobs? Do I have to create a Storage resource for each client, for
instance? And if so, why is that necessary?


Furthermore, I've enabled compression on some clients, but the system's
performance is still poor: throughput tends to stall at about 1800 kB/s,
even though both ends of the link are 100 Mbit and almost completely
idle.
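
By 'compression' I mean the software compression option in the FileSet;
my actual 'Full Set' FileSet isn't quoted above, so this is only a sketch
of the relevant part (the file list shown is representative):

FileSet {
   Name = "Full Set"
   Include {
      Options {
         signature = MD5
         compression = GZIP   # software compression, done on the FD
      }
      File = /
   }
}
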
The Director and SD run on the same machine, which is attached to a NAS
(the NAS performs fine on its own). The machine has a dual-core Atom CPU
and 2 GB of RAM, runs Debian, and does nothing else except Nagios (which
is lightly loaded).


Cheers,

Boudewijn Ector
