Hello guys,

So the scenario is as follows:

On a virtual host I've got 10 Linux containers, and each container has a 
website associated with it. For each website to work, I have to copy 
around 100+ files using Puppet.

At the moment I am using rsync to do so. I tried using Puppet to copy 
the files over, but the agent runs were considerably slow, and that's 
just for 10 websites, a number which is bound to grow.
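
For reference, the Puppet-only approach was essentially a recursive file 
resource along these lines (the module name and paths below are 
simplified placeholders, not my real layout):

    # One recursive file resource per site. Puppet checksums every
    # managed file against the master on each run, which is what makes
    # the agent so slow once there are 100+ files per site.
    file { '/var/www/example.com':
      ensure  => directory,
      recurse => true,
      source  => 'puppet:///modules/websites/example.com',
      owner   => 'www-data',
      group   => 'www-data',
    }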

Using rsync saves me some time; however, the run is still quite slow.
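
The rsync step is currently driven from an exec resource, roughly like 
this (the rsync server name and paths are placeholders):

    # Let rsync do the delta transfer instead of Puppet. There is no
    # cheap 'unless' check for "any file changed", so this runs rsync
    # on every agent run and relies on rsync's own change detection.
    exec { 'sync-example.com':
      command => 'rsync -a --delete rsync://files.internal/websites/example.com/ /var/www/example.com/',
      path    => ['/usr/bin', '/bin'],
    }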

What would you guys suggest I do? Or how do you go about copying a large 
number of files using Puppet?

Thanks and regards,

Sergiu
