Thanks John, your opinion is really helpful :)
On Tuesday, June 17, 2014 15:30:53 UTC+2, jcbollinger wrote:
On Tuesday, June 17, 2014 3:45:57 AM UTC-5, Félix Barbeira wrote:
I always heard that serving large files over puppet is a bad practice.
But... I guess it depends on what you consider a large file. Everyone
agrees that serving, for example, a 25MB file over puppet is definitely
not recommended.
It is generally useful in such cases to understand *why* a thing is
considered poor practice. Otherwise it's very hard to reason about
questions such as the one you are posing.
The general advice to avoid serving large files via the Puppet master's
built-in file server is based on Puppet's default behavior of using MD5
checksums to determine whether the target file's content is already in
sync. Checksumming the source and target files is comparatively expensive,
and the master must do it for each catalog request for each client for each
File resource in its catalog (that uses the default checksum method).
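To make the cost concrete, here is a minimal sketch (not from the original thread; the path and module name are hypothetical) of the kind of File resource being discussed, served from the master's built-in file server with the default checksum:

```
# With the default checksum method (md5), the master must checksum the
# source file for every client request, and the agent checksums the
# local target file to decide whether the content is already in sync.
file { '/etc/myapp/big_data.txt':
  ensure => file,
  source => 'puppet:///modules/myapp/big_data.txt',
  owner  => 'root',
  group  => 'root',
  mode   => '0644',
}
```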
My question is whether a text file of ~7000 lines and ~700KB would be
acceptable. Do you think this file exceeds Puppet's recommended size
limits for files, and is big enough that the advice in the following
thread applies?
https://ask.puppetlabs.com/question/627/serving-large-files-formally-code-artifacts-best-practices/
There is no one-size-fits-all answer. If your master can support the
combined load, and if the load on your clients (from checksumming on their
side) is acceptable, then you are basically ok. Beware, however, of the
load creeping up as you add more Files, and mind that your master's client
capacity is affected by how much work it must perform for each client.
Note, too, that there are multiple possible approaches. If the file(s)
you want to serve are static and don't change too frequently, then
packaging them up and managing them via a Package resource is a good
solution, and I would certainly consider that for a 700kB file,
especially if it's part of a collection that you can package up
together. On the other hand, you
can also reduce the computational load by switching to a lighter-weight
checksum method
http://docs.puppetlabs.com/references/3.4.stable/type.html#file-attribute-checksum,
at the expense of a greater risk of Puppet mistaking whether the File is
already in sync. Or if you put it on a network file server accessible to
your clients, then 'source'ing it from there works, and spares the master
from checksumming.
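The alternatives described above could be sketched roughly as follows (paths, package name, and mount point are hypothetical, not from the thread):

```
# 1. Manage the large file via the OS package manager instead of the
#    Puppet file server (assumes you have built such a package).
package { 'myapp-data':
  ensure => installed,
}

# 2. Keep serving via the master, but use a lighter-weight checksum.
#    mtime comparison is cheap, but Puppet can miss a content change
#    that leaves the timestamp untouched.
file { '/etc/myapp/big_data.txt':
  ensure   => file,
  source   => 'puppet:///modules/myapp/big_data.txt',
  checksum => 'mtime',
}

# 3. Source the file from a network share already mounted on the
#    client, sparing the master from serving and checksumming it.
file { '/etc/myapp/big_data.txt':
  ensure => file,
  source => '/mnt/share/big_data.txt',
}
```

Option 2 trades correctness for speed, while options 1 and 3 move the transfer off the master entirely; which fits best depends on how the file is produced and how often it changes.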
John
--
You received this message because you are subscribed to the Google Groups
Puppet Users group.
To view this discussion on the web visit
https://groups.google.com/d/msgid/puppet-users/c0f34e4d-771c-41fb-b520-8db1a90e8896%40googlegroups.com.