I've just started experimenting with some modules on our Puppet installation and observing the behaviour. From these observations, it struck me that the current module implementation violates what has previously been described as good Puppet design.
It has previously been said, by multiple people in multiple places, in variants of: "Don't use Puppet to distribute a lot of files; it's inefficient! Use rsync, or (that other file transport thingie)", or "Use packages!" Oddly, the new module architecture, and plugins in general, seem to violate both principles.

In my testing with Puppet version 2.7.9, I dropped the files for the stdlib module into the module dir and ran the client side. It synced up. Okay, great. Then I created a new random .rb file under stdlib/lib/puppet. It got synced on the next run. "Hmm, maybe it just tests for dates on directories?" I thought to myself. "New file = new sync?" So I tested this by updating just the file. It got resynced.

stdlib alone contains 63 files, which is not an insignificant number. And this syncing is done via a full MD5 checksum? That is **less** efficient than normal rsync, which by default checks just timestamp and size! Does this module/plugin design not violate long-standing Puppet "best practices"?

Along with my critique, I will also offer some suggested fixes:

A) Make plugin syncing rsync-style: only resync if the timestamp changed.
B) Make plugin syncing more "package"-like: check the version in the Modulefile, compare the versions on client and server, and update only on a version mismatch.

--
You received this message because you are subscribed to the Google Groups "Puppet Developers" group.
To view this discussion on the web visit https://groups.google.com/d/msg/puppet-dev/-/bNfqZY_aJ6YJ.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to [email protected].
For more options, visit this group at http://groups.google.com/group/puppet-dev?hl=en.
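To make proposal A concrete, here is a minimal Ruby sketch of rsync-style syncing: copy a plugin file only when the server copy's mtime or size differs from the client's. Method names (`needs_sync?`, `sync_plugin`) are illustrative, not actual Puppet internals.

```ruby
require 'fileutils'

# Proposal A sketch: decide whether to resync based on mtime/size,
# the cheap stat-based check rsync uses by default, instead of
# computing a full MD5 checksum of every plugin file.
def needs_sync?(server_path, client_path)
  return true unless File.exist?(client_path)
  server_stat = File.stat(server_path)
  client_stat = File.stat(client_path)
  server_stat.mtime > client_stat.mtime || server_stat.size != client_stat.size
end

# Copy the file only when the stat check says it changed,
# preserving mtime so the next run sees the copies as in sync.
def sync_plugin(server_path, client_path)
  return unless needs_sync?(server_path, client_path)
  FileUtils.mkdir_p(File.dirname(client_path))
  FileUtils.cp(server_path, client_path, preserve: true)
end
```

The point of the sketch is that the decision costs two `stat` calls per file, regardless of file size, where a checksum comparison has to read every byte of all 63 files on every run.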
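Proposal B could be sketched the same way: pull the `version` field out of each side's Modulefile (the Modulefile declares it as a line like `version '2.1.0'`) and resync the whole module only on a mismatch. The parsing approach and method names here are assumptions for illustration, not anything Puppet actually ships.

```ruby
# Proposal B sketch: package-like syncing keyed off the Modulefile
# version. Extract the version string from Modulefile text; returns
# nil if no version line is present.
def modulefile_version(modulefile_text)
  modulefile_text[/^version\s+['"]([^'"]+)['"]/, 1]
end

# Resync only when the server's and client's declared versions differ,
# so unchanged modules cost one string comparison per run instead of
# one checksum per file.
def update_needed?(server_modulefile, client_modulefile)
  modulefile_version(server_modulefile) != modulefile_version(client_modulefile)
end
```

One consequence of this design: a module edited in place without a version bump would never be resynced, which is exactly the trade-off packages make.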
