On Sat, Sep 25, 2010 at 2:34 PM, Nigel Kersten <[email protected]> wrote:
> On Sat, Sep 25, 2010 at 2:20 PM, Luke Kanies <[email protected]> wrote:
>> On Sep 25, 2010, at 9:58 AM, Nigel Kersten wrote:
>>
>>> On Sat, Sep 25, 2010 at 6:57 AM, Brice Figureau
>>> <[email protected]> wrote:
>>>> Hi Nigel,
>>>>
>>>> On 24/09/10 23:46, Nigel Kersten wrote:
>>>>> Is there any way I can feasibly access a parameter that was set in a
>>>>> node definition provided by an external node classifier from inside a
>>>>> fact?
>>>>
>>>> Out of the box I don't think it is possible, since the facts run in
>>>> puppetd and the node classifier on the master.
>>>
>>> Yeah. :(
>>
>> Indeed.
>>
>>>>> The problem I'm facing is that I have several teams who wish to share
>>>>> a modulepath, but don't want the facts checked in to module A by one
>>>>> team to be evaluated on clients who are not applying module A.
>>>>>
>>>>> Thus I would like to be able to confine the facts to only be evaluated
>>>>> on those hosts that are actually including those modules.
>>
>> Can you provide more information about the background for this need? That
>> is, is it a performance problem, a security problem, that the facts aren't
>> functional, or what?
>
> Completely a security issue.
>
>>>> To my knowledge the facts are evaluated prior to the node catalog
>>>> evaluation, so we don't yet know this information.
>>
>> Correct.
>>
>>>> The only way I can think about it is to have your facts query your node
>>>> classifier out of band of puppet.
>>>
>>> Which kind of sucks :)
>>>
>>> I feel like our pluginsync model is really problematic for trying to
>>> share modulepaths amongst separate groups and services.
>>>
>>> If you're not the only person checking code into a modulepath, then
>>> you're essentially allowing arbitrary code execution on your clients,
>>> even if you're not consuming the modules that provide such plugins.
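[Editor's note: Brice's "query your node classifier out of band" idea could look roughly like the sketch below. The ENC script path, its YAML output shape, and the `enc_parameter` helper name are all assumptions for illustration, not anything Puppet ships.]

```ruby
require 'yaml'

# Hypothetical path to a copy of the same ENC script the master uses;
# a real deployment would have to distribute it to clients (or expose
# the classifier over HTTP instead).
ENC_COMMAND = '/usr/local/bin/puppet_enc'

# Run the ENC for this host and pull one parameter out of its YAML
# output. Returns nil if the ENC fails or the parameter is absent.
# Assumes the conventional ENC output shape: a top-level 'parameters' hash.
def enc_parameter(hostname, name, command = ENC_COMMAND)
  output = `#{command} #{hostname}`
  return nil unless $?.success?
  node = YAML.load(output) || {}
  (node['parameters'] || {})[name]
end

# A fact would then wrap this (Facter DSL, shown as a comment only):
#   Facter.add(:enable_fact_foo) do
#     setcode { enc_parameter(Facter.value(:fqdn), 'enable_fact_foo') }
#   end
```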
>>
>> Not necessarily - downloading the plugins isn't enough, they need to
>> actually be used (all facts will be used, but not necessarily types and
>> providers). But yeah, I get your point. What would you recommend instead?
>
> I'm not entirely sure. I've been thinking this through a lot lately,
> and I honestly can't think of a great way to resolve this in a single
> run other than the external node classifier setting the modulepath and
> manifestpath and killing the use of environments altogether. (when you
> have a hammer...)
>
> One possible option would be (and I'm not entirely opposed to this) to
> have two puppet runs.
>
> 1. a 'prerun' environment that syncs facts, takes parameters from the
> external node classifier and writes them to disk
My incomplete sentence there was meant to say "that syncs facts from a
centrally managed and tightly controlled modulepath".

> 2. The normal puppet run syncs all the shared facts to the client,
> plus a special fact that reads the results of the first run, and the
> other facts are all set to be confined on this special fact.
>
> ie your external node definition has:
>
> parameters:
>   enable_fact_foo: true
>
> and the foo fact has:
>
> confine :enable_fact_foo => true
>
> That way the people sharing the modulepath have to explicitly choose
> to enable any fact from their shared code base, and you enforce the
> presence of such confine statements in the pre-commit hooks.
>
> This doesn't do anything to solve the problem of non-fact plugins, but
> honestly in this case I would simply deny those based upon a
> pre-commit hook and only allow modules/*/lib/facter
>
>>> Also, given that it's possible to take a core puppet provider, modify
>>> it, distribute it via pluginsync, and override the core provider...
>>> you're even allowing the possibility of someone checking something
>>> into a module you don't consume that actually changes the way your
>>> puppet client works.
>>>
>>> This implies that modulepaths are not to be shared, but the consequence
>>> here is that you end up with an explosion of environments to cope with
>>> modulepath permutations.
>>
>> Interesting. This is the first time this has come up for me, but I
>> definitely see the problem. Is your primary concern facts, or types and
>> providers? Or something else? And is it security you're worried about, or
>> something else?
>>
>> We've shifted away from having a separate directory for facts and Puppet
>> code, but this implies we should not have done so.
>
> I wouldn't go that far. As you say, I'm the first person to bring this
> up, and we absolutely needed a way to make modules truly
> self-contained, including facts/plugins.
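[Editor's note: the gating Nigel describes can be sketched in plain Ruby so the moving parts are visible; the parameters-file path and helper names are assumptions, and the commented Facter form is what the shared fact itself would look like.]

```ruby
require 'yaml'

# Hypothetical location where the 'prerun' stage would write the
# parameters it received from the external node classifier.
PARAMS_FILE = '/var/lib/puppet/enc_params.yaml'

# Read one ENC-provided parameter off disk; nil if the prerun hasn't run.
def enc_param(name, file = PARAMS_FILE)
  return nil unless File.exist?(file)
  (YAML.load_file(file) || {})[name]
end

# The shared fact only computes its value when the ENC opted this host
# in. With the real Facter DSL this gating is exactly a confine:
#   Facter.add(:foo) do
#     confine :enable_fact_foo => true
#     setcode { ... team A's fact code ... }
#   end
def foo_fact(file = PARAMS_FILE)
  return nil unless enc_param('enable_fact_foo', file) == true
  'foo-value'  # stands in for whatever the fact really computes
end
```

A pre-commit hook on the shared modulepath would then reject any fact file lacking such a confine, and reject plugins outside modules/*/lib/facter, as suggested above.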
Pluginsync and plugins-in-modules resolved a lot of the crufty issues
we had around factsync and environments.

>> The only short-term way I can see of solving this is to use the node
>> classifier to figure out what classes are used, map those classes to
>> modules, and only include the appropriate code from those modules. That
>> doesn't really work, though, because there are always going to be classes
>> included by those specified classes, and their code won't be sent to the
>> client, which is a problem, along with the resource types and such that
>> are outside of classes but are used by required classes.
>>
>> Another (not very good) option is to record which modules are used from
>> the last compile and send the code down for the next one. This doesn't
>> work for somewhat obvious reasons.
>>
>> The only reasonable option I can think of requires us to separate facts
>> out again. I don't think there's a way around sending all facts to the
>> client. If we do that, though, then we can wrap the rest of the plugins
>> in with the catalog, just like we do for files here:
>
> Yeah, these are the options I've come up with so far that I've
> rejected as being incomplete and/or complicating the client run
> significantly.
>
>> http://projects.puppetlabs.com/issues/4817
>>
>> In that model, maybe we automatically create a 'plugins' stage that goes
>> before any of the existing stages and add a file resource for each of the
>> modules we are using. Again, this doesn't really fix it for you, because
>> your facts are still combined, and they all have to be maintained
>> separately from the rest of the ruby code, but it gets you close.
>>
>> The only other option I can think of, and this is something that multiple
>> people at Puppet Labs want but is very far away, is to have partially
>> compiled catalogs that resolve as much as possible without having the
>> facts.
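[Editor's note: the class-to-module step in Luke's first option is at least mechanical, since Puppet's autoloading convention puts class a::b::c in module a. A minimal sketch, with a hypothetical helper name; the transitive-include problem Luke raises is precisely what this cannot capture.]

```ruby
# Map class names assigned by the node classifier to the modules that
# define them, using the autoloader convention: "apache::mod::ssl" lives
# in module "apache". Only directly assigned classes are covered; anything
# a class includes transitively is invisible here, which is the objection
# raised above.
def modules_for_classes(class_names)
  class_names.map { |name| name.split('::').first }.uniq.sort
end
```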
>>
>> I'm not actually convinced this is possible in anything resembling a
>> reasonable way, but even if it is, it's far enough away that you
>> shouldn't bank on it.
>
> I'm not quite convinced it's possible either, but I've dreamt of this
> too... :)
>
> Thinking about this issue in conjunction with the feature request for
> Puppet to understand package dependencies natively does make me wonder
> if we need to aim for a more dynamic system than our current catalog
> can provide.

--
You received this message because you are subscribed to the Google Groups
"Puppet Developers" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to [email protected].
For more options, visit this group at
http://groups.google.com/group/puppet-dev?hl=en.
