Issue #2681 has been updated by Rein Henrichs.

Branch changed from http://github.com/reinH/puppet/tree/tickets/master/2681 to http://github.com/reinH/puppet/tree/tickets/0.25.x/2681

Updated: http://github.com/reinH/puppet/tree/tickets/0.25.x/2681
----------------------------------------
Bug #2681: "Duplicate generated resource;skipping" for each managed resource
http://projects.reductivelabs.com/issues/2681

Author: Marc Fournier
Status: Code Insufficient
Priority: Normal
Assigned to: Rein Henrichs
Category: plumbing
Target version: 0.25.2
Affected version: 0.25.1rc1
Keywords: 
Branch: http://github.com/reinH/puppet/tree/tickets/0.25.x/2681


I've noticed a new warning emitted on clients with 0.25.0 or 0.25.1rc1:

<pre>
info: /Host[localhost.localdomain]: Duplicate generated resource; skipping
info: /Host[test.example.com]: Duplicate generated resource; skipping
</pre>

This message is emitted by generate_additional_resources() at lib/puppet/transaction.rb:362.
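For illustration, here is a minimal sketch (not the actual Puppet source) of the kind of duplicate check that produces this message: generated resources are keyed by their reference string, and a second resource arriving under an existing key is skipped with an info-level warning. The class and method names here are hypothetical.

```ruby
# Hypothetical sketch of duplicate detection for generated resources.
# A catalog keeps resources keyed by reference; adding a key twice
# triggers the "Duplicate generated resource; skipping" message.
class SketchCatalog
  def initialize
    @resources = {}
  end

  # Returns true if the resource was added, false if it was a duplicate.
  def add_generated(ref)
    if @resources.key?(ref)
      puts "info: #{ref}: Duplicate generated resource; skipping"
      false
    else
      @resources[ref] = true
      true
    end
  end
end

catalog = SketchCatalog.new
catalog.add_generated("/Host[localhost.localdomain]")  # added
catalog.add_generated("/Host[localhost.localdomain]")  # duplicate; warning printed
```

The bug report suggests that when the catalog comes from the puppetmaster, every purged-type resource hits the duplicate branch once per run.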

The manifest on the server side:

<pre>
node 'test.example.com' {

  resources { "host":
    purge => true,
  }

  host { "$fqdn":
    ip => $ipaddress,
    alias => $hostname,
  }

  host { "localhost.localdomain":
    ip => "127.0.0.1",
    alias => "localhost",
  }
}
</pre>

This happens for every resource type managed (purged) via the "resources" type.

NB: this warning doesn't occur when running puppet standalone against a file 
containing the same resources. It only happens when puppetd fetches the 
catalog from the puppetmaster.




You received this message because you are subscribed to the Google Groups 
"Puppet Bugs" group.