We also use Pulp in this exact way. Puppet drops repo definitions onto each
host, associated with the host's Puppet environment (dev, qa, prod). The repo
definitions point to "snapshot" repositories in Pulp, and we promote packages
up through the environments as you are describing.
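For illustration, a repo definition of that shape might look like the sketch below. The hostname, repo ID, and snapshot path layout are all hypothetical; the actual paths depend on how the repos are published in Pulp.

```ini
# /etc/yum.repos.d/pulp-base.repo (dropped by Puppet; values vary per environment)
[pulp-base-qa]
name=Base packages (qa snapshot)
baseurl=http://pulp.example.com/pulp/repos/qa/base/
enabled=1
gpgcheck=1
gpgkey=http://pulp.example.com/pulp/gpg/RPM-GPG-KEY-example
```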

We do not use the pulp-consumer client. Instead, we trigger "yum update" runs
on groups of hosts using mCollective.
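As a sketch, triggering those updates might look like the commands below. The fact name (`environment`) and the presence of the package and shell agent plugins are assumptions about the setup, not something the thread confirms.

```shell
# Update one package on all qa hosts (requires the mcollective package agent)
mco package update mypackage -F environment=qa

# Or run a full yum update via the shell agent plugin, if it is installed
mco shell run "yum -y update" -F environment=qa
```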

Josh

From: [email protected] [mailto:[email protected]] On 
Behalf Of Trey Dockendorf
Sent: Sunday, January 25, 2015 12:22 PM
To: Mathew Crane
Cc: [email protected]
Subject: Re: [Pulp-list] Using Pulp in a server-only configuration?


Your use case matches exactly how we use Pulp to manage repo contents for an
HPC cluster where a consumer service is not possible. I've had no issues; I
just push out repo files for all Pulp-managed repos using Puppet. Since I'm
still using self-signed certs in Pulp and our network is private, I made sure
to serve all repos via plain HTTP.

- Trey
On Jan 21, 2015 3:06 PM, "Mathew Crane"
<[email protected]> wrote:
In my environment, it doesn't really make sense to have a single point
propagating changes to numerous hosts. Instead we'd opt to have the consumers
pull down from the Pulp server manually. I understand that this gives up a
portion of Pulp's feature set (consumer management and reporting), but what I'm
more interested in is the ability to manually 'promote' packages into different
repos, with required or updated deps, on the server. Is there any downside to
keeping the consumers 'dumb' and hitting the Pulp-managed repositories manually
via standard /etc/yum.repos.d/*.repo files?
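Concretely, with nothing but a plain repo file like the hypothetical one below, hosts can pull updates on their own schedule and no consumer agent is involved. The server name and repo path here are made up for the example.

```ini
# /etc/yum.repos.d/pulp.repo (example; adjust baseurl to the published repo)
[pulp-managed]
name=Pulp-managed repository
baseurl=https://pulp.example.com/pulp/repos/prod/
enabled=1
gpgcheck=1
```

A plain `yum update` (or `yum update --disablerepo='*' --enablerepo=pulp-managed` to restrict the run to this repo) then fetches whatever packages have been promoted on the server.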

_______________________________________________
Pulp-list mailing list
[email protected]<mailto:[email protected]>
https://www.redhat.com/mailman/listinfo/pulp-list
