Hi,

>> > Questions aside, this is theoretically doable. You'll need to be set up
>> > for pulp's deferred content download feature, which includes deploying
>> > squid or an equivalent proxy.
>> >
>> > http://docs.pulpproject.org/user-guide/deferred-download.html
>> >
>> > You would first restore your database, and then create a repo in pulp
>> > for each of these backup repos. For each one:
>> >
>> > - set the download policy to "on_demand"
>> > - sync. This should discover that each content unit is already in the
>> >   database, associate it to the repo, and populate the on_demand catalog
>> >   with knowledge of its location in this giant feed
>> > - run the download_repo task with the "verify_all_units" option set to
>> >   True. This will go through each file of each unit, discover it's
>> >   missing, and then download it from the link that was cataloged above.
>> >   http://docs.pulpproject.org/dev-guide/integration/rest-api/repo/sync.html#download-a-repository
>> > - delete your "backup" repos from pulp
>> >
>> > This is only possible for yum repos currently, until support for
>> > deferred download is added to other plugins.
>> >
>> > If you do go through with this as a plan, let us know how testing goes,
>> > and what tips you would have for the next person who tries it.

> Great, that was the kind of idea I was looking for, thanks a lot!
> I guess the temporary repo will need a different feed URL from that of
> the saved repo.
> OK, I will try it and give some feedback.
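For reference, here is a rough sketch of the procedure quoted above against
the Pulp 2 REST API. The repository, sync and download endpoints follow the
docs linked in the quote; the host, credentials, repo id, feed URL and the
task-polling fields below are placeholders and assumptions to adapt to your
own installation, so double-check them against your Pulp version before use.

#!/usr/bin/env python
"""Sketch: restore RPM content by re-creating a repo with an on_demand
importer, syncing it, running the download task with verify_all_units,
and then deleting the temporary repo. All names below are placeholders."""
import time
import requests

BASE = "https://pulp.example.com/pulp/api/v2"   # placeholder Pulp server
AUTH = ("admin", "admin")                       # placeholder credentials
VERIFY = False                                  # self-signed cert on test boxes

REPO_ID = "backup-repo-1"                       # placeholder repo id
FEED = "http://mirror.example.com/giant-backup-feed/"   # placeholder feed


def wait(resp):
    """Poll any spawned tasks until they stop waiting/running (v2 task API)."""
    for task in resp.json().get("spawned_tasks", []):
        url = "%s/tasks/%s/" % (BASE, task["task_id"])
        while requests.get(url, auth=AUTH, verify=VERIFY).json()["state"] in (
                "waiting", "running"):
            time.sleep(5)


# 1. Create the repo with the yum importer and download_policy "on_demand".
requests.post(BASE + "/repositories/", auth=AUTH, verify=VERIFY, json={
    "id": REPO_ID,
    "importer_type_id": "yum_importer",
    "importer_config": {"feed": FEED, "download_policy": "on_demand"},
})

# 2. Sync: re-associates units already in the database and fills the
#    on_demand catalog with their location in the backup feed.
wait(requests.post(BASE + "/repositories/%s/actions/sync/" % REPO_ID,
                   auth=AUTH, verify=VERIFY, json={}))

# 3. Run the download task with verify_all_units so missing files are fetched.
wait(requests.post(BASE + "/repositories/%s/actions/download/" % REPO_ID,
                   auth=AUTH, verify=VERIFY, json={"verify_all_units": True}))

# 4. Delete the temporary "backup" repo; the downloaded content stays on disk.
requests.delete(BASE + "/repositories/%s/" % REPO_ID, auth=AUTH, verify=VERIFY)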
I was about to answer that I was unsuccessful, but in the end I tried with
the default "immediate" download policy and it worked! So, the method is
simply:

* recreate another repo with the same feed (change the relative_url), using
  the default download_policy
* sync
* delete the new repo

That's it, the RPMs have been restored in /var/lib/pulp/content/units/rpm/

Thanks a lot,
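For reference, a minimal sketch of these simpler steps against the Pulp 2
REST API, again with placeholder host, credentials, repo id and feed URL.
The yum_distributor's relative_url is where the temporary repo is given a
path that does not clash with the saved repo, and with the default
("immediate") policy the sync itself pulls the RPMs back into
/var/lib/pulp/content/units/rpm/. Field names and payloads are assumptions
to check against your Pulp version.

import time
import requests

BASE = "https://pulp.example.com/pulp/api/v2"   # placeholder Pulp server
AUTH = ("admin", "admin")                       # placeholder credentials
VERIFY = False                                  # self-signed cert on test boxes

# Create a temporary repo with the same feed as the saved repo, the default
# (immediate) download policy, and a non-clashing relative_url.
requests.post(BASE + "/repositories/", auth=AUTH, verify=VERIFY, json={
    "id": "restore-tmp",
    "importer_type_id": "yum_importer",
    "importer_config": {"feed": "http://mirror.example.com/saved-repo-feed/"},
    "distributors": [{
        "distributor_type_id": "yum_distributor",
        "distributor_config": {"relative_url": "restore-tmp",
                               "http": False, "https": True},
        "auto_publish": False,
    }],
})

# Sync: every RPM is downloaded during the sync itself.
sync = requests.post(BASE + "/repositories/restore-tmp/actions/sync/",
                     auth=AUTH, verify=VERIFY, json={})

# Wait for the spawned sync task to finish before cleaning up.
for task in sync.json().get("spawned_tasks", []):
    url = "%s/tasks/%s/" % (BASE, task["task_id"])
    while requests.get(url, auth=AUTH, verify=VERIFY).json()["state"] in (
            "waiting", "running"):
        time.sleep(5)

# Delete the temporary repo; the restored content stays on disk.
requests.delete(BASE + "/repositories/restore-tmp/", auth=AUTH, verify=VERIFY)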
