Hey Chris,

If you are up for the challenge, you can try a hybrid of squid plus a local repo.
A plain local repo relies on rsync, which copies everything.
Instead, you can write a script that filters a list of mirror URLs and prepares a "fetch" list, so that only the *.rpm files are fetched, from one of a couple of mirrors, into the local repo. For each requested file, it would first verify whether the file already exists in the local repo, and if it does, redirect the client (transparently or with a 302 redirect) to the local server.
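A minimal sketch of the redirect side, written as a squid url_rewrite helper in Python. The paths, the repo.local hostname, the helper filename, and the old one-line "302:" reply syntax are all assumptions for illustration; adjust them for your squid version and layout.

```python
#!/usr/bin/env python
# Sketch of a squid url_rewrite_program helper that 302-redirects RPM
# requests to a local repo when the file is already there on disk.
# LOCAL_ROOT, LOCAL_BASE and the --helper flag are assumptions.
import os
import sys

LOCAL_ROOT = "/var/www/repo"       # where the fetch script drops RPMs
LOCAL_BASE = "http://repo.local"   # local server that serves LOCAL_ROOT

def rewrite(url):
    """Return a '302:<url>' reply for RPMs we hold locally, else ''."""
    if not url.endswith(".rpm"):
        return ""                  # empty reply = leave the URL untouched
    # strip the scheme and mirror hostname, keep the repo path
    path = "/" + url.split("/", 3)[-1]
    if os.path.isfile(LOCAL_ROOT + path):
        return "302:" + LOCAL_BASE + path
    return ""

def main():
    # squid feeds one request per line: "URL client/fqdn ident method"
    for line in sys.stdin:
        sys.stdout.write(rewrite(line.split()[0]) + "\n")
        sys.stdout.flush()

if __name__ == "__main__" and "--helper" in sys.argv:
    main()   # loop only when squid actually invokes the helper
```

squid would then point at it with something like `url_rewrite_program /usr/local/bin/rpm_redirect.py --helper` (the helper path is an assumption).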

You can do something similar with nginx to store the files permanently, along the lines of:
https://code.google.com/p/youtube-cache/source/browse/#svn%2Ftrunk%2Fnginx
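For comparison, the nginx side could be a small proxy_store setup in the spirit of that project; the hostnames and paths below are assumptions, not taken from the linked code:

```nginx
# Sketch: nginx as a permanent store for fetched RPMs via proxy_store.
server {
    listen 80;
    server_name repo.local;
    root /var/www/repo;

    location ~ \.rpm$ {
        error_page 404 = @fetch;   # not on disk yet -> go fetch it
    }

    location @fetch {
        proxy_pass http://mirror.centos.org;   # one fixed upstream mirror
        proxy_store /var/www/repo$uri;         # keep the file permanently
        proxy_store_access user:rw group:rw all:r;
    }
}
```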

The main issue is the RPMs themselves; the repodata (the SQLite/XML metadata and other repo-related files) should be handled by plain squid caching only.

Email me if you'd like to hear more about the idea.

Eliezer

On 09/29/2014 09:19 PM, Les Mikesell wrote:
I don't think there is a way to do it that doesn't take more human
effort than it is worth unless you have limited internet access.  It
is basically designed not to work.   A simple squid proxy with the
file size bumped up will work with no extra attention (and be useful
for all your internet accesses), but the first dozen or so runs are
probably going to pick different mirror URLs instead of reusing the
copy you have already cached. You can change the repo mirrorlist entry
to a fixed system - but then your updates will break if it is down.
Or you can mirror a bunch of stuff you'll never need into your own
repo.  Or set up some special-case thing that only works for CentOS -
or maybe even just one version of CentOS.
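The "file size bumped up" tweak would look roughly like this in squid.conf (sizes and times here are illustrative assumptions, not recommendations):

```
# Allow large RPMs into the cache instead of squid's small default
cache_dir ufs /var/spool/squid 20000 16 256
maximum_object_size 512 MB

# cache .rpm files aggressively; let repomd/metadata expire normally
refresh_pattern -i \.rpm$ 129600 100% 129600 ignore-reload
refresh_pattern . 0 20% 4320
```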

_______________________________________________
CentOS mailing list
CentOS@centos.org
http://lists.centos.org/mailman/listinfo/centos
