On Thu, Jul 15, 2010 at 01:58:16PM -0400, Kyle McDonald wrote:
> Once I do have the repository mirrored. I'm guessing from the lack of
> detail in the rsync command, that I will have a copy of every pkg from
> every build that used IPS?
Depends which repository you're mirroring.

> Previously I only kept on hand the DVD ISOs for the last 5 sNV builds.
>
> So my question is, either while I'm downloading it, or after I've gotten
> it, is there any way to prune out the older builds I don't care about?
>
> I don't really need a local mirror of every build ever made, and I'd
> like to not use up the disk space if possible.

Using ISOs and repo-on-a-stick is a good way to stand up a repository if
you're trying to mirror just a few specific builds. Shawn is working on a
slick set of changes that will allow you to compose multiple repositories
into a unified catalog. In that situation, you wouldn't have to store
every build in the same repository.

> Another user pointed me to some scripts that will fully mirror the whole
> repository. These do allow me to only download a specific build, but the
> only way I can think of to be able to delete old builds is to create a
> separate repository for each build.

This is how we do it today for our snapshots and nightly builds.

> This is doable, but doesn't seem that efficient since it appears that
> not all packages are built with each build. (At least my tests of only
> downloading the 'latest' or even only b134 have also gotten some b133
> and some b111 packages, and when I check it doesn't appear that there
> are b134 versions of those.)
>
> Of course ZFS's dedup feature would eliminate that inefficiency, but if
> possible, it'd be nicer to be able to maintain a single mirror repo, and
> prune old builds every now and then.

I'm not sure what our plans for removing packages from a repository are.
Shawn might be able to tell you more.

What problem are you actually trying to solve? We've come up with a number
of different ways to mirror package content without using rsync. Several of
these methods are things that you could deploy now; a few more require us
to write some more code.
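To make the pruning question concrete: each package's build is encoded in
the branch part of its FMRI version string (e.g. `-0.134:` for b134). A
rough sketch below shows how one might group a repository's FMRIs by build
and list pruning candidates. This is not part of pkg(5) — the FMRI strings
and the `stale` helper are illustrative assumptions — and, as noted above,
a newer build's catalog can still reference packages last published in an
older build, so candidates would need dependency checking before removal.

```python
import re

# An IPS FMRI embeds the build in the branch component of the version,
# e.g. pkg:/SUNWcsl@0.5.11,5.11-0.134:20100301T224423Z -> build 134.
BRANCH_RE = re.compile(r'-0\.(\d+):')

def build_of(fmri):
    """Return the build number encoded in an FMRI's branch, or None."""
    m = BRANCH_RE.search(fmri)
    return int(m.group(1)) if m else None

def stale(fmris, keep_builds):
    """Return FMRIs whose build is outside keep_builds.

    Caution: a kept build may still depend on packages last published
    in an older build (the b133/b111 packages seen above), so these are
    only *candidates* for pruning, not safe deletions.
    """
    return [f for f in fmris if build_of(f) not in keep_builds]

fmris = [
    'pkg:/SUNWcsl@0.5.11,5.11-0.134:20100301T224423Z',
    'pkg:/SUNWcsu@0.5.11,5.11-0.133:20100210T120000Z',
    'pkg:/SUNWzfs@0.5.11,5.11-0.111:20090501T090000Z',
]
print(stale(fmris, {133, 134}))  # only the b111 package is a candidate
```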
If you could explain a bit more about what you're trying to do, we might
be able to point you in a different direction.

If you're really just interested in caching recently built bits to speed
up download times, using an HTTP cache is a very good option. If you clean
the cache every couple of weeks, you'll only have content from recent
builds, and as long as your users download frequently enough to keep the
cache hot, performance will be good. I'd need to know more about what
you're doing to know if this would actually be a solution, but it is one
that we've tested here, and know that it works.

-j

_______________________________________________
pkg-discuss mailing list
[email protected]
http://mail.opensolaris.org/mailman/listinfo/pkg-discuss
