Hi Todd, Mark, everyone,

I have another tricky situation to solve. I have a huge third-party code repository (over 2 GB of code) that I need checked out on each of my 40 build machines (for different platforms, of course). This repository is compiled every time there's a code change, and the resulting libraries are used to compile my product's code.
The problem is that I need to run cvs update on each of the machines early in the morning, before I start building my product. The update alone takes over 40 minutes, and since all the machines start it at around the same time, we're seeing a real bandwidth crunch.

I was thinking of this workaround and would appreciate your criticism: run the update on only one machine, pull all the U (updated) and P (patched) entries out of the update log, create a tarball of those files, push it over the LAN to the other 39 machines, and untar it there (rough sketch below). That would serve my purpose, but when I later try updating the working copies on those 39 machines, I will naturally get C (conflict) entries for the files that were replaced.

Is there any way around this problem? Is my idea all right? Has anyone here solved this kind of problem before? Please share your thoughts!

-Chaitanya
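
P.S. Here is roughly what I had in mind, as an untested sketch. The sandbox path, the tarball name, and the exact cvs flags are only placeholders for illustration, not what we actually run:

    #!/usr/bin/env python
    # Untested sketch: run the nightly update once on a single machine,
    # collect the files CVS reports as U (updated) or P (patched), and
    # pack them into a tarball to push to the other build machines.
    import os
    import subprocess
    import tarfile

    SANDBOX = "/build/3rdparty"            # hypothetical checkout location
    TARBALL = "/tmp/nightly-update.tar.gz" # hypothetical output file

    # Capture the cvs update output; the global -q flag keeps the
    # per-directory noise down so mostly the U/P/C lines remain.
    out = subprocess.check_output(["cvs", "-q", "update", "-d"], cwd=SANDBOX)

    # Keep only the files that were actually changed by this update.
    changed = [line[2:].strip()
               for line in out.decode("utf-8", "replace").splitlines()
               if line.startswith(("U ", "P "))]

    # Tar up just those files, with paths relative to the sandbox root,
    # so the archive can be untarred over the trees on the other machines.
    with tarfile.open(TARBALL, "w:gz") as tar:
        for relpath in changed:
            tar.add(os.path.join(SANDBOX, relpath), arcname=relpath)

    print("Packed %d updated files into %s" % (len(changed), TARBALL))

On the other 39 machines the tarball would simply be untarred over the existing working copies, which is where the later C entries come from.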
