Hello: First off -- I must say thanks to everyone who has helped me in the past -- I've learned a lot from the users listserv, and my questions are always answered (incredibly!) quickly.
I'm running into major speed issues when generating a fileset during a copy process I'm using to back up files during a deployment. The exact code follows:

    <copy todir="${deploy.backupPath.root}">
        <fileset dir="${deploy.path.root}"
                 excludes="**/${pathnames.wwwroot}/attributes/**,**/${pathnames.wwwroot}/images/**,**/${pathnames.wwwroot}/text/**">
            <include name="${pathnames.bin}/**" />
            <include name="${pathnames.certs}/**" />
            <include name="${pathnames.conf}/**" />
            <include name="${pathnames.cybersource}/**" />
            <include name="${pathnames.data}/**" />
            <include name="${pathnames.datadefn}/**" />
            <include name="${pathnames.properties}/**" />
            <include name="${pathnames.wwwroot}/**" />
        </fileset>
    </copy>

This fileset walks 520 sub-folders and around 69,000 files *including* the files in the excluded folders. On my test environment -- a significantly slower machine, but with 1/8th the number of files -- this runs incredibly fast, yet on the deployment server it takes upwards of 15-20 minutes just to generate the list. The only explanation I can think of is that it's iterating through the excluded folders as well (if I leave out the excluded folders, the file count drops by 2/3), and that's what makes it so slow.

Is there another way I should be handling this? My only requirement is that the exclude list stay fairly easy to modify (it changes depending on the location of the build I'm doing). Has anyone else had problems with slow fileset generation? The hope is that automating this process (done by hand now) will speed things up, but at 15-20 minutes for just the backup step it's definitely not any faster!! Simply copying the same files in Windows Explorer takes only about 4-5 minutes.

Any tips would be GREATLY appreciated!

Thanks,
Brent
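P.S. In case it helps frame answers: one idea I've been toying with (untested, and the "deploy.excludes" id is just a name I made up) is pulling the excludes out into a standalone <patternset> so they can be redefined per-build, and anchoring the patterns at ${pathnames.wwwroot} rather than starting them with **/, on the off chance that lets the directory scanner skip those trees entirely. A rough sketch:

    <!-- excludes kept in one reusable patternset; redefine it (or generate it
         from a per-location properties file) to change the list for a build -->
    <patternset id="deploy.excludes">
        <exclude name="${pathnames.wwwroot}/attributes/**" />
        <exclude name="${pathnames.wwwroot}/images/**" />
        <exclude name="${pathnames.wwwroot}/text/**" />
    </patternset>

    <copy todir="${deploy.backupPath.root}">
        <fileset dir="${deploy.path.root}">
            <!-- pull in the shared excludes by reference -->
            <patternset refid="deploy.excludes" />
            <!-- same <include> list as above -->
            <include name="${pathnames.bin}/**" />
            <include name="${pathnames.wwwroot}/**" />
        </fileset>
    </copy>

I gather <fileset> also takes an excludesfile attribute pointing at a plain-text list of patterns, which might be even easier to maintain per-location -- but I honestly don't know whether either change would help the scanning speed, so corrections welcome.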