However, in some cases you may want to use Perl for this. If removing the files is part of a larger Perl effort, calling find through the system or exec functions could consume more resources than using native Perl calls. If you want to do it all in Perl, look at the stat function to get file modification times, then use unlink to remove the files once you've determined which ones you want to delete.
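To make that concrete, here is a minimal, untested sketch of the pure-Perl approach. The directory name and the 10-day threshold are placeholders for illustration; it uses the -M file test (age in days at script start), which is a convenient front end to the same mtime that stat reports:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Remove plain files in $dir whose modification age exceeds $days days.
# The directory and threshold here are assumptions for illustration.
sub remove_old_files {
    my ($dir, $days) = @_;
    my @removed;
    opendir my $dh, $dir or die "Cannot open $dir: $!";
    for my $name (readdir $dh) {
        my $path = "$dir/$name";
        next unless -f $path;        # plain files only; skips . and ..
        # -M gives the file's age in days relative to script start time;
        # (stat $path)[9] would give you the raw mtime instead.
        if (-M $path > $days) {
            unlink $path or warn "Could not unlink $path: $!";
            push @removed, $path;
        }
    }
    closedir $dh;
    return @removed;
}

remove_old_files('/path/to/archive/logs', 10);   # placeholder path
```

Note this only looks at one directory; it does not recurse the way find does.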
I am replying to this post from home, but once I get into work I will post a copy of a script we use to periodically remove Oracle archive logs after they've been backed up to tape.
Travis
On Thursday, November 6, 2003, at 08:00 PM, Michael A Nachbaur wrote:
On Thursday 06 November 2003 04:36 pm, Craig Sharp wrote:
Hi,
I guess the subject says it all. I have a set of directories on Unix. I need to remove all files that are older than 10 days from each directory.
I am having a brain lock problem and cannot get started.
Any ideas would be a big help.
You don't actually need perl for this. You can just use the Unix "find"
command. For example (I haven't tested this, so caveat emptor):
find /root/to/search/from -ctime +10 -exec rm '{}' ';'
I don't know if I have the -ctime argument right, but basically you want to
say: for every file not updated within the last ten days, execute "rm" on it.
Anyone know for certain what the correct syntax would be for this?
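For the record, if you do end up doing this in Perl anyway, the recursive behaviour of find can be reproduced with the core File::Find module. This is an untested sketch under stated assumptions: the helper name is made up, and -M is only roughly equivalent to find's -mtime test (it measures age in fractional days from script start):

```perl
use strict;
use warnings;
use File::Find;

# Recursively remove plain files modified more than $days days ago,
# roughly what `find $top -mtime +10 -exec rm '{}' ';'` does.
# prune_old_files is a hypothetical helper name for illustration.
sub prune_old_files {
    my ($top, $days) = @_;
    find(sub {
        # Inside the wanted sub, $_ is the file name and the
        # current directory is the file's directory.
        return unless -f $_;
        if (-M $_ > $days) {
            unlink $_
                or warn "Could not unlink $File::Find::name: $!";
        }
    }, $top);
}

prune_old_files('/root/to/search/from', 10);
```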
--
/* Michael A. Nachbaur <[EMAIL PROTECTED]>
 * http://nachbaur.com/pgpkey.asc */
...[Arthur] leapt to his feet like an author hearing the phone ring...
_______________________________________________
Perl-Unix-Users mailing list
[EMAIL PROTECTED]
To unsubscribe: http://listserv.ActiveState.com/mailman/mysubs