The other approaches mentioned were certainly good beginnings, but I am 
wondering if something faster and easier is possible...

Is it a *requirement* or just an assumption that the data be stored in a 
single file?  What about creating an individual file for each date? Then, 
rather than searching through one large file for individual lines, you can 
simply delete any files older than 30 days. Depending on what you are 
trying to do and how large the files are, this approach could be much 
faster to code and to run.
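
Something like this untested sketch, assuming one data file per day named 
after the date (e.g. 2003-06-15.db) in a directory of its own (the names 
and paths here are only examples):

use strict;
use warnings;

# One file per day, e.g. c:\apache\members\2003-06-15.db
# (adjust the naming scheme and directory to suit your setup)
my $dir = 'c:\apache\members';

opendir my $dh, $dir or die "Can't open $dir: $!";
for my $file (readdir $dh) {
    next unless $file =~ /\.db$/;       # only touch the data files
    my $path = "$dir\\$file";
    # -M gives the file's age in days since it was last modified
    if (-M $path > 30) {
        unlink $path or warn "Can't delete $path: $!";
    }
}
closedir $dh;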

On a side note, you might check out Date::Calc from CPAN; there are other 
date manipulation modules, but that one has always worked well for me.
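
For example, assuming the date in each record is stored as YYYY-MM-DD 
(adjust the splits if your format differs), an untested 30-day purge of 
the single-file layout might look something like:

use strict;
use warnings;
use Date::Calc qw(Delta_Days Today);

my $database = 'c:\apache\members.db';

open my $in, '<', $database or die "Can't read $database: $!";
my @keep;
while (my $line = <$in>) {
    # records look like "username | password | date"
    my ($user, $pass, $date) = split /\s*\|\s*/, $line;
    chomp $date;
    my ($y, $m, $d) = split /-/, $date;
    # Delta_Days returns how many days lie between the stored date and today
    push @keep, $line if Delta_Days($y, $m, $d, Today()) <= 30;
}
close $in;

open my $out, '>', $database or die "Can't write $database: $!";
print $out @keep;
close $out;

You would do the same sort of filtering on the .htpasswd file, matching on 
the username, and in a real script you would want the same flock()ing that 
your writer already does.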

http://danconia.org/



[EMAIL PROTECTED] wrote:
> Hi
> 
> I have a script that writes a username, password and date to a database
> file
> 
> Like this
> 
> use strict;
> use Fcntl qw(LOCK_EX LOCK_UN);
> 
> my $database = 'c:\apache\members.db';
> 
> # $username, $password and $date are set earlier in the script.
> # Everything is OK, let's write to the database.
> 
> open (DATABASE, ">>$database") or die "Can't open $database: $!";
> flock (DATABASE, LOCK_EX);
> print DATABASE "$username | $password | $date\n";
> flock (DATABASE, LOCK_UN);
> close (DATABASE);
> 
> What I'm trying to do is write a routine that will check the date and, if
> it is more than 30 days old, delete that user's info from the database
> and from the .htpasswd file.
> 
> I have Perl for Dummies, the Perl Cookbook, and the Perl Black Book.
> Could someone please point me in the right direction?
> 
> Thanks
> 



