On Wed, Mar 10, 2010 at 11:24:46PM +0000, Lyle wrote:
> I've just had a new customer that is importing a member list of over
> 20,000 emails.

I do hope that your customer can prove that they all opted in :-)

> At the moment the emailing portion of my script is pretty basic, it
> just loops through the members, piping off individual emails to
> sendmail or SMTP. Obviously with 20,000 emails, this is going to
> reach the CGI timeout limit, so I need to adopt a new strategy.
> I'm sure some of the people on here will have set ways of dealing
> with much larger email lists than this.

Do it asynchronously - that is, the script triggers some other job that
then runs in the background. eg, you could fork, then in the child,
close STDIN/OUT/ERR, and have the child process them in the background.

If you need status updates, then the child could either drop them in a
file or in a database, or if you want to get fancy, the child could
listen for connections on a socket and, when anything connects to the
socket, let it know what's going on. The thingy that connects to that
socket could be another CGI that then passes the information on to the
user.

Or the script could create an 'at' job. Or it could put something into
a jobs queue in a database. Or ...

FWIW, I'd go down the database route, but if you want maximum
portability, including to legacy platforms, then you'll need a
pure-perl solution. fork() may need to be emulated with some Win32::*
shenanigans in that case.

-- 
David Cantrell | Nth greatest programmer in the world

Do not be afraid of cooking, as your ingredients will know and misbehave
   -- Fergus Henderson

_______________________________________________
BristolBathPM mailing list
[email protected]
http://mailman.bristolbath.org/mailman/listinfo/bristolbathpm
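[Editor's note: a minimal sketch of the fork-and-detach approach described above - the CGI forks, the parent returns to the browser, and the child closes its standard handles and works through the list in the background, dropping status into a file. `send_one_mail` and `write_status` are hypothetical stand-ins for the script's existing sendmail/SMTP and status code, not part of the original mail.]

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

# Stand-ins: replace with your real mail-sending and status-file code.
sub send_one_mail { my ($to)  = @_; }
sub write_status  { my ($msg) = @_; }

# Fork a background mailer. The parent gets the child's pid back and
# can finish the CGI response immediately; the child detaches and
# loops over the member list on its own.
sub spawn_mailer {
    my @members = @_;

    defined( my $pid = fork() ) or die "fork failed: $!";
    return $pid if $pid;    # parent: hand back pid, carry on with CGI

    # Child: detach from the web server's session and close the
    # standard handles so the server doesn't sit waiting on us.
    setsid();
    open STDIN,  '<', '/dev/null' or die "reopen STDIN: $!";
    open STDOUT, '>', '/dev/null' or die "reopen STDOUT: $!";
    open STDERR, '>', '/dev/null' or die "reopen STDERR: $!";

    my $done = 0;
    for my $email (@members) {
        send_one_mail($email);
        $done++;
        # Progress dropped in a file (or database), as suggested above,
        # so another CGI can report it to the user.
        write_status( "$done/" . scalar(@members) );
    }
    exit 0;    # child must not fall back into the CGI code
}
```

Note this assumes a Unix-ish platform; as the mail says, on legacy Win32 perls fork() is emulated and may need different handling.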
