> -----Original Message-----
> From: Max Clark [mailto:[EMAIL PROTECTED]]
> Sent: Thursday, August 22, 2002 12:41 AM
> To: [EMAIL PROTECTED]
> Subject: Help parsing a large file
>
>
> Hi,
>
> I am trying to parse a large text file (10 MB, 400K lines) with the
> following code. This is running FreeBSD-Stable (dual proc, 1GB ram),
> however I keep receiving messages that I am out of memory, or that the
> "query timed out". I need to parse a file with email addresses to sort
> out garbage.
>
> How can I do this better?
>
> Thanks in advance,
> -Max
>
> #!/usr/bin/perl -w
>
> use Email::Valid;
>
> open (GOOD, ">valid.good") || die "$!";
> open (BAD, ">valid.bad") || die "$!";
>
> while (<>) {
>     if (Email::Valid->address( -address => $_,
>                                -mxcheck => 1,
>                                -fudge   => 1 )) {
>         print GOOD;
>     } else {
>         print BAD;
>     }
> }
Your code is fine. There is possibly a memory leak in Net::DNS. See:

    http://groups.google.com/groups?selm=m3901yxau6.fsf%40gw.thesalmons.org

If you take off -mxcheck, does the problem go away?

If you need mxcheck, can you use nslookup? Try setting:

    $Email::Valid::DNS_Method = 'nslookup';

Or, you may have to arrange to validate the addresses in batches.
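For the batching idea, one approach is to fork a short-lived child for
every chunk of addresses, so that whatever memory Net::DNS leaks is
reclaimed by the OS when each child exits. A rough, untested sketch
(the 10_000 batch size is a guess; tune it to your box):

    #!/usr/bin/perl -w
    use strict;
    use Email::Valid;

    my $BATCH = 10_000;    # assumed batch size; adjust as needed
    my @batch;

    sub validate_batch {
        my @addrs = @_;
        defined(my $pid = fork) or die "fork: $!";
        return if $pid;    # parent: keep reading input

        # Child: any memory leaked during the lookups dies with us.
        # Append so successive children don't clobber earlier output.
        open my $good, '>>', 'valid.good' or die "valid.good: $!";
        open my $bad,  '>>', 'valid.bad'  or die "valid.bad: $!";

        for my $addr (@addrs) {
            if (Email::Valid->address( -address => $addr,
                                       -mxcheck => 1,
                                       -fudge   => 1 )) {
                print $good $addr;
            } else {
                print $bad $addr;
            }
        }
        exit 0;            # don't let the child fall back into the main loop
    }

    while (<>) {
        push @batch, $_;
        if (@batch >= $BATCH) {
            validate_batch(@batch);
            @batch = ();
            wait;          # one child at a time keeps memory flat
        }
    }
    validate_batch(@batch) if @batch;
    1 while wait != -1;    # reap any remaining children

Running the children one at a time (the wait after each fork) also
keeps the output files from interleaving.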