On Thu, 2003-10-09 at 12:13, Dan Muey wrote:
> > Hello everyone,
> 
> Howdy
> 
> > 
> > We use the Barracuda Spam appliance (barracudanetworks.com) 
> > to filter our spam and their web based interface is written 
> > in Perl.  They have a form that allows the user to search 
> > messages for key words.  Evidently it stores each 
> > message in a file in a directory, and when trying to search 
> > several hundred thousand messages for a word the response back
> > is:
> > 
> > egrep: argument list too long
> > 
> 
> If I was trying to grep a zillion files at once and it wouldn't 
> let me I'd probably grep them one at a time.
> For instance, via backtick execution you might have:
> 
> my @matchedlines = qx(grep "$string" /files/*);
> # a really bad way to do this (it still builds one huge
> # argument list) but for example's sake...
> 
> You could do:
> 
> for (`ls /files/`) {
>       chomp;  # strip the trailing newline ls leaves on each name
>       if(`grep "$string" /files/$_`) { push(@matchedfiles, $_); }
> }

Are you sure about using ls?  We have a directory here that has several
thousand files in it, and when doing an ls *.whatever-extension we always
get an "argument list too long".

Any idea what the actual file limit is for grep?

> 
> Then you are only grepping one file at a time instead of a list of too many.
> Of course what would be better is to use readdir() to list the files 
> and an open()/grep() combo to search the contents. But the same principle applies.
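A minimal sketch of that readdir()/open() approach, assuming the messages
live in a directory like /files/ and $string holds the search term (names
here are made up for illustration, not Barracuda's actual code):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Search every plain file in $dir for $string, one file at a time,
# so no shell argument list is ever built.
sub search_dir {
    my ($dir, $string) = @_;
    my @matched;
    opendir my $dh, $dir or die "Can't open $dir: $!";
    for my $file (readdir $dh) {
        next if $file =~ /^\.\.?$/;   # skip . and ..
        my $path = "$dir/$file";
        next unless -f $path;         # plain files only
        open my $fh, '<', $path or next;
        while (my $line = <$fh>) {
            if (index($line, $string) >= 0) {
                push @matched, $file;
                last;                 # one hit per file is enough
            }
        }
        close $fh;
    }
    closedir $dh;
    return @matched;
}
```

Since it never shells out, this sidesteps the kernel's argument-list
limit entirely, no matter how many files are in the directory.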
> 
> Just make sure the barracuda folks say thanks for fixing their problem :)

Yeah, we're hoping for a few months of service for free.....:)

This was also a personal quest to find the answer for myself.  So either
way I win.

Thanks for your help,
Kevin
-- 
Kevin Old <[EMAIL PROTECTED]>


-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
