On Sat, Sep 29, 2012 at 5:37 PM, website reader
<[email protected]> wrote:
> To all:
>
> This is a question about efficiently searching for text items in a
> large file, more than 1 gigabyte in size.
>
> I have a list of about 2,000 to 5,000 items, where an item is a couple
> of text words such as "Side 2050", with the S always in the first
> column.  I have to search a large file, around 24 gigabytes in size,
> for these items.  The file is a simple text file, delimited by line
> feeds.
>
> Using a typical shell script of grep commands, such as:
>
> grep "Side 2050"
> grep "Side 2061"
> etc.
>
> results in very slow execution, since the 24 gig file has to be
> rescanned for each line of the script.  I am finding this very
> laborious and time consuming, not to mention hard on the disk as the
> script grinds through each line.
>
> I am aware that the greps can be combined into one pattern set, but
> then I run into command-line length limitations.  Is that the best
> way to go?
>
> What is the quickest way to do this type of search when large files
> (> 1 gig) are involved?
>
> Thanks for your tips, or suggestions.
>
> - Randall
> _______________________________________________
> PLUG mailing list
> [email protected]
> http://lists.pdxlinux.org/mailman/listinfo/plug

Hi,

Can you just grep once for lines starting with S and save those to a
smaller file?

Then grep that smaller file for the rest of each item?
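A minimal sketch of that two-pass idea, assuming the items live one per
line in a file called patterns.txt and the big file is bigfile.txt
(both names are placeholders):

```shell
# Pass 1: a single scan of the 24 GB file, keeping only the lines
# that start with S.  This is the only pass that touches the big file.
grep '^S' bigfile.txt > s-lines.txt

# Pass 2: search the much smaller file for every item in one go.
# -F treats the patterns as fixed strings (no regex), and -f reads
# them from a file, which sidesteps the command-line length limit.
grep -F -f patterns.txt s-lines.txt > hits.txt
```

One caveat: -F cannot anchor a match to the start of the line, so a
fixed string could in principle match mid-line.  If that matters, drop
-F and prefix each pattern with ^ (for example via
`sed 's/^/^/' patterns.txt`), which assumes the items contain no regex
metacharacters.  And note that `grep -F -f patterns.txt bigfile.txt`
on its own already does the whole job in a single pass over the big
file.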

HTH

Marvin
