>> Just out of curiosity, why are you trying to
>> edit a 1 GB file with any text editor?  I'm
>> assuming that these files are flat file
>> databases.  
> 
> I need to do that quite often. They are usually
> log files from a long running program in debug
> mode.


I find that pre-processing with grep/sed/awk first, in a
stream-like fashion, can trim the file down to a manageable size
before I bring the power of Vim to bear on the problem.

If, for example, I'm only interested in the ERROR lines and want
to skip the WARNING or DEBUG lines in the file, I might use

  bash> grep ERROR infile.log > smaller.log
  bash> vi smaller.log

or I might want to strip out the DEBUG level lines:

  bash> grep -v DEBUG infile.log > smaller.log
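The same trim can be done with sed's delete command, which is
equivalent to the grep -v above and also makes a single streaming
pass over the file (the sample log lines here are made up just to
show the effect):

```shell
# Build a tiny stand-in log; the real file would be the huge one.
printf '%s\n' \
  'DEBUG opening socket' \
  'ERROR connection refused' \
  'DEBUG retrying' > infile.log

# Delete every DEBUG line in one streaming pass.
sed '/DEBUG/d' infile.log > smaller.log
cat smaller.log
```

Since sed never loads the whole file, memory stays flat no matter
how big infile.log gets.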

I've had a couple of log files clock in at over 300 megs; nothing
so gargantuan as over a gig, but the principle is the same.
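When the interesting lines need more than a bare pattern match,
awk can select on a field instead. A sketch, assuming a
hypothetical "timestamp level message" layout (your log format
will differ):

```shell
# Stand-in log with an assumed "timestamp level message" layout.
printf '%s\n' \
  '00:00:01 DEBUG starting up' \
  '00:00:02 ERROR disk full' \
  '00:00:03 WARN low memory' \
  '00:00:04 ERROR write failed' > infile.log

# Keep only lines whose second field is exactly ERROR, so a word
# like ERRORS in the message body won't match by accident.
awk '$2 == "ERROR"' infile.log > smaller.log
cat smaller.log
```

The field test is stricter than grep ERROR, which would also keep
any line that merely mentions the word somewhere.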

-tim

