On Tue, 16 Feb 1999, Ray Olszewski wrote:
# Yeah, why not? If I had to do this here, or if the list were very long (over
# 1000 entries, say), I'd write a perl program to read the file into a sorted
# array, discarding duplicates along the way, then write out a new, clean file.
#
# If the list were short and I were feeling lazy (or didn't know perl), I'd use
# "sort" to process the file and output a new copy, then use vi (or whatever
# text editor you prefer) to delete the dups by hand.
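For reference, a minimal sketch of the perl approach Ray describes (assuming
one entry per line; "inname" and "outname" are just placeholder filenames)
might look like:

#!/usr/bin/perl -w
use strict;

my %seen;
open(IN, "inname") or die "can't read inname: $!";
while (<IN>) {
    chomp;
    $seen{$_} = 1;           # hash keys are unique, so duplicates collapse here
}
close(IN);

open(OUT, ">outname") or die "can't write outname: $!";
print OUT "$_\n" for sort keys %seen;
close(OUT);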
But is there a reason why
cat inname | sort | uniq > outname
isn't appropriate?
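(For what it's worth, sort's -u flag would do the same job in one step --
"sort -u inname > outname" -- and the sort matters either way, because uniq
only drops *adjacent* duplicate lines.)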
G'day!
-- n i c h o l a s j l e o n
elegance through simplicity*http://mrnick.binary9.net*[EMAIL PROTECTED]
good fortune through truth*roaming:[EMAIL PROTECTED]*ICQ#2170994*U+($++)
TRA#6805*not all questions have answers*pseudogeek:P+++($++)L+($++)W=lm@b9
trust no-one with an iq under 150*understand yourself before trying others