Hmmm ... no need to make it too complicated, I suspect. A variant of the 
following command line (if you are on a Unix system, anyway) will probably 
work:

sort < inputfile.txt | uniq -c | grep -v "   1 "

to find any lines that are present _more_ than once.

(There are three spaces before, and one space character after, the digit 1 
above ... to ignore single occurrences.)

On my PC, I wrote my own uniq and grep years ago to emulate the Unix commands 
in a DOS window, and "sort" has always been available at a DOS command prompt 
... even in today's Windows 7 environment.

Z

-----Original Message-----
From: framers-bounces at lists.frameusers.com 
[mailto:[email protected]] On Behalf Of Craig Ede
Sent: Thursday, May 17, 2012 11:06 AM
To: framers at lists.frameusers.com
Subject: RE: utility to check that all heading names are unique?

One can pretty easily write a little script (perl, python, etc.) that ticks 
through a text dump of the heading list and flags dupes for you. If you 
haven't tried to do this sort of thing, it is worth the effort and will spur 
you on to inventing more such tools. That beats the heck out of visually 
scanning for dupes.

Craig
