There are some more examples of neat things you can do with sort and/or uniq
from a thread back in October:
https://groups.google.com/group/nlug-talk/browse_thread/thread/821561d29d690524/75b74ed66552670c?lnk=gst&q=command+of+the+day+uniq#75b74ed66552670c

Cheers,

Brandon

On Thu, Feb 12, 2009 at 4:50 AM, Drew <[email protected]> wrote:

> I haven't seen this for a while, so I'll throw this one out that I had to
> use yesterday:
>
> sort
>
> Let's say you've got a file that contains a list of servers, which were
> added to the file because of x event:
>
> lauas...@angel(~)$ more somefile
> web3
> web1
> web3
> web2
> web2
> web1
> web2
>
> So now you want to un-clutter the file and group each server's entries
> together. This is where sort comes in:
>
> lauas...@angel(~)$ sort somefile
> web1
> web1
> web2
> web2
> web2
> web3
> web3
>
> Now they're all together. Of course, maybe you don't need multiple entries
> for each server. Maybe you only need to know IF a certain event happened on
> a server. There's a command that works well with sort: uniq. uniq will show
> you only one copy of any repeated line (note that it only collapses
> adjacent duplicates, which is exactly why you sort first):
>
> lauas...@angel(~)$ sort somefile|uniq
> web1
> web2
> web3
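As an aside, sort's -u flag does the same dedupe in one step (it's part of POSIX sort, so it should be available anywhere uniq is):

```shell
# Recreate the example file, then dedupe it in a single command with sort -u.
printf 'web3\nweb1\nweb3\nweb2\nweb2\nweb1\nweb2\n' > somefile
sort -u somefile    # prints web1, web2, web3, one per line
```

One thing sort -u can't do is count repeats, so uniq -c still earns its keep.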
>
> Of course, you might also want to know how many times this event happened
> with each server, but be too lazy (or too blind) to count the lines in the
> file. So uniq has a nice option, -c, which prepends to each line the number
> of times it appeared in the file:
>
> lauas...@angel(~)$ sort somefile|uniq -c
>    2 web1
>    3 web2
>    2 web3
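If you only care about one particular server, grep -c counts matching lines without any sorting at all (the ^ and $ anchors keep web2 from also matching a hypothetical web22):

```shell
printf 'web3\nweb1\nweb3\nweb2\nweb2\nweb1\nweb2\n' > somefile
# Count only the lines that are exactly "web2".
grep -c '^web2$' somefile    # prints 3
```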
>
> Now, there's always the outside chance you want to know which servers the
> event happened on most. The answer is obvious - pipe it through sort again,
> numerically this time with -n (a plain sort compares the counts as text,
> which would break once they reached double digits):
>
> lauas...@angel(~)$ sort somefile|uniq -c|sort -n
>    2 web1
>    2 web3
>    3 web2
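And if all you really want is the single busiest server, reverse the numeric sort and take the first line (head is a safe bet on any Unix):

```shell
printf 'web3\nweb1\nweb3\nweb2\nweb2\nweb1\nweb2\n' > somefile
# Biggest count first, then keep just the winner: "3 web2" (plus padding).
sort somefile | uniq -c | sort -rn | head -n 1
```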
>
> So there you are. Obviously this becomes more and more handy as the number
> of lines in somefile grows (some number of thousands, I was dealing with!).
> I loaded up on the job and took care of what I was told was an impossible
> task, and handled it in record time, thanks to these two commands. Besides
> providing some valuable information, they helped me prove something it pays
> to remember every day:
>
> The people who have been designing and using Unix since its inception are
> smart/lazy enough to have wanted to do many of the things we occasionally
> run in to that seem like tedious tasks. They have designed the tools to make
> doing these things easy. Knowing these tools exist can make your life
> easier, and get you to the bar faster.
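In that same smart/lazy spirit, awk can do the whole tally in a single pass with no pre-sorting, which can matter when somefile runs to millions of lines (the trade-off is one in-memory counter per distinct line):

```shell
printf 'web3\nweb1\nweb3\nweb2\nweb2\nweb1\nweb2\n' > somefile
# Tally every distinct line, then print "count name", biggest count first.
awk '{count[$0]++} END {for (s in count) print count[s], s}' somefile | sort -rn
```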


-- 
Brandon D. Valentine
http://www.brandonvalentine.com

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"NLUG" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/nlug-talk?hl=en
-~----------~----~----~----~------~----~------~--~---
