I used to use pwebstats to generate reports on the sites my users had
browsed. It can be a very detailed tool, with usage graphs, and its output
is nicely HTML'ised.

http://martin.gleeson.com/pwebstats/

It requires Perl and a couple of other little things.

Oh (looks at address) say hello to Breeanna for me.

--Steve


-----Original Message-----
From: MacFarlane, Jarrod [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, September 06, 2000 9:43 AM
To: '[EMAIL PROTECTED]'
Subject: [SLUG] Removing duplicate entries from a file


Hey sluggers,

I need to look at a particular machine's web hits. I am currently using:

cat /usr/local/squid/logs/access.log | grep 1.2.3.4 | cut -f4 -d"/" >
logfile.txt

This outputs something like:
www.reallynaughtysite.com
www.smackmeimbad.com
and so on....

The problem is that it has many double-ups... is there a long, confusing
string of commands that will go through my logfile and remove all but one
instance of each domain listed?

Thanks,
Jarrod.


--
SLUG - Sydney Linux User Group Mailing List - http://slug.org.au/
More Info: http://slug.org.au/lists/listinfo/slug

