Just a bit of Friday technical trivia that might serve as a warning to others ...

I analysed all of the 2013 data so far from my IIS web logs and found the
following interesting facts:

robots.txt was read 11270 times
My total sent bytes came to 10.2GB, of which 5.3GB went to search engines and bots
143374 requests came from bots (46% of the total)
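
If anyone wants to run the same numbers on their own logs, a quick script
along these lines will do it. This assumes W3C-format IIS logs with the
cs(User-Agent) and sc-bytes fields enabled; the log path and the bot
substrings below are placeholders, not my actual list:

import glob

BOT_HINTS = ("bot", "spider", "crawl", "slurp", "ezooms")  # example substrings only

total_bytes = bot_bytes = 0
total_reqs = bot_reqs = robots_hits = 0
fields = []

for path in glob.glob(r"C:\inetpub\logs\LogFiles\W3SVC1\u_ex13*.log"):
    with open(path, errors="replace") as f:
        for line in f:
            if line.startswith("#Fields:"):
                fields = line.split()[1:]        # field layout for this log file
                ua_i = fields.index("cs(User-Agent)")
                uri_i = fields.index("cs-uri-stem")
                bytes_i = fields.index("sc-bytes")
                continue
            if line.startswith("#") or not fields:
                continue
            cols = line.split()
            if len(cols) != len(fields):
                continue                         # skip malformed rows
            sent = int(cols[bytes_i]) if cols[bytes_i].isdigit() else 0
            is_bot = any(h in cols[ua_i].lower() for h in BOT_HINTS)
            total_reqs += 1
            total_bytes += sent
            if is_bot:
                bot_reqs += 1
                bot_bytes += sent
            if cols[uri_i].lower() == "/robots.txt":
                robots_hits += 1

if total_reqs and total_bytes:
    print(f"robots.txt reads: {robots_hits}")
    print(f"bot requests:     {bot_reqs} of {total_reqs} ({100 * bot_reqs / total_reqs:.0f}%)")
    print(f"bot bytes:        {bot_bytes} of {total_bytes} ({100 * bot_bytes / total_bytes:.0f}%)")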

So 52% of all the data served from my web site was food for bots. I think
this is an extraordinary volume of data and I have declared war on them. I
found a sample robots.txt file on the web which looks quite comprehensive,
then I added other bots I found in my logs. I installed the "IP Address and
Domain Restrictions" role on IIS 7.5 so I can totally block the worst
offenders (I'm looking at you, Ezooms and GoogleImages!).
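
For anyone doing the same, the robots.txt entries for the worst offenders
look roughly like this (Ezooms is the token that bot announces in my logs,
and Googlebot-Image is Google's documented name for its image crawler;
whether a bot actually honours the file is another matter, which is why
the IP blocking is there as a backstop):

User-agent: Ezooms
Disallow: /

User-agent: Googlebot-Image
Disallow: /

And once the "IP Address and Domain Restrictions" role is in place, the
deny rules end up in web.config as something like this (192.0.2.0/24 is
just a documentation example range, not a real offender):

<configuration>
  <system.webServer>
    <security>
      <!-- allowUnlisted="true" lets normal visitors through; only listed addresses are blocked -->
      <ipSecurity allowUnlisted="true">
        <!-- example range only; substitute the bot's real addresses -->
        <add ipAddress="192.0.2.0" subnetMask="255.255.255.0" allowed="false" />
      </ipSecurity>
    </security>
  </system.webServer>
</configuration>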

It will be interesting to see how my web request stats change over the
coming months. I think this "search engine" zoo is now well out of control
in the wild.

So I've got charities and market researchers pestering me on the phone all
day while my web server is pestered with a flood of bots.

Greg
