David,

that only works for the well-behaved ones. Check who is reading the robots.txt file and put them automatically on your "Is_a_bot" list. Save the IP address _and_ the user agent (browser client). For my own system I built up a list of possible robots this way and just send them a "bad-robot page" from the OWA.
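
Roughly like this, sketched in Python rather than Active4D (bad_robots, handle_request and serve_normal_page are made-up names for illustration, not anything from Active4D itself):

# Anyone who asks for robots.txt goes straight onto the bot list,
# recording both the IP address and the user agent.
bad_robots = {}   # IP address -> user agent of suspected robots

def handle_request(ip, user_agent, path):
    if path == "/robots.txt":
        bad_robots[ip] = user_agent
        return "User-agent: *\nDisallow: /"
    # Known robots get the bad-robot page instead of the real content.
    if ip in bad_robots:
        return "<html><body>Nothing here for robots.</body></html>"
    return serve_normal_page(path)

def serve_normal_page(path):
    return "<html><body>Real content for " + path + "</body></html>"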

On 11.04.2009 at 18:15, David Ringsmuth wrote:

Is there a better way to solve this problem besides using the HTML meta
tags and robots.txt?

<META NAME="ROBOTS" CONTENT="NOINDEX">

<META NAME="ROBOTS" CONTENT="NOFOLLOW">

Kind regards
4D-Consulting.com eK, Wiesbaden
Peter Schumacher
--------------------------------------------------------
Web: http://www.4D-Consulting.com/
FreeCall:  0800-434 636 7
Tel.:      0611-9406.850 - Fax: 0611-9406.744
iChat/Skype: PeterInWiesbaden
4D-Consulting.com eK - Scharnhorststr. 36 - 65195 Wiesbaden
HR Wiesbaden: HRA 4867
Member of the developer network http://www.die4dwerkstatt.de
