> 
> Every time I submit my site to search engines, I start seeing a lot of
> 404s in my logs for a file called 'robot.txt'.  It's my understanding that
> search engines look for that file for a list of pages to include or exclude
> from their indexing.  Does anyone know where I can find information on
> the formatting, creation, and placement of such a file?  Thanks.

   It tells robots where not to tread.  Note that the file must be named
   'robots.txt' (with the 's') and placed in your site's root directory,
   per the Robots Exclusion Protocol.  Here is a minimal version of one.

#  Web Robot / Search Engine Control Protocol
User-agent: *
# Don't look here!
Disallow: /abc
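If you want to check how a robot would interpret such a file, Python's standard library ships a parser for the protocol.  Here is a small sketch that parses the rules above locally (the path '/abc' is just the example placeholder from the file, not a real directory):

```python
from urllib import robotparser

# The same rules as the example robots.txt above, held in a string so
# the sketch runs without any network access.
rules = """\
User-agent: *
Disallow: /abc
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# can_fetch(useragent, url) reports whether that agent may fetch the URL.
print(rp.can_fetch("*", "/abc/secret.html"))  # False: under a Disallowed path
print(rp.can_fetch("*", "/index.html"))       # True: not matched by any rule
```

In a live setting you would point the parser at the site instead, with `rp.set_url("http://example.com/robots.txt")` followed by `rp.read()`.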


