I second Michael's suggestion of IIRF, http://www.codeplex.com/IIRF - it's
pretty good.

Now, back to the original question, how to make a dynamic robots.txt file.

Why not generate it? Throw together a script that generates the content of
your file, then write it to the web root of each of your web sites.
Ba-da-bing: no server installs for rewrite plugins, no strange 404 error
catching, and no reconfiguration of JRun to get it to parse *.txt files. Put
it on a schedule to run every day or every hour, or just run it when you want.
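To make that concrete, here's a rough sketch of the generator idea in Python (you'd write it in CF, of course). The site list, paths, and disallow rules are all made-up placeholders; the one real constraint it demonstrates is that the Sitemap line needs an absolute URL while everything else can stay root relative:

```python
# Hypothetical sketch of the "just generate it" approach. In practice the
# domain-to-webroot mapping would come from the app's database of sites.
import os

SITES = {
    "example-dealer.com": "/var/www/example-dealer.com",
    "another-dealer.com": "/var/www/another-dealer.com",
}

def robots_txt(domain):
    # Disallow rules can be root-relative, but the robots.txt spec
    # requires the Sitemap reference to be an absolute URL, so it
    # has to be built per domain.
    return (
        "User-agent: *\n"
        "Disallow: /admin/\n"
        f"Sitemap: http://{domain}/sitemap.xml\n"
    )

def write_all(sites):
    # Write a static robots.txt into each site's web root; run this
    # from a scheduled task as often as the content changes.
    for domain, webroot in sites.items():
        with open(os.path.join(webroot, "robots.txt"), "w") as f:
            f.write(robots_txt(domain))
```

Hook `write_all(SITES)` up to a scheduled task and the web server keeps serving a plain static file, with no rewrite filter in the request path.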


Now that that's solved (...right...), what's this about an app that runs 2000
web sites? Please tell me there aren't 2000 separate entries in your web
server for the different domains, and that you instead, intelligently, map
them all to the same web front end with certain things switched in and out
based on the domain... please?
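In case it helps, the "one front end, many domains" idea boils down to something like this sketch: a single codebase looks up per-domain settings from the request's host name instead of deploying one site entry per domain. The config table and field names here are purely hypothetical:

```python
# Hypothetical per-domain settings table; in a real multi-tenant app
# this would live in a database keyed on the incoming Host header.
SITE_CONFIG = {
    "example-dealer.com": {"skin": "blue", "title": "Example Dealer"},
    "another-dealer.com": {"skin": "red", "title": "Another Dealer"},
}

DEFAULT = {"skin": "plain", "title": "Dealer Site"}

def settings_for(host):
    # Normalize case and strip an optional "www." prefix, then fall
    # back to a default so an unknown Host header still renders.
    host = host.lower().removeprefix("www.")
    return SITE_CONFIG.get(host, DEFAULT)
```

Every request hits the same code; only the lookup result changes, so adding site number 2001 is a data change, not a deployment.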

I once worked at a shop where we had 10 nearly identical sites. Just
managing those was painful enough. When the owner wanted to add 5 more, I
stopped him, wrote a generic, customizable version, and then let him add as
many as he wanted. Less management for me, quicker response time for him.


nathan strutz
[Blog and Family @ http://www.dopefly.com/]
[AZCFUG Manager @ http://www.azcfug.org/]
[Twitter @nathanstrutz]


On Wed, May 27, 2009 at 11:17 AM, Michael Dinowitz <
mdino...@houseoffusion.com> wrote:

>
> I've gone the route of rewriting IIS's 404 handler in the past and I
> strongly advise against it. Mapping .txt to CF would work, but that means
> every .txt file will be handled, not just robots.txt. What I do for my
> clients is use the Ionic ISAPI rewrite filter to watch for any call to
> robots.txt and then redirect the request (on the back end) to ColdFusion.
> To the visitor, it looks like an actual robots.txt. I do the same for
> dynamic sitemaps as well as a number of other things.
>
> http://www.codeplex.com/IIRF
>
> Yes, it's free, but that should not be a negative.
>
> On Wed, May 27, 2009 at 9:41 AM, Andy Matthews <amatth...@dealerskins.com
> >wrote:
>
> >
> > We have an application which delivers about 2000 websites. This is really
> > nice for lots of things, but not so nice for items such as robots.txt,
> > where we might want unique values per site. The robots.txt spec requires
> > that sitemap references be absolute URLs (while everything else can be
> > root relative). With a hard-coded file we obviously can't do this.
> >
> > A coworker had the idea to "delete" our robots.txt file and use our 404
> > handler to toss that request to ColdFusion, and write in the absolute
> path
> > to our sitemap each time the file is requested.
> >
> > 1) Can this be done reliably?
> > 2) Is anyone out there doing this already and could answer these
> questions?
> >
> > Andy Matthews
> > Senior Web Developer
> >
> > www.dealerskins.com
> >
> > Please consider the environment before printing this e-mail.
> >
> > Total customer satisfaction is my number 1 priority! If you are not
> > completely satisfied with
> > the service I have provided, please let me know right away so I can
> correct
> > the problem,
> > or notify my manager Aaron West at aw...@dealerskins.com.
> >
> >
> >
> >
> >
>
> 

Archive: 
http://www.houseoffusion.com/groups/cf-talk/message.cfm/messageid:322842