Thanks for the reply Massimo.

But my requirement is that bots and Facebook scrapers should be able
to access the page without authentication. I've seen a few newspaper
and online tech journal sites with this functionality, where Google
has the content indexed but I have to be a registered user to view
the content. I'm looking for the same functionality on my site.
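For what it's worth, a minimal sketch of that kind of check might look like the following. The crawler signatures below (Googlebot, Bingbot, Facebook's facebookexternalhit, Yahoo Slurp) and the web2py wiring in the comment are my assumptions, not something confirmed in this thread:

```python
# Hedged sketch: detect common crawlers by user-agent substring.
# The signature list is an assumption; extend it for other services.
BOT_SIGNATURES = ("googlebot", "bingbot", "facebookexternalhit", "slurp")

def is_known_bot(user_agent):
    """Return True if the user-agent string matches a known crawler."""
    if not user_agent:
        return False
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

# In a web2py controller, this could gate the login requirement,
# e.g. (untested wiring, names assumed from web2py's auth API):
#
# def article():
#     if not is_known_bot(request.env.http_user_agent):
#         auth.requires_login()(lambda: None)()  # or redirect to login
#     return dict(body=db.article(request.args(0)))
```

Note that user-agent strings are trivially spoofed, so anyone sending a Googlebot user agent would bypass the login wall; sites that care can additionally verify crawler IPs (e.g. via reverse DNS for Googlebot).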

On Sep 2, 6:32 pm, mdipierro <[email protected]> wrote:
> In your model you can do something like this:
>
> if "Mozilla" not in request.env.http_user_agent: raise HTTP(500)
>
> On Sep 2, 4:19 am, Narendran <[email protected]> wrote:
>
> > Hello all,
> > I have a few controller methods (and corres. web pages) that are
> > authorized to be viewed only by registered users. But I don't want to
> > authenticate search engine bots, Facebook scrapers, and other such
> > services. Is there a recommended way of doing this? (like setting auth
> > requirements based on user agents, etc.)
