Hi,

Can you tell us what the exact problem is? When you say people copy data from
your website, do you mean the web pages themselves or the underlying database?
Secondly, what do you mean by scraping the data?

For 300+ HTTP requests per second, you need to set a rule that limits the
number of connections per IP.
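As a rough sketch of that idea at the application level (in Python for illustration; the same logic ports to PHP, and all names here like `RateLimiter` are made up for this example), a sliding-window counter per IP that rejects requests over a threshold might look like:

```python
import time
from collections import defaultdict, deque


class RateLimiter:
    """Allow at most `max_requests` per `window` seconds from each IP."""

    def __init__(self, max_requests=60, window=1.0):
        self.max_requests = max_requests
        self.window = window
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        # Drop timestamps that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_requests:
            return False  # over the limit: reject (e.g. respond with HTTP 429)
        q.append(now)
        return True


limiter = RateLimiter(max_requests=5, window=1.0)
# Eight requests from the same IP within one second: first 5 pass, rest blocked.
results = [limiter.allow("1.2.3.4", now=t * 0.1) for t in range(8)]
```

In production you would back the counters with shared storage (memcached, Redis, or a firewall rule such as iptables connection limiting) rather than in-process memory, so the limit holds across all web server processes.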

regards,
Viket

On Fri, Oct 3, 2008 at 1:34 PM, Pratik Thakkar <[EMAIL PROTECTED]> wrote:

>   Hi all,
>
> I am working on a PHP based portal with more than 100,000 pages. We are
> industry leaders and because of this a lot of people often try to copy data
> from our website.
>
> We have faced instances where people run various scripts/software on our
> site to scrape all of our data, which is a major concern. Most of the time
> these requests come from China or Korea, and it is difficult to track them
> down. These people send 300+ HTTP requests every second, which often causes
> the web server to hang or to stop accepting requests from genuine users.
>
> Is anyone aware of any anti-scraping utility, or any ideas that could help
> me write some scripts to avoid such a situation?
>
> I'd really appreciate your thoughts!
>
> Thanks
>
> Regards,
> Pratik Thakkar
>
>

