Hi all,

I am working on a PHP-based portal with more than 100,000 pages. We are
industry leaders, and because of this many people try to copy data from
our website.

We have seen instances where people run scripts or software against our
site to scrape all of our data, which is a major concern. Most of the time
the traffic originates from China or Korea, which makes the operators
difficult to track. These scrapers send 300+ HTTP requests per second,
which often causes the web server to hang or to stop serving requests
from genuine users.

Is anyone aware of an anti-scraping utility, or does anyone have ideas
that could help me write scripts to prevent this situation?
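One common first line of defense is per-IP rate limiting at the application layer: reject clients that exceed a request budget within a sliding time window. Below is a minimal sketch in Python of the idea (it ports directly to PHP, e.g. backed by APC or memcached). The limit of 100 requests per 10 seconds is an arbitrary assumption; tune it to your real traffic.

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Sliding-window limiter: allow at most `limit` requests
    per `window` seconds from each client IP."""

    def __init__(self, limit=100, window=10.0):
        self.limit = limit
        self.window = window
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        # Drop timestamps that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.limit:
            return False  # over budget: reject (e.g. respond HTTP 429)
        q.append(now)
        return True
```

In a real deployment the counters would live in shared storage (memcached/Redis) rather than in-process memory, and you would typically combine this with blocking at the web-server or firewall level so the scrapers never reach PHP at all.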

Would really appreciate your thoughts!

Thanks

Regards,
Pratik Thakkar
