Hi
What you are explaining now and what you explained before are two completely
different stories.
The simple, main cause of your issue is inefficiency and poor design.
Why don't you simply count the number of users registering from a particular IP
and stop accepting registrations at a threshold for a certain amount of time?
A simple
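The per-IP counting suggested above could be sketched roughly like this; the class name `RateLimiter`, the threshold, and the window length are illustrative assumptions, not anything specified in the thread:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/**
 * Minimal sketch: counts registration attempts per IP inside a fixed
 * time window and rejects further attempts once a threshold is hit.
 * Names and numbers are hypothetical; a real deployment would also
 * need to evict stale entries.
 */
public class RateLimiter {

    private static final class Window {
        long start;  // window start time (ms since epoch)
        int count;   // attempts seen in the current window
    }

    private final int threshold;       // max attempts allowed per window
    private final long windowMillis;   // window length in milliseconds
    private final Map<String, Window> byIp = new ConcurrentHashMap<>();

    public RateLimiter(int threshold, long windowMillis) {
        this.threshold = threshold;
        this.windowMillis = windowMillis;
    }

    /** Returns true if this attempt is allowed, false if the IP is over the limit. */
    public boolean allow(String ip) {
        long now = System.currentTimeMillis();
        Window w = byIp.computeIfAbsent(ip, k -> new Window());
        synchronized (w) {
            if (now - w.start > windowMillis) {  // window expired: start a new one
                w.start = now;
                w.count = 0;
            }
            return ++w.count <= threshold;
        }
    }
}
```

A `ConcurrentHashMap` keyed by IP keeps the per-request cost to one map lookup plus a short lock on that IP's own counter, which addresses the performance worry raised later in the thread about doing this at the Java level.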
On 5/20/15 4:22 AM, javalishixml wrote:
More detailed information below.
Pseudo-code steps:
This isn't pseudo-code. This is a re-statement of your problem.
1. A registration page named http://mywebsite.com/register1.jsp is
set up, and this
On 5/19/2015 1:03 AM, javalishixml wrote:
Thanks a lot for the information.
This solution works at the Tomcat level. If I always handle this issue at the
Java level, I'm afraid there will be a performance problem, because this website
serves very high-concurrency traffic.
Taking into consideration its
On 5/19/2015 7:53 AM, javalishixml wrote:
I doubt you're going to be able to do this in httpd, unless you have a very
simple, straightforward way of identifying the robots.
Yes. I just want a way to block the duplicated requests at the httpd level.
After all, my website has to face
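For blocking repeated requests at the httpd level, one common option (an assumption on my part, not something settled in the thread) is the third-party mod_evasive module, which blocks an IP that requests the same page too often in a short interval. A minimal sketch, with illustrative numbers that would need tuning:

```apache
# Hypothetical mod_evasive configuration (third-party module, must be installed).
<IfModule mod_evasive20.c>
    DOSHashTableSize    3097
    DOSPageCount        5     # max requests for the same URI...
    DOSPageInterval     1     # ...within this many seconds
    DOSBlockingPeriod   60    # block the offending IP for 60 seconds
</IfModule>
```

This only helps against the simple case described here (many requests from one IP); a distributed attack would still get through.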
To whom it may concern,
On 5/19/15 8:09 AM, javalishixml wrote:
I just understood you. I really appreciate your feedback.
How do we judge whether it's a robot? Item 1: we find the request IP is
always the same one. Item 2: our page may contain
On 5/19/2015 8:09 AM, javalishixml wrote:
I just understood you. I really appreciate your feedback.
How do we judge whether it's a robot?
Item 1: we find the request IP is always the same one.
Item 2: our page may contain several keep-alive connections, but the attack
connection only focuses on
How would you tell that a request is from a robot?
On 5/18/2015 11:44 AM, javalishixml wrote:
Hi,
I have a website. It is built with Apache + Tomcat.
We are now running a lottery promotion on this website, but we find that some
robots keep sending duplicated requests to hit this lottery page.