php-general Digest 2 Jan 2012 09:21:55 -0000 Issue 7632
Topics (messages 316144 through 316153):
Re: count clicks to count most important news
316144 by: Tedd Sperling
316145 by: Ashley Sheridan
316146 by: Maciek Sokolewicz
316147 by: Maciek Sokolewicz
316148 by: Stuart Dallas
316149 by: muad shibani
PHP 5.3.2 max_execution_time
316150 by: Chris Tapp
316152 by: Duken Marga
316153 by: Chris Tapp
Re: Question about date calculations
316151 by: Matijn Woudt
Administrivia:
To subscribe to the digest, e-mail:
php-general-digest-subscr...@lists.php.net
To unsubscribe from the digest, e-mail:
php-general-digest-unsubscr...@lists.php.net
To post to the list, e-mail:
php-gene...@lists.php.net
----------------------------------------------------------------------
--- Begin Message ---
On Jan 1, 2012, at 11:26 AM, muad shibani wrote:
> I have a website that posts the most important news according to the number
> of clicks to that news
> the question is : what is the best way to prevent multiple clicks from the
> same visitor?
Not a fool-proof method, but use Javascript on the client side to stop users
from clicking repeatedly.
Then create a token and verify the click on the server-side before considering
the click as being acceptable.
Cheers,
tedd
_____________________
t...@sperling.com
http://sperling.com
--- End Message ---
--- Begin Message ---
On Sun, 2012-01-01 at 11:49 -0500, Tedd Sperling wrote:
> On Jan 1, 2012, at 11:26 AM, muad shibani wrote:
>
> > I have a website that posts the most important news according to the number
> > of clicks to that news
> > the question is : what is the best way to prevent multiple clicks from the
> > same visitor?
>
> Not a fool-proof method, but use Javascript on the client side to stop users
> from clicking repeatedly.
>
> Then create a token and verify the click on the server-side before
> considering the click as being acceptable.
>
> Cheers,
>
> tedd
>
>
> _____________________
> t...@sperling.com
> http://sperling.com
>
>
>
>
>
There are still problems with this. GET data (which is essentially all a
clicked link produces, if you leave Javascript out of the equation - and you
can't rely on Javascript) shouldn't be used to trigger a change on the server
(in your case, a counter increment).
I did something similar for a competition site a few years ago, and stupidly
didn't think about this at the time. Someone ended up gaming the system by
putting the click-through URL in the src attribute of an image on their
MySpace profile page, which had more than a few visitors. Each of those
visitors' browsers attempted to fetch that "image", which registered a click,
and because of the number of unique visitors, the clicks were registered as
genuine.
I'd recommend using POST data for this reason, as it's a lot more
difficult for people to game.
--
Thanks,
Ash
http://www.ashleysheridan.co.uk
--- End Message ---
--- Begin Message ---
On 01-01-2012 20:08, Ashley Sheridan wrote:
> On Sun, 2012-01-01 at 11:49 -0500, Tedd Sperling wrote:
>> On Jan 1, 2012, at 11:26 AM, muad shibani wrote:
>>> I have a website that posts the most important news according to the number
>>> of clicks to that news
>>> the question is : what is the best way to prevent multiple clicks from the
>>> same visitor?
>> Not a fool-proof method, but use Javascript on the client side to stop users
>> from clicking repeatedly.
>> Then create a token and verify the click on the server-side before considering
>> the click as being acceptable.
>> Cheers,
>> tedd
>> _____________________
>> t...@sperling.com
>> http://sperling.com
> There are still problems with this. GET data (which is essentially all a
> clicked link produces, if you leave Javascript out of the equation - and you
> can't rely on Javascript) shouldn't be used to trigger a change on the server
> (in your case, a counter increment).
> I did something similar for a competition site a few years ago, and stupidly
> didn't think about this at the time. Someone ended up gaming the system by
> putting the click-through URL in the src attribute of an image on their
> MySpace profile page, which had more than a few visitors. Each of those
> visitors' browsers attempted to fetch that "image", which registered a click,
> and because of the number of unique visitors, the clicks were registered as
> genuine.
> I'd recommend using POST data for this reason, as it's a lot more
> difficult for people to game.
I agree, POST data is indeed the way to go here. Personally, I would use
a "like"-style image which is actually a button; using some clever
Javascript (personally I would use jQuery for this) you can then POST
data to the server when it is clicked. Then set a cookie which disables
the button (and keeps it disabled on future visits). This should prevent
the average person from repeatedly clicking it. You could also log the
person's IP address and filter based on that as well; combining various
methods would be best in this case, I think.
To prevent the method which Ashley mentioned, using POST data isn't
enough. You would want to guarantee that the request came from YOUR server
instead of some other place. There are multiple ways to do this:
- use a unique key as an argument in the POST which can only be
"clicked" once. Register the key in a database before serving the page,
and then unregister it once it has been served and clicked. Though if a
person were to repeatedly open the page, your key store would fill up,
and the method would become useless.
- require the referrer address to come from your domain; also reasonably
easily circumvented in this case
- there are more, but it really depends on how much effort you want to
put into preventing attacks and how much effort you expect others to put
into attacking it. For example, large sites like YouTube are sure to use
extensive measures to prevent people from spam-clicking in any way,
while sites that only cater to, say, 3 visitors a month don't require all
that effort in the first place.
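[Editor's note: the first option above can be sketched with session storage instead of a database - a simplification, and the helper names here are illustrative, not from any library:]

```php
<?php
session_start();

// One-time click tokens, kept in the session rather than a database.
// issue_click_token() runs when the page is rendered; consume_click_token()
// runs in the POST handler, and each token works exactly once.

function issue_click_token()
{
    $token = md5(uniqid(mt_rand(), true));
    $_SESSION['click_tokens'][$token] = true;
    return $token; // emit as <input type="hidden" name="token" value="...">
}

function consume_click_token($token)
{
    if (!empty($_SESSION['click_tokens'][$token])) {
        unset($_SESSION['click_tokens'][$token]); // single use
        return true;
    }
    return false;
}
```

[A database-backed version would replace the $_SESSION array with an INSERT before serving the page and a DELETE when the token is used.]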
Hope that helps,
- Tul
--- End Message ---
--- Begin Message ---
On 1 Jan 2012, at 16:26, muad shibani wrote:
> I have a website that posts the most important news according to the number
> of clicks to that news
> the question is : what is the best way to prevent multiple clicks from the
> same visitor?
I'm assuming this is not a voting system, and the news items you're counting
are sourced from your own site and, with all due respect to Ash, unlikely to be
a target for false clicks. All you're really wanting to do is prevent the site
from registering multiple hits from the same user in a short period of time.
I would probably use memcached on the server-side to store short-term
information about clicks. When a news item is loaded...
1) Construct the memcache key: "newsclick_<article_id>_<ip_address>".
2) Fetch the key from memcache.
3a) If it does not exist, log the hit.
3b) If it does exist, compare time() with the value and only log the hit if
time() is greater.
4) Store the key with a value of time() + 300 and an expiry of the same value.
This will prevent hits being logged for the same news item from the same IP
address within 5 minutes of other hits.
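[Editor's note: a minimal sketch of those four steps, assuming the PECL Memcached extension and an already-connected client; log_hit() is a placeholder for however the click is actually recorded, not a real function:]

```php
<?php
// Steps 1-4 above as one function. Assumes $mc is a Memcached instance
// already connected via Memcached::addServer(); log_hit() is a placeholder
// for the real click-recording code (e.g. an SQL UPDATE).

function register_click($mc, $article_id, $ip_address)
{
    $key = 'newsclick_' . $article_id . '_' . $ip_address; // 1) build the key
    $blocked_until = $mc->get($key);                       // 2) fetch it

    if ($blocked_until === false || time() > $blocked_until) {
        log_hit($article_id);                              // 3a/3b) log the hit
    }

    $mc->set($key, time() + 300, time() + 300);            // 4) 5-minute block
}
```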
Other alternatives would be to use cookies (could get messy, and not very
reliable since it requires the response from click 1 to be processed before
click 2 gets started), or Javascript (as suggested by tedd, but without the
token - it would work pretty well and would be a lot easier to implement than
the above, but you sacrifice having full control over it).
If I'm interpreting the requirement correctly my solution is almost certainly
overkill, and a simple Javascript solution would be more than sufficient.
-Stuart
--
Stuart Dallas
3ft9 Ltd
http://3ft9.com/
--- End Message ---
--- Begin Message ---
All the answers are great, but Stuart Dallas' answer is what I was asking
about. Thank you all, I really appreciate it a lot.
On Sun, Jan 1, 2012 at 11:10 PM, Stuart Dallas <stu...@3ft9.com> wrote:
> On 1 Jan 2012, at 16:26, muad shibani wrote:
>
> > I have a website that posts the most important news according to the
> number
> > of clicks to that news
> > the question is : what is the best way to prevent multiple clicks from
> the
> > same visitor?
>
> I'm assuming this is not a voting system, and the news items you're
> counting are sourced from your own site and, with all due respect to Ash,
> unlikely to be a target for false clicks. All you're really wanting to do
> is prevent the site from registering multiple hits from the same user in a
> short period of time.
>
> I would probably use memcached on the server-side to store short-term
> information about clicks. When a news item is loaded...
>
> 1) Construct the memcache key: "newsclick_<article_id>_<ip_address>".
> 2) Fetch the key from memcache.
> 3a) If it does not exist, log the hit.
> 3b) If it does exist, compare time() with the value and only log the hit
> if time() is greater.
> 4) Store the key with a value of time() + 300 and an expiry of the same
> value.
>
> This will prevent hits being logged for the same news item from the same
> IP address within 5 minutes of other hits.
>
> Other alternatives would be to use cookies (could get messy, and not very
> reliable since it requires the response from click 1 to be processed before
> click 2 gets started), Javascript (as suggested by tedd but without the
> token - it would work pretty well and would be a lot easier to implement
> than the above, but you sacrifice having full control over it).
>
> If I'm interpreting the requirement correctly my solution is almost
> certainly overkill, and a simple Javascript solution would be more than
> sufficient.
>
> -Stuart
>
> --
> Stuart Dallas
> 3ft9 Ltd
> http://3ft9.com/
--
_______________
Alsjl .. all the news, from everywhere
www.alsjl.com
Alsjl's page on Facebook
http://www.facebook.com/alsjl
Muad Shibani
Aden, Yemen
Mobile: 00967 733045678
www.muadshibani.com
--- End Message ---
--- Begin Message ---
I've got a Dokuwiki installation that is producing Apache errors
saying that the maximum execution time of 30 seconds has been exceeded.
So, I went and changed max_execution_time in php.ini expecting that
this would solve the problem (which is due to large files taking a
while to upload over an ADSL connection). However, the error log still
reports that the old 30 second value is in use, even though Apache has
been restarted.
phpinfo() for the same site reports max_execution_time as 120 seconds,
so it seems as if the change to php.ini has been detected as expected.
Is there another setting that I need to consider? max_input_time is
already set to 60 seconds and there are no local 'php_value' Apache
configuration items fighting the ones in php.ini.
PHP version is 5.3.2 and is running under a CentOS 6.0 system.
Chris Tapp
opensou...@keylevel.com
www.keylevel.com
--- End Message ---
--- Begin Message ---
If you want to upload large files, maybe you should consider the maximum
upload size. You can change this setting in php.ini on the line that contains
upload_max_filesize.
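[Editor's note: for reference, these are the php.ini settings that interact when a large upload hits a time or size limit; the values below are examples only, not recommendations:]

```ini
max_execution_time  = 120   ; script run time; on most SAPIs this excludes receiving the upload
max_input_time      = 300   ; time allowed for receiving the request body
upload_max_filesize = 64M   ; largest single file in a multipart upload
post_max_size       = 72M   ; total POST body; must be >= upload_max_filesize
```

[Note that a file sent inside a raw POST body, e.g. an RPC payload rather than a multipart form upload, is bounded by post_max_size and max_input_time; upload_max_filesize only applies to multipart file uploads.]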
On Mon, Jan 2, 2012 at 5:13 AM, Chris Tapp <opensou...@keylevel.com> wrote:
> I've got a Dokuwiki installation that is producing Apache errors saying
> that the maximum execution time of 30 seconds has been exceeded.
>
> So, I went and changed max_execution_time in php.ini expecting that this
> would solve the problem (which is due to large files taking a while to
> upload over an ADSL connection). However, the error log still reports that
> the old 30 second value is in use, even though Apache has been restarted.
>
> phpinfo() for the same site reports max_execution_time as 120 seconds, so
> it seems as if the change to php.ini has been detected as expected. Is
> there another setting that I need to consider? max_input_time is already
> set to 60 seconds and there are no local 'php_value' Apache configuration
> items fighting the ones in php.ini.
>
> PHP version is 5.3.2 and is running under a CentOS 6.0 system.
>
> Chris Tapp
>
> opensou...@keylevel.com
> www.keylevel.com
>
>
>
>
> --
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php
>
>
--
Duken Marga
--- End Message ---
--- Begin Message ---
On 2 Jan 2012, at 02:15, Duken Marga wrote:
> If you want to upload large files, maybe you should consider the maximum
> upload size. You can change this setting in php.ini on the line that
> contains upload_max_filesize.
Thanks, but the filesize limits are already set well above the size of
the file. The file transfer takes place within the context of an RPC
callback, so I think this may be a TCP transfer rather than a file
upload.
This really does seem to be an execution time issue, as shown by the
Apache error log entry.
Chris
--- End Message ---
--- Begin Message ---
On Fri, Dec 30, 2011 at 5:33 PM, Eric Lommatsch <er...@pivotaldata.net> wrote:
>
> When I try this method:
>
> $interval = $dteStartDate[$intCnt]->diff($dteEndDate[$intCnt]);
>
> I get the following error when I run the page:
>
> " Fatal error : Call to undefined method DateTime::diff() in
> /var/www/evalHomeLime.php on line 254"
>
Just for the record: As noted on the manpage [1], your PHP version
needs to be >= 5.3.0.
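[Editor's note: for completeness, a minimal working example on 5.3+:]

```php
<?php
// DateTime::diff() returns a DateInterval; the method requires PHP >= 5.3.0.
$start = new DateTime('2012-01-01');
$end   = new DateTime('2012-01-31');

$interval = $start->diff($end);
echo $interval->days . "\n";              // 30
echo $interval->format('%a days') . "\n"; // 30 days
```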
Cheers,
Matijn
[1] http://www.php.net/manual/en/datetime.diff.php
--- End Message ---