Hi Stas,

On Thu, Sep 22, 2016 at 7:47 AM, Stanislav Malyshev <smalys...@gmail.com> wrote:
>> On Wed, Sep 21, 2016 at 11:26 AM, Stanislav Malyshev
>> <smalys...@gmail.com> wrote:
>>>> I think we are better to limit max collisions.
>>>> I'm +1 for Nikita's proposal does this.
>>> Max collision per what? How much would be the limit?
>> Collision by keys.
> Not sure I understand. What would be counted - number of collision per
> key? Per hashtable? Per process? Per request?

IIRC, the proposed patch counted collisions per key lookup.

>> It would be nice to have configurable limit like regex stack/backtrack limit.
>> That said, wouldn't 1000 enough for almost all apps?
> Certainly not. Not even nearly enough. Collisions are pretty frequent
> with short strings, for example, and for a big long-running application
> 1000 hash collisions is nothing. I think you severely underestimate how
> frequent hash collisions are, with simple function like we're using,
> over millions and millions of hash accesses that we're doing routinely.
> I did a quick check, and just running run-tests.php -h (without any
> tests!) produces about 5K collisions. Running composer (without doing
> anything) - 8K collisions. Running composer update on a simple project -
> 400K (!) collisions. Now these are pretty simple cases compared to what
> a complex modern PHP application does. So I think you are
> underestimating it by about 4-5 orders of magnitude.

I agree that we cannot know what collision limit is right for every
app. The same is true of the memory limit, stack limit, backtrack
limit, and recursion limit.

It is possible to set a reasonable limit that is good enough for
almost all apps. AFAIK, we have no bug reports complaining about slow
hash operations in normal code yet. IMO, that is evidence that we can
set a collision limit safely and prevent intentional hash collisions.
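To illustrate the idea (this is a hypothetical sketch, not the actual patch or PHP's zend_hash implementation): a chained hash table can count how many occupied slots it walks past during a single insert, and bail out once a configurable limit is exceeded, the way a regex engine bails out at its backtrack limit. Normal workloads stay far below the limit; a crafted set of colliding keys trips it quickly.

```python
class CollisionLimitExceeded(Exception):
    """Raised when one insert walks past too many colliding keys."""
    pass

class LimitedHashTable:
    # Hypothetical sketch: names and limits are illustrative only.
    def __init__(self, nbuckets=8, collision_limit=1000):
        self.buckets = [[] for _ in range(nbuckets)]
        # Configurable, analogous to pcre.backtrack_limit in php.ini.
        self.collision_limit = collision_limit

    def _bucket(self, key):
        return self.buckets[hash(key) % len(self.buckets)]

    def insert(self, key, value):
        bucket = self._bucket(key)
        collisions = 0  # counted per key insert, not globally
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # existing key: overwrite
                return
            collisions += 1  # another key already hashed here
            if collisions > self.collision_limit:
                # A request with deliberately colliding keys trips this
                # long before any normal workload would.
                raise CollisionLimitExceeded(key)
        bucket.append((key, value))
```

With a tiny table and limit you can see it trigger: force every key into one bucket, and the fourth distinct key exceeds a limit of 2 while ordinary-sized inserts never notice the check.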


Yasuo Ohgaki

PHP Internals - PHP Runtime Development Mailing List