What algorithm is recommended for validating cookie domains?  I
originally tried to implement a check compliant with RFC 2109, but that
proved not to work with a bunch of web sites.  So I implemented the
recommendation from the bogosity under

    http://www.netscape.com/newsref/std/cookie_spec.html

To quote from there:

    [...] Only hosts within the specified domain can set a cookie for
    a domain and domains must have at least two (2) or three (3)
    periods in them to prevent domains of the form: ".com", ".edu",
    and "va.us". Any domain that fails within one of the seven special
    top level domains listed below only require two periods. Any other
    domain requires at least three. The seven special top level
    domains are: "COM", "EDU", "NET", "ORG", "GOV", "MIL", and "INT".

This is amazingly stupid.  It means that `www.arsdigita.de' cannot set
the cookie for `arsdigita.de'.  To make *that* work, you'd have to
maintain a database of domains that use the ".co.xxx" convention, as
opposed to those that use just ".xxx".  That kind of thing is
obviously error-prone, given the rate at which top-level domains are
being added these days.
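
For concreteness, here is roughly what a literal implementation of
that dot-counting rule looks like.  This is only a sketch in C; the
function names (`netscape_domain_ok', `count_dots') and the
hard-coded TLD array are mine, not taken from any existing
implementation:

    #include <string.h>
    #include <strings.h>        /* strcasecmp (POSIX) */

    /* The seven "special" top-level domains from the Netscape text. */
    static const char *special_tlds[] =
      { ".com", ".edu", ".net", ".org", ".gov", ".mil", ".int" };

    static int
    count_dots (const char *s)
    {
      int n = 0;
      for (; *s; s++)
        if (*s == '.')
          n++;
      return n;
    }

    /* Return 1 if DOMAIN contains "enough" periods per the Netscape
       rule: two for the special TLDs, three for everything else.
       (The rule's other half -- that the setting host must lie
       within DOMAIN -- would be a separate tail-match check.)  */
    int
    netscape_domain_ok (const char *domain)
    {
      size_t len = strlen (domain);
      size_t i;
      int required = 3;

      for (i = 0; i < sizeof special_tlds / sizeof *special_tlds; i++)
        {
          size_t tl = strlen (special_tlds[i]);
          if (len >= tl
              && strcasecmp (domain + len - tl, special_tlds[i]) == 0)
            {
              required = 2;
              break;
            }
        }
      return count_dots (domain) >= required;
    }

With that check, `.cnn.com' passes (two periods, special TLD), while
`.arsdigita.de' is rejected, which is exactly the breakage described
above.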

A friend suggested allowing only one level of domain generality.  For
example, allow `sharenet.icn.siemens.de' to set the cookies for that
host, and for `icn.siemens.de', but not for `siemens.de' because that
would be two levels apart.  The problem with that is that it would be
hard to distinguish between `www.cnn.com' (obviously allowed to set
the cookie for `cnn.com') and `cnn.co.uk' (which should obviously
*not* be allowed to set the cookie for `co.uk') without employing a
list of domains such as mentioned above.
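
Made concrete, that heuristic would look roughly like this (again
just a sketch in C; `one_level_domain_ok' is a made-up name):

    #include <string.h>
    #include <strings.h>        /* strcasecmp (POSIX) */

    /* HOST may set a cookie for DOMAIN only if DOMAIN is HOST
       itself, or HOST with exactly one leading label stripped.  */
    int
    one_level_domain_ok (const char *host, const char *domain)
    {
      const char *rest;

      /* Ignore a leading dot in the Domain attribute, if any. */
      if (*domain == '.')
        domain++;

      /* A host may always set a cookie for itself. */
      if (strcasecmp (host, domain) == 0)
        return 1;

      /* Strip the first label of HOST and compare again.  This
         accepts www.cnn.com -> cnn.com, but it equally accepts
         cnn.co.uk -> co.uk, which is the problem.  */
      rest = strchr (host, '.');
      if (rest != NULL && strcasecmp (rest + 1, domain) == 0)
        return 1;

      return 0;
    }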

`Links' seems to be implementing the same bogus algorithm.  Lynx seems
to be doing something smarter, but it's unclear which (expired) spec
it's based on.

Any thoughts on this?
