On Wed, 1 Mar 2000, Larry W. Virden wrote:

> What concerns me is whether it is possible for lynx to encounter a legit
> situation in need of long values - perhaps a heavily nested web site or
> a cgi with a huge variable requirement. 
> 
> If we start truncating or dropping information because it is long, then
> we may very well have to come back to the situation to resolve it later
> anyways.  If we are going to be touching the code anyways, would it not
> be in best interests to fix things as best as they can be fixed?

Lynx doesn't impose any length restrictions on URLs in general.
AFAIK we are only talking about some components of URLs which are
indeed limited (host - can't be longer than 255 or so characters, and
that won't change as long as the Internet uses DNS), some HTTP headers
which I don't think will ever legitimately need more than 1.5 kbytes
(like Accept-Language or Accept-Charset), and possibly some local
filenames.  (All this modulo unknown bugs, of course.  But the
unlimited-length handling of URLs-as-a-whole is pervasive throughout
the code, so the problem you think of shouldn't exist.)

You can test for yourself how lynx handles extremely long URLs:
Just make a <FORM METHOD=GET ACTION=...> with a huge TEXTAREA containing
pre-filled text, and submit it.  See the INFO ('=') screen after that.
See how huge you can make the form contents.
If you see problems, they will probably be from the server (or a proxy)
not from Lynx.
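
For example, a minimal test page might look something like this (just
a sketch - the ACTION URL and field name are placeholders, point it at
whatever server or CGI you have handy):

  <HTML><BODY>
  <FORM METHOD=GET ACTION="http://localhost/cgi-bin/echo">
  <TEXTAREA NAME=blob ROWS=10 COLS=70>
  ...paste as much pre-filled text here as you like...
  </TEXTAREA>
  <INPUT TYPE=SUBMIT>
  </FORM>
  </BODY></HTML>

With METHOD=GET the whole TEXTAREA contents end up URL-encoded in the
query string, so this gives you an arbitrarily long URL to inspect on
the INFO screen.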

    Klaus
