Then I don't understand: an R7RS-conforming implementation already
has to signal an error when it comes across an unreadable datum,
which is what "#<...>" is in standard Scheme.
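Assuming an implementation that signals here, as this expects, you
can observe the behaviour directly:

  ;; Hedged check: "#<" has no standard read syntax, so a reader
  ;; that signals on it will land in the guard clause.
  (import (scheme base) (scheme read) (scheme write))

  (guard (exn (#t (display "read signaled an error") (newline)))
    (read (open-input-string "#<procedure car>")))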
Yes. But reserving a standard notation #<...> ensures:
- A convention that schemers will learn to recognize. (Many already do.)
- That #< isn't accidentally read in as some other type of object in
implementations that extend the standard syntax. (This assumption is
currently violated by Chicken, Gambit, and Kawa in some cases. For
Chicken and Gambit, at least, there is a viable workaround.)
- That the reader can produce a friendlier error message. (This aspect
of Scheme has been traditionally disregarded.)
- Finally, that the _writer_ can do something sensible and consistent.
(A sketch follows this list.)
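For instance, a hypothetical helper sketching that last point (the
name write-unreadable is mine, not from the SRFI):

  ;; Sketch: emit the reserved #<...> notation for an object that
  ;; has no read syntax.
  (import (scheme base) (scheme write))

  (define (write-unreadable tag name port)
    (display "#<" port)
    (display tag port)
    (display #\space port)
    (display name port)
    (display ">" port))

  (write-unreadable 'procedure 'car (current-output-port))
  ;; prints: #<procedure car>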
I am talking about "<" inside #<...>. From the Chez REPL:
> <
#<procedure <>
(The first ">" is the prompt, of course.)
Ah, good catch. That is indeed nonsensical.
The fix is #<procedure < >, which CL's read-delimited-list would handle.
A bigger problem is #<procedure > >, which would confuse it: the reader
stops at the first ">" and ends the form too early.
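To make the confusion concrete, here is a toy reader in that style
(a sketch only, not CL's actual read-delimited-list): it collects
whitespace-separated tokens until it sees a lone ">".

  (import (scheme base) (scheme char))

  (define (skip-whitespace port)
    (let ((c (peek-char port)))
      (when (and (char? c) (char-whitespace? c))
        (read-char port)
        (skip-whitespace port))))

  (define (read-token port)
    (skip-whitespace port)
    (if (eof-object? (peek-char port))
        (eof-object)
        (let loop ((chars '()))
          (let ((c (peek-char port)))
            (if (or (eof-object? c) (char-whitespace? c))
                (list->string (reverse chars))
                (begin (read-char port)
                       (loop (cons c chars))))))))

  (define (read-hash-angle port)      ; call after "#<" is consumed
    (let loop ((toks '()))
      (let ((tok (read-token port)))
        (cond ((eof-object? tok) (error "unterminated #<...>"))
              ((string=? tok ">") (reverse toks))
              (else (loop (cons tok toks)))))))

  (read-hash-angle (open-input-string "procedure < >"))
  ;; => ("procedure" "<")    ; the space saves it
  (read-hash-angle (open-input-string "procedure > >"))
  ;; => ("procedure")        ; stops at the first ">", wrongly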
The best fix is #<procedure |<|> and #<procedure |>|>. That works in
standard R7RS, at least.
But some Scheme implementations read mixed|parts-like|this as one
identifier; there, |>|> would lex as a single token, so you'd need
#<procedure |>| > with a space.
This is my fault. The SRFI needs to add some rule that identifiers have
to be disambiguated from >.
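For what it's worth, standard R7RS lexical syntax does read the
pipe-quoted forms as intended:

  (import (scheme base) (scheme read))

  (read (open-input-string "|<|"))   ; => the symbol <
  (read (open-input-string "|>|"))   ; => the symbol >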
So
(foo . #<unreadable>)
would be (foo .) and thus an error, while
(foo #<unreadable>) would be (foo) and thus not?
This looks strange to me.
No: the outermost datum with an inner #<...> anywhere inside it would
be discarded wholesale.
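A sketch of that behaviour, assuming the reader hands back a sentinel
for each #<...> it encounters (unreadable-object? is a hypothetical
predicate, not part of the SRFI):

  (import (scheme base) (scheme read))

  (define (contains-unreadable? x)
    (cond ((unreadable-object? x) #t)
          ((pair? x) (or (contains-unreadable? (car x))
                         (contains-unreadable? (cdr x))))
          ((vector? x)
           (let loop ((i 0))
             (and (< i (vector-length x))
                  (or (contains-unreadable? (vector-ref x i))
                      (loop (+ i 1))))))
          (else #f)))

  (define (read-skipping-unreadable port)
    (let ((x (read port)))
      (if (and (not (eof-object? x)) (contains-unreadable? x))
          (read-skipping-unreadable port)  ; drop the whole datum
          x)))

So (foo . #<unreadable>) never yields (foo .); the entire list is
discarded and reading continues with the next datum.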
You haven't answered the point about the XXX#YYY syntax, have you?
I don't understand it.
Any non-trivial syntax has many ways to write nonsense input. Decades of
Lisp history clearly demonstrate that writing _deliberately_ unreadable
syntax in a structured way is useful. It's useful for the same reason
that abbreviations are useful in natural language.