On 12 May 2008, at 1:52 am, Brandon S. Allbery KF8NH wrote:
> My real point was that in the C programming culture it was/is far too
> common to use an in-band value; that is, one that could be confused
> with or treated as a valid response: null pointers, stdio's EOF (= -1).

Here I must disagree. I've hacked character-level I/O in lots of programming languages (the last time I counted I'd used more than 100), and C was the first language I ever met that made it easy, precisely BECAUSE the perfectly normal "there are no more characters" situation
was handled the same way as every other outcome.

> This just causes problems because code is almost encouraged to ignore
> the special cases.  For example, the ctype macros have to support
> being passed EOF.

So they do, but it is elementary to do so. The only reason there is anything even remotely unusual there is that the *same* functions are used in C for *character* input and *byte* input. I'll grant you that you probably don't want to process binary input using quite the same quasi-FSA code that you want for characters. Since C uses NUL for terminating strings, and since ASCII made it clear that NUL was never ever *supposed* to appear in text, NUL would have been the perfect choice for character EOF, and in that case there would never have been anything odd about having the ctype macros handle it.

I've been writing some Smalltalk recently, which uses a Pascal-like convention
        aStream atEnd                   "test for EOF"
          ifTrue: [self handleEOF]
          ifFalse: [self handleCharacter: aStream next]
and the EOF tests clutter up the code inordinately and make it so painful that C
starts looking good again.
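Transcribed into Haskell's standard I/O the shape is exactly the same; here's a minimal sketch (Prelude plus System.IO only) that echoes stdin to stdout:

    import System.IO (isEOF)

    echo :: IO ()
    echo = do
      eof <- isEOF                -- "aStream atEnd"
      if eof
        then return ()            -- "handleEOF"
        else do c <- getChar      -- "aStream next"
                putChar c         -- "handleCharacter:"
                echo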

The C approach here has several benefits:
 - you can *postpone* checking for EOF until after you have checked for
   other things; since EOF is seldom or never what the code is mainly
   *about*, this is good for clarity
 - if you want to know "is the next character one of these?" you have
   only two cases to deal with at that point (yes and no), not three
   (yes, no, and
   you-idiot-you-forgot-to-test-for-EOF-first-and-testing-for-EOF-is-the-most-important-thing-anybody-could-be-interested-in-when-reading);
   see the sketch after this list
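To make the first point concrete, a small sketch; the classify function and its Maybe Char argument are made up here for illustration, not any standard API:

    import Data.Char (isDigit, isSpace)

    classify :: Maybe Char -> String
    classify (Just c)
      | isDigit c = "digit"
      | isSpace c = "space"
      | otherwise = "other"
    classify Nothing = "end of input"  -- the EOF case comes last, once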

> Maybe types force you to deal with it, while simultaneously providing
> convenience functions to help you deal with it.

I readily grant that Maybe is a wonderful wonderful thing and I use it freely and
voluntarily.  BUT it should not dominate the code.
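To be fair, the convenience functions really are there already, so in expression position Maybe need not dominate at all.  A one-line reminder:

    -- maybe :: b -> (a -> b) -> Maybe a -> b comes from the Prelude;
    -- Data.Maybe adds fromMaybe, catMaybes, mapMaybe, and friends.
    describe :: Maybe Char -> String
    describe = maybe "end of input" (\c -> "read " ++ show c)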

Consider Haskell's getChar and hGetChar. They DON'T return Maybe Char;
they raise an exception at end of file. You have to keep testing
isEOF/hIsEOF before each character read, as if we had learned nothing
since Pascal. Arguably, maybeGetChar :: IO (Maybe Char) and
hMaybeGetChar :: Handle -> IO (Maybe Char) would be a good idea; at
least then one could easily set up some combinators to deal with this
nuisance.
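A sketch of how they might be written today; maybeGetChar and
hMaybeGetChar are the names proposed above, and whileJust is invented
here for illustration, none of them part of any existing library:

    import System.IO (Handle, stdin, hGetChar, hIsEOF)

    hMaybeGetChar :: Handle -> IO (Maybe Char)
    hMaybeGetChar h = do
      eof <- hIsEOF h
      if eof
        then return Nothing
        else do c <- hGetChar h
                return (Just c)

    maybeGetChar :: IO (Maybe Char)
    maybeGetChar = hMaybeGetChar stdin

    -- One combinator is enough to bury the testing for good:
    whileJust :: IO (Maybe a) -> (a -> IO ()) -> IO ()
    whileJust get act = do
      mx <- get
      case mx of
        Nothing -> return ()
        Just x  -> act x >> whileJust get act

    -- and the echo loop above collapses to:
    --   main :: IO ()
    --   main = whileJust maybeGetChar putChar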



--
brandon s. allbery [solaris,freebsd,perl,pugs,haskell] [EMAIL PROTECTED]
system administrator [openafs,heimdal,too many hats] [EMAIL PROTECTED]
electrical and computer engineering, carnegie mellon university KF8NH


_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


--
"I don't want to discuss evidence." -- Richard Dawkins, in an
interview with Rupert Sheldrake.  (Fortean times 232, p55.)






_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe
