Date: Tue, 28 Apr 2015 14:55:46 +0300
   From: David Gray <dg...@iesl.forth.gr>

   I'm reading some data over a raw TCP socket, and the
   server program sends me 0d, but what I read is 0a.
   I've used both read-string! and read-char and get
   the same result. Is there some character-encoding default
   that I need to override, or some binary mode?

By default, MIT Scheme's TCP sockets map {CR, CRLF, LF} -> LF on
input, and LF -> CRLF on output.  (Yes, that's rather silly.  It
happens to do the right thing for text-oriented protocols like SMTP
and HTTP.)  You can disable it with:

(port/set-line-ending socket 'NEWLINE)
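
For example (a sketch only; the host and port are placeholders, and
open-tcp-stream-socket is assumed as the client-side connect call):

(define socket (open-tcp-stream-socket "example.com" 7000)) ; hypothetical host/port
(port/set-line-ending socket 'NEWLINE)  ; turn off the CR/CRLF/LF mapping
(read-char socket)                      ; a 0d byte sent by the server should now read back as CR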

_______________________________________________
MIT-Scheme-devel mailing list
MIT-Scheme-devel@gnu.org
https://lists.gnu.org/mailman/listinfo/mit-scheme-devel