I have not yet succeeded in turning this behaviour off (_setmode()
didn't seem to affect it). If we can't find a way to turn it off,
the only solution I can think of, short of abandoning its use on
Windows, is to translate LF on input to something unlikely like 0x1C
and then translate it back when we read it from the pipe.
Did you _setmode() the pipe, _setmode() stderr, or both? (And did
you try before or after calling dup2()?)
It looks like the Win CRT implementation of dup2() copies the "mode"
from the pipe that you've created into stderr.
Sorry, I was looking at the wrong chunk of code in syslogger.c.
Why are you calling _open_osfhandle() with O_TEXT? That looks
suspicious given the problem you are seeing with LF -> CR/LF
translation.
I have no idea why that's done - it goes back to the origins of the
syslogger - probably because someone mistakenly thought all Windows text
files have to have CRLF line endings.
I tried changing that to _O_BINARY, and calling _setmode on the pipe
before it's duped into stderr, on stderr after the dup, and on both.
Nothing seemed to work.
But that's not the only problem. I am now getting log file corruption
even when I work around the text mode problem by not sending a '\n' at
all, which makes me think even small packets aren't safely written
atomically to Windows pipes. I wonder if we'd do better using the (so
far unused) pipe simulation using sockets that we have. Not sure if we
can dup a socket into stderr on Windows, but it might be worth trying,
or even working around that problem. If we could safely use that,
the code would get a whole lot simpler - presumably we would no longer
need those extra syslogger threads on Windows.