% sh randomscript.sh > file
cat: duplicate writer
% cat randomscript.sh
#!/bin/sh
cat ~/.headerfile anotherscript.sh ...
%
I don't see a "duplicate writer". If the point is that I don't see one, but there is one somewhere down in the guts, well, I agree that Unix shell scripts are not much fun to debug. That's why I never write them, and instead use languages that emit stack traces on error.
% sh longrunningprogram.sh > /tmp/stuff &
% rm /tmp/stuff
longrunningprogram.sh: output deleted -- core dumped
%

Oh, that's definitely better!
No, that's the Unix Way. I meant:

% sh longrunningprogram.sh > /tmp/stuff &
% rm /tmp/stuff
/tmp/stuff: file is used by longrunningprogram.sh [pid 134]
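For what it's worth, you can approximate that proposed behavior on a stock Linux box by scanning /proc for open descriptors before unlinking. A minimal sketch, assuming Linux procfs; busy_rm and find_opener are hypothetical names, not a real rm option, and the check is inherently racy (a file can be opened between the scan and the unlink), which hints at why Unix doesn't do this:

#include <dirent.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/stat.h>
#include <unistd.h>

/* Return the pid of some process holding `path` open, or 0 if none found. */
static long find_opener(const char *path) {
    struct stat target;
    if (stat(path, &target) != 0) return 0;
    DIR *proc = opendir("/proc");
    if (!proc) return 0;
    struct dirent *de;
    while ((de = readdir(proc)) != NULL) {
        char *end;
        long pid = strtol(de->d_name, &end, 10);
        if (*end != '\0' || pid <= 0) continue;    /* not a pid directory */
        char fddir[64];
        snprintf(fddir, sizeof fddir, "/proc/%ld/fd", pid);
        DIR *fds = opendir(fddir);
        if (!fds) continue;                        /* no permission, etc. */
        struct dirent *fe;
        while ((fe = readdir(fds)) != NULL) {
            char fdpath[128];
            struct stat st;
            snprintf(fdpath, sizeof fdpath, "%s/%s", fddir, fe->d_name);
            /* stat() follows the /proc symlink to the open file itself */
            if (stat(fdpath, &st) == 0 &&
                st.st_dev == target.st_dev && st.st_ino == target.st_ino) {
                closedir(fds);
                closedir(proc);
                return pid;
            }
        }
        closedir(fds);
    }
    closedir(proc);
    return 0;
}

int main(int argc, char **argv) {
    if (argc != 2) { fprintf(stderr, "usage: busy_rm file\n"); return 2; }
    long pid = find_opener(argv[1]);
    if (pid != 0) {
        /* refuse, in the spirit of the transcript above */
        fprintf(stderr, "%s: file is used by pid %ld\n", argv[1], pid);
        return 1;
    }
    return unlink(argv[1]) == 0 ? 0 : 1;
}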
Programs aren't paid to think.
Programmers are.
NFS is hateful enough that it can't be ignored, but also hateful enough that it shouldn't be brought up as a reason NOT to do something.
What I was saying is that the "two-step" deletion (unlink the name now, reclaim the storage when the last open descriptor goes away) is only marginally useful, mainly for creating temporary files. So the mechanism involved shouldn't get in the way when I do other things.
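For the record, here is the temporary-file idiom that two-step deletion exists to support, as a minimal C sketch (the /tmp/scratch path is just an illustration): the name disappears at unlink(), the data survives until close().

#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void) {
    /* Create a scratch file; the path is illustrative. */
    int fd = open("/tmp/scratch", O_RDWR | O_CREAT | O_EXCL, 0600);
    if (fd < 0) { perror("open"); return 1; }

    /* Step 1: unlink. The name vanishes at once, so no other
     * process can open it, but our descriptor stays valid. */
    unlink("/tmp/scratch");

    write(fd, "data", 4);
    lseek(fd, 0, SEEK_SET);
    char buf[5] = {0};
    read(fd, buf, 4);
    printf("%s\n", buf);   /* prints "data" */

    /* Step 2: reclaim. The storage is freed only here, at the
     * last close (or when the process exits). */
    close(fd);
    return 0;
}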
