I have existing code that takes commands over a pipe and returns large amounts of data. I disliked the design, which was not as compact as I would like, so I rewrote it to use .NET Remoting. The resulting data transfer is now 12 times slower than over the pipe.
I do use a binary TCP channel. I expose a singleton object that lets the remote client open a binary data file (an HD video) and request sequential blocks of data (frames) from the server. This works correctly, but very slowly compared to simply streaming the binary data over a pipe with a binary stream writer. Note that currently both client and server run on the same computer. The question is: is this to be expected? Should one generally avoid remoting for data-transfer-intensive tasks, or might I simply be doing something silly somewhere?
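To make the design concrete, the server-side object looks roughly like this (a simplified sketch, not my actual code; the class and method names are illustrative):

```csharp
using System;
using System.IO;

// Derives from MarshalByRefObject so it can be called over the
// binary TCP channel; registered server-side as a Singleton.
public class FrameServer : MarshalByRefObject
{
    private FileStream _file;

    // Client opens the video file by path.
    public void Open(string path)
    {
        _file = File.OpenRead(path);
    }

    // Client requests sequential blocks (frames). Every call crosses
    // the remoting boundary: the byte[] is serialized, sent over TCP,
    // and deserialized on the client.
    public byte[] ReadNextBlock(int blockSize)
    {
        var buffer = new byte[blockSize];
        int read = _file.Read(buffer, 0, blockSize);
        if (read < blockSize)
            Array.Resize(ref buffer, read);
        return buffer;
    }
}
```

So instead of one continuous stream as in the pipe version, the client makes one remote call per block and receives the data as a serialized return value.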
