Just to let you all know that there is a Stack Overflow question on this 
issue, in case you want to air your views there:

http://stackoverflow.com/questions/6463945/whats-the-most-efficient-node-js-inter-process-communication-library-method
 

On Thursday, 28 April 2011 14:47:04 UTC+2, Aikar wrote:
>
> Justin/OP: 
> You can try my Wormhole lib. 
> https://github.com/aikar/wormhole 
>
> It's designed so that you simply pass it the ends of a stream, and it just 
> works. 
>
> As noted in the benchmark above, I've got it up to 150k messages processed 
> per second on a single 3.4 GHz core. 
> And that was with just one core feeding it, so sending would have been the 
> bottleneck; if you had multiple processes sending it data, I'm sure it 
> would go even faster. 
>
> On Apr 28, 8:32 am, Chris <[email protected]> wrote: 
> > How does a straight `eval(json)` compare to `JSON.parse(json)` for 
> > unpacking? 
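
For reference, a minimal sketch of the difference (payload and property names here are made up for illustration): `eval` will execute any code embedded in the string, while `JSON.parse` only accepts strict JSON, which is why parse is the safer choice regardless of speed.

```javascript
// Compare eval() and JSON.parse() on the same payload (illustrative only).
const payload = JSON.stringify({ id: 1, items: [1, 2, 3] });

// eval needs parentheses so the braces parse as an object literal, and it
// will run any code embedded in the string -- unsafe on untrusted input.
const viaEval = eval('(' + payload + ')');

// JSON.parse accepts only strict JSON, so it cannot execute code.
const viaParse = JSON.parse(payload);

console.log(viaEval.id === viaParse.id); // true
```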
> > 
> > On Apr 28, 2:54 pm, billywhizz <[email protected]> wrote: 
> > 
> > > interestingly, node v0.4.6 built with v8 3.3.2 which should have 
> > > crankshaft enabled on 64 bit is slightly quicker for stringify and 
> > > quite a bit slower for parse: 
> > 
> > > node 0.4.6/v8 3.3.2 
> > > json    pack:   1318 ms 
> > > json    unpack: 1039 ms 
> > 
> > > node 0.4.6/v8 3.1.8.10 
> > > json    pack:   1487 ms 
> > > json    unpack: 851 ms 
> > 
> > > On Apr 28, 5:02 am, Chris <[email protected]> wrote: 
> > 
> > > > I use a modified version of kriszyp's multi-node that enables 2-way 
> > > > communication 
> > 
> > > > Source is here: 
> > > > https://github.com/chriso/node.io/blob/master/lib/node.io/multi_node.js 
> > 
> > > > You set it up like this: 
> > > > https://github.com/chriso/node.io/blob/master/lib/node.io/processor.j... 
> > 
> > > > Then just use `master.send(msg)` or `workers[i].send(msg)` 
> > 
> > > > It's been more than fast enough. 
> > 
> > > > On Apr 28, 1:43 pm, billywhizz <[email protected]> wrote: 
> > 
> > > > > interesting. i am seeing the same speedup. the benchmark on msgpack 
> > > > > github is way out of date. i tested here using an old 0.1.96 version 
> > > > > of node versus latest 0.4.7 on 64 bit with v8 v3.1.8.10 and saw the 
> > > > > following results for peter's bench.js script (i just ran the JSON 
> > > > > tests not the msgpack ones): 
> > 
> > > > > v0.1.96: 
> > > > > json    pack:   6787 ms 
> > > > > json    unpack: 11888 ms 
> > 
> > > > > v0.4.7: 
> > > > > json    pack:   1487 ms 
> > > > > json    unpack: 851 ms 
> > 
> > > > > That's over 4.5x faster for JSON.stringify and almost 14x faster for 
> > > > > JSON.parse. This is very nice indeed! 
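
The numbers above come from peter's bench.js, which isn't reproduced in this thread; the shape of such a timing loop is roughly this (the iteration count and sample object are placeholders, not the actual script):

```javascript
// Rough shape of a JSON pack/unpack timing loop (not the actual bench.js).
const sample = { id: 42, tags: ['a', 'b'], nested: { ok: true } };
const iterations = 100000;

let t = Date.now();
let packed;
for (let i = 0; i < iterations; i++) packed = JSON.stringify(sample);
console.log('json pack:  ', Date.now() - t, 'ms');

t = Date.now();
let unpacked;
for (let i = 0; i < iterations; i++) unpacked = JSON.parse(packed);
console.log('json unpack:', Date.now() - t, 'ms');
```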
> > 
> > > > > On Apr 28, 3:42 am, Aikar <[email protected]> wrote: 
> > 
> > > > > > Just throwing in my recent findings: 
> > > > > > V8 has apparently made a massive performance improvement to JSON. 
> > 
> > > > > > msgpack is now a lot slower than JSON. 
> > 
> > > > > > see: https://github.com/aikar/wormhole/issues/3 
> > 
> > > > > > I had a good streaming msgpack parser implemented doing 50k 
> > > > > > messages/sec, and switching to json tripled the speed. 
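
Streaming JSON over a socket needs some framing on the wire; newline-delimited JSON is the simplest scheme. This is a generic sketch of that idea (not wormhole's actual format):

```javascript
// Newline-delimited JSON framing: one message per line on the wire.
function encode(msg) {
  return JSON.stringify(msg) + '\n';
}

// Returns a function that accepts arbitrary chunk boundaries and invokes
// onMessage once per complete line, buffering any trailing partial line.
function makeDecoder(onMessage) {
  let buffer = '';
  return function (chunk) {
    buffer += chunk;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // keep the trailing partial line for next time
    for (const line of lines) onMessage(JSON.parse(line));
  };
}

// Usage: split the stream at an awkward point; messages still come out whole.
const received = [];
const feed = makeDecoder((m) => received.push(m));
const wire = encode({ a: 1 }) + encode({ b: 2 });
feed(wire.slice(0, 5));
feed(wire.slice(5));
console.log(received.length); // 2
```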
> > 
> > > > > > On Apr 27, 9:30 pm, billywhizz <[email protected]> wrote: 
> > 
> > > > > > > Thanks Matt. had a google but there doesn't seem to be much 
> > > > > > > definitive info out there. i'd wager for large message sizes a 
> > > > > > > unix socket should be quite a bit faster but if sending lots of 
> > > > > > > small messages, you probably won't see a whole lot of difference. 
> > 
> > > > > > > Justin, the best way to answer a question like this is to do the 
> > > > > > > testing yourself based on the system/environment you are 
> > > > > > > optimizing for. There's no way to give a definitive general 
> > > > > > > answer to your question. 
> > 
> > > > > > > anyway, i'll see what kind of benchmark i can throw together - 
> > > > > > > i'm more interested in the cost of serialization differences 
> > > > > > > myself. 
> > 
> > > > > > > On Apr 28, 1:44 am, Matt <[email protected]> wrote: 
> > 
> > > > > > > > On Wed, Apr 27, 2011 at 8:19 PM, billywhizz 
> > > > > > > > <[email protected]> wrote: 
> > > > > > > > > does TCP over loopback on linux really not hit the network 
> > > > > > > > > stack? doesn't it do packet framing, checksums and protocol 
> > > > > > > > > handshakes etc? i can't find any reference to this, so if 
> > > > > > > > > you have one, please let me know. 
> > 
> > > > > > > > AFAIK it's optimised to not go through the network driver 
> > > > > > > > layer (so it still goes through parts of the stack, but is 
> > > > > > > > optimised so that it doesn't have to talk to hardware). But 
> > > > > > > > this is just a memory from when I used to read the Kernel 
> > > > > > > > change logs years and years ago. I can't find any reference 
> > > > > > > > to it online now. 
> > 
> > > > > > > > Matt.

-- 
Job Board: http://jobs.nodejs.org/
Posting guidelines: 
https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en
