Hello all,

A friend and I are trying to benchmark some network server code which
runs fine on my OS X laptop (~7500 req/s) but which is >10x slower on a
much faster 8-core Xeon Linux box.

The profiling run on Linux looks like this:

                                          individual    inherited
    COST CENTRE    MODULE  no.  entries  %time %alloc   %time %alloc

    MAIN           MAIN      1         0  95.7    0.2   100.0  100.0
    ...

How should we interpret this result? MAIN doesn't seem to correspond to
any user code, so we're wondering where the time is actually going.
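For context, we built and ran roughly as follows (these are our best guesses
at the standard GHC profiling invocation; the exact module name and flags on
our box may differ):

    # Build with profiling and automatic cost-centre annotations.
    # Without -auto-all (or -fprof-auto on newer GHCs), only MAIN and
    # top-level CAFs get cost centres, so nearly all time is attributed
    # to MAIN rather than to user code.
    ghc --make -O2 -prof -auto-all Server.hs -o server

    # Run with time/allocation profiling; this writes server.prof,
    # the report quoted above.
    ./server +RTS -p -RTS

If the binary really was compiled with automatic cost centres, then time
stuck under MAIN presumably means something outside annotated Haskell code:
foreign calls, the I/O manager, or the RTS itself.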

G.
-- 
Gregory Collins <g...@gregorycollins.net>
_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe