I am currently trying to optimize the in-process part of my Actor Model 
framework for Go, GAM.
In order to compare it to Erlang and Akka, I am using the "Skynet 
Benchmark" https://github.com/atemerev/skynet#results-i7-4771-ubuntu-1510

I have ported this test to my lib: 
https://github.com/AsynkronIT/gam/blob/dev/examples/spawnbenchmark/main.go
It works, and the performance is roughly on par with the Scala Akka 
implementation.

I have noticed that if I turn off the GC (GOGC=off), it runs extremely 
fast, roughly 3 times faster.
Is this because my test is generating a lot of garbage, or is it more due 
to the Go GC having a big overhead just by running?
That is, would I still see a perf difference even in a system that does 
not allocate anything?
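For reference, this is roughly how I am comparing the two modes from a 
single binary instead of setting GOGC=off in the environment: 
debug.SetGCPercent(-1) has the same effect, and ReadMemStats before/after 
shows how much the run actually allocates, which should hint at whether 
garbage volume or GC background overhead is the culprit. runBenchmark here 
is just a placeholder for the spawn benchmark linked above.

package main

import (
	"fmt"
	"runtime"
	"runtime/debug"
	"time"
)

// runBenchmark is a placeholder for the skynet spawn workload.
func runBenchmark() {
	time.Sleep(100 * time.Millisecond)
}

// measure runs the benchmark once and reports elapsed time,
// bytes allocated and the number of GC cycles during the run.
func measure(label string) {
	var before, after runtime.MemStats
	runtime.ReadMemStats(&before)

	start := time.Now()
	runBenchmark()
	elapsed := time.Since(start)

	runtime.ReadMemStats(&after)
	fmt.Printf("%s: %v, allocated %d MB, %d GC cycles\n",
		label,
		elapsed,
		(after.TotalAlloc-before.TotalAlloc)/(1<<20),
		after.NumGC-before.NumGC)
}

func main() {
	measure("GC on (default GOGC=100)")

	debug.SetGCPercent(-1) // equivalent to GOGC=off
	measure("GC off")
}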

This test is of course complete nonsense from a real-world perspective, 
but still, it's an interesting challenge trying to push the numbers down.

When profiling, nothing specific pops out at me. 
Is anyone here up for lending a hand?
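
In case it matters, this is roughly the pprof setup I am using to collect 
the CPU profile (standard runtime/pprof, nothing GAM-specific; the 
benchmark call and file name are placeholders), and I then look at the 
result with go tool pprof.

package main

import (
	"log"
	"os"
	"runtime/pprof"
	"time"
)

// runSpawnBenchmark is a placeholder for the benchmark in
// examples/spawnbenchmark.
func runSpawnBenchmark() {
	time.Sleep(100 * time.Millisecond)
}

func main() {
	// Write the CPU profile to a file for later inspection.
	f, err := os.Create("spawn.prof")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	if err := pprof.StartCPUProfile(f); err != nil {
		log.Fatal(err)
	}
	defer pprof.StopCPUProfile()

	runSpawnBenchmark()
}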

