Hi,
I noticed something strange about the execution time of a simple filter
program of mine, which I ran both natively on Mac OS X and in a Linux VM via
Parallels on one and the same iMac.
Here are typical values I measured using time:
Mac OS X host system with MacPorts
real 0m0.010s
user 0m0.003s
sys 0m0.003s
OpenSUSE 11.2 VM on Parallels
real 0m0.008s
user 0m0.001s
sys 0m0.007s
First of all, one sees that the "real" time spent is about 20% shorter on the
VIRTUAL Linux under my Parallels 5.0.
On the other hand, I see that "user"+"sys" is usually about equal to "real" on
Linux, whereas "user"+"sys" < "real" on Mac OS X itself, which makes me believe
that "user"+"sys" is the more meaningful figure than "real". :-)
Assuming "user"+"sys" is the time actually consumed by the system to run my
little filter tool, that would prove the Mac OS X host system to be slightly
faster than the virtual Linux guest system, which is what I would have expected
in the first place.
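
For what it's worth, here is a minimal sketch (just an illustration with a
dummy busy loop, not my actual filter tool) that prints the wall-clock time of
a run next to the CPU time charged to the process, i.e. the same quantities
time(1) reports as "real" and "user"+"sys":

/* wallclock-vs-cpu.c -- minimal sketch, not the actual filter tool:
 * prints wall-clock time next to the CPU time charged to the process,
 * i.e. the same quantities time(1) reports as "real" and "user"+"sys". */
#include <stdio.h>
#include <sys/time.h>
#include <sys/resource.h>

static double tv_to_sec(struct timeval tv)
{
    return tv.tv_sec + tv.tv_usec / 1e6;
}

int main(void)
{
    struct timeval start, end;
    struct rusage usage;
    volatile unsigned long sink = 0;

    gettimeofday(&start, NULL);             /* wall clock, like "real" */

    for (unsigned long i = 0; i < 10000000UL; i++)
        sink += i;                          /* stand-in for the filter's work */

    gettimeofday(&end, NULL);
    getrusage(RUSAGE_SELF, &usage);         /* CPU time of this process */

    double real = tv_to_sec(end) - tv_to_sec(start);
    double user = tv_to_sec(usage.ru_utime);
    double sys  = tv_to_sec(usage.ru_stime);

    printf("real %.6f  user %.6f  sys %.6f\n", real, user, sys);
    return 0;
}

Compiled with plain gcc on both systems, it should show the same pattern:
"real" also includes any time the process spends waiting or not scheduled,
while "user"+"sys" only counts the CPU time it was actually charged.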
Nevertheless, I am a little taken aback, confused, and dazed by these
unexpected numbers.
Any ideas as to what causes this?
Anyway, there is more to come… but I would like to hear your opinion on this
topic first, before I dive into other results which are even more shocking. ;-)
Greets,
Marko