Robert Casto wrote:

> That depends, of course, on what you are trying to do.
>
> Joshua wants to measure average system performance while things are
> humming along.
>
> If you want to know how long it takes to start up, then you keep the
> data. I tend to separate the two in reports I give to companies. Very
> different work is done to speed one or the other up.

Exactly. If you're profiling a server, the performance at startup is completely useless. For a client (desktop or applet) it depends, but in most cases I think it is still not relevant.
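To make the "throw away startup measurements" idea concrete, here is a minimal sketch (not the actual test suite discussed in this thread, and kept to JDK 5-era syntax): run a task repeatedly, discard the first few warm-up iterations so class loading and JIT compilation don't skew the numbers, then average the rest.

```java
// Minimal micro-benchmark sketch: discard warm-up runs, average the rest.
// Class and method names are illustrative, not from the poster's suite.
public class SimpleBenchmark {

    public static double averageMillis(Runnable task, int warmupRuns, int measuredRuns) {
        // Warm-up phase: timings are thrown away, like the startup
        // measurements discussed above.
        for (int i = 0; i < warmupRuns; i++) {
            task.run();
        }
        // Measured phase: accumulate wall-clock time and average it.
        long totalNanos = 0;
        for (int i = 0; i < measuredRuns; i++) {
            long start = System.nanoTime();
            task.run();
            totalNanos += System.nanoTime() - start;
        }
        return totalNanos / (measuredRuns * 1000000.0);
    }

    public static void main(String[] args) {
        Runnable work = new Runnable() {
            public void run() {
                long sum = 0;
                for (int i = 0; i < 1000000; i++) {
                    sum += i;
                }
            }
        };
        System.out.println("average ms: " + averageMillis(work, 5, 20));
    }
}
```

Averaging many measured runs, as suggested earlier in the thread, reduces noise, but only after the warm-up runs have been dropped.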
Anyway, this is not my main problem - I always throw away the boot measurements. The idea of averaging a large number of results could indeed at least alleviate my measurement problem, but it would make things more complex in other respects. Say a typical test run takes 1h (not parallelized); I have to run it for two JDKs (5 and 6 - when 7 is near I'll drop 5) and at least three operating systems. That means 6h, not parallelized. Running the suite 10 times would take 60h :-(( Even running 8 tests in parallel per CPU, it would be 7.5 hours - and in that period I couldn't run anything else on the CI server.

-- 
Fabrizio Giudici - Java Architect, Project Manager
Tidalwave s.a.s. - "We make Java work. Everywhere."
weblogs.java.net/blog/fabriziogiudici - www.tidalwave.it/blog
[email protected] - mobile: +39 348.150.6941

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups "The Java Posse" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [email protected]
For more options, visit this group at http://groups.google.com/group/javaposse?hl=en
-~----------~----~----~----~------~----~------~--~---
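The run-time arithmetic in the message above can be checked with a quick calculation; the numbers (1h per run, 2 JDKs, 3 operating systems, 10 repetitions, 8-way parallelism) are taken from the post, and the class name is purely illustrative.

```java
// Illustrative sketch: total CI time for a test matrix, serial vs. parallel.
public class SuiteCost {

    public static double totalHours(double hoursPerRun, int jdks, int oses,
                                    int repetitions, int parallelism) {
        // Full matrix, run serially.
        double serial = hoursPerRun * jdks * oses * repetitions;
        // Ideal speed-up with the given parallelism (ignores scheduling overhead).
        return serial / parallelism;
    }

    public static void main(String[] args) {
        // 1h per run, 2 JDKs, 3 OSes, 10 repetitions, serial: 60h
        System.out.println(totalHours(1.0, 2, 3, 10, 1));
        // Same matrix with 8 tests in parallel: 7.5h
        System.out.println(totalHours(1.0, 2, 3, 10, 8));
    }
}
```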
