There are two things going on here. Tim is talking about "out of the box"
performance achievable by reasonably knowledgeable individuals. And that's
an interesting measure in and of itself.
The other is a measure of absolute best performance potential. In other
words "What's the absolute fastest this race car can go on this race track
in these weather conditions?" which is a very different measure. You'd want
the chassis designer, engine designer, factory tire guys, best mechanics,
and best drivers on hand for such an enterprise. And then it'd take
collaboration and time to get to the absolute fastest lap times possible.
Certainly, an average crew and an average driver with average tires could
get a certain level of performance out of the race car. And that would tell
us something, but it would be more indicative of the quality of the test
crew and driver than the performance potential of the race car. If the goal
is absolute fastest possible lap times, you go with the pros.
I wouldn't try to optimally tune a Solaris or Linux OS, or an Apache or
Netscape web server, or an Oracle or DB2 relational engine without
soliciting expert help. Most of the time, the folks who wrote the code are
the absolute best possible experts to have available for extreme performance
potential tests, assuming they're available and you can afford them. If you
settle for something less than that, you're testing something other than
maximum performance potential.
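
For what it's worth, the kind of "out of the box" number Tim describes is
easy to characterize: same fixed workload, default settings, no vendor
hand-holding. Here's a minimal sketch of such a harness in plain Java; the
class name, placeholder workload, and iteration counts are my own
illustrative assumptions, and in a real EJB evaluation the measured call
would be a remote method on a deployed bean.

// A naive "out of the box" timing harness: run the same fixed
// workload against each server at default settings and compare.
public class NaiveBenchmark {

    // Placeholder for the operation under test; in a real EJB
    // evaluation this would be a remote call on a deployed bean
    // (hypothetical stand-in, not any vendor's API).
    private static int doCall(int i) {
        return i * i;
    }

    public static void main(String[] args) {
        final int warmup = 10000;      // let the JIT settle first
        final int iterations = 100000; // measured calls

        for (int i = 0; i < warmup; i++) {
            doCall(i);
        }

        long start = System.currentTimeMillis();
        int sink = 0;                       // keep results live so the
        for (int i = 0; i < iterations; i++) {  // loop isn't optimized away
            sink += doCall(i);
        }
        long elapsed = System.currentTimeMillis() - start;

        System.out.println("calls/sec = "
            + (iterations * 1000L / Math.max(elapsed, 1))
            + " (sink=" + sink + ")");
    }
}

The contested question isn't whether that harness is hard to write; it's
whose hands run it, and whether the defaults behind it were expertly tuned.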
That's a personal perspective. Not a vendor position.
Dwight
-----Original Message-----
From: Tim Endres [SMTP:[EMAIL PROTECTED]]
Sent: Tuesday, February 01, 2000 9:41 AM
To: [EMAIL PROTECTED]
Subject: Re: EJB Server Comparison (WebLogic, WebSphere, NetDynamics, GemStone)
> But that doesn't mean the evaluators can publish without consulting the
> vendors, nor can they publish anything the vendors have restricted. It
> would have been easier if the evaluators had contacted the vendors before
> publishing, to confirm that they used the right version of each vendor's
> product in their evaluation.
This is backwards. I would not consult the vendors in such a test.
If I require the vendors' input to get top performance from their
product, then their distribution is lacking. I should be able to
get top performance out of the box by reading their documentation.
If not, then their product is inferior to those that outperform it.
Further, the minute the vendor is in the loop, you can kiss your
objective testing goodbye, or you will never get your evaluation
out the door.
I would ignore their license and force them to sue me.
tim.
Tim Endres [EMAIL PROTECTED]
ICE Engineering, Inc. http://www.trustice.com/
"USENET - a slow moving self parody." - Peter Honeyman
To unsubscribe, send email to [EMAIL PROTECTED] and include in the body
of the message "signoff EJB-INTEREST". For general help, send email to
[EMAIL PROTECTED] and include in the body of the message "help".