Dwight Rexin <[EMAIL PROTECTED]> writes:
> There's 2 things going on here. Tim is talking about "out of the box"
> performance achievable by reasonably knowledgeable individuals. And that's
> an interesting measure in and of itself.
And a third: results achievable by experts not from the vendor. You are
assuming that a high-performance test cannot be done without someone from
the vendor. I say, "Nonsense."
> The other is a measure of absolute best performance potential. In other
> words "What's the absolute fastest this race car can go on this race track
> in these weather conditions?" which is a very different measure. You'd want
> the chassis designer, engine designer, factory tire guys, best mechanics,
> and best drivers on hand for such an enterprise. And then it'd take
> collaboration and time to get to the absolute fastest lap times possible.
And what exactly is the point of this test? What is the point of my HiFi amp
being able to reproduce perfect sound when I am using a scratchy LP as my
input source?
You are not going to come in and configure my system, any more than a pit crew
will tune up my car. In fact, if I need your engineers to get the performance
results you claim, then your product is useless to me.
I need to know what a competent group of engineers can accomplish.
Further, since those same engineers are working across all the products in
the test, it seems like a far more appropriate comparison than what you propose.
> Certainly, an average crew and an average driver with average tires could
> get a certain level of performance out of the race car. And that would tell
> us something, but it would be more indicative of the quality of the test
> crew and driver than the performance potential of the race car. If the goal
> is absolute fastest possible lap times you go with the pros.
>
> I wouldn't try to optimally tune a Solaris or Linux OS, or an Apache or
> Netscape web server, or an Oracle or DB2 relational engine without
> soliciting expert help. Most of the time the folks that wrote the code are
> the absolute best possible experts to have available for extreme performance
> potential tests. Assuming they're available and you can afford them. If you
> settle for something less than that you're testing something other than
> maximum performance potential.
No. I am testing what I can expect from your product in the real world.
If I cannot achieve your results, then what difference does it make
how good they are?
I see no problem with the test you propose coming from "The Vendor". That
is the whole point. Your test says, "Here is what we can do with full knowledge
of the internals of the system." I expect that from a vendor, and read it
with full knowledge of the bias. However, I like to see completely independent
third-party tests as well, and I understand the "non-bias" in those tests.
If you, as the vendor, feel the report is flawed, then you have a website,
and can make your case just like anyone else. I think anyone producing a
test report should feel obliged to link to your "test rebuttal", which will
allow me to see where you feel the test is flawed, and allow me to evaluate
what I think that means in the overall comparison.
HOWEVER, seeing that report DISAPPEAR from the Internet is *CHILLING*!!!
It strikes me as censorship, and it reminds me of the Thought Ministry.
tim.
Tim Endres [EMAIL PROTECTED]
ICE Engineering, Inc. http://www.trustice.com/
"USENET - a slow moving self parody." - Peter Honeyman
===========================================================================
To unsubscribe, send email to [EMAIL PROTECTED] and include in the body
of the message "signoff EJB-INTEREST". For general help, send email to
[EMAIL PROTECTED] and include in the body of the message "help".