I wrote a simple CMD file that selects three values from a table (an integer,
a text 10 & a text 25 column) and displays the result. The integer values run
from 1 to 1000. I put the SELECT & WRITE inside a WHILE loop and incremented an
integer variable so that it runs through the loop 1000 times, selecting the row
with the matching integer value & displaying the results each pass. The CMD
file also calculates the elapsed time & displays that value.
The integer column in the table is indexed.
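The loop was shaped roughly like the sketch below -- the table, column &
variable names here are just placeholders, and the timing lines are only
approximate (the real CMD file computes the difference & WRITEs it):

*( timing-test sketch -- placeholder names, syntax approximate )
SET VAR vstart = .#TIME
SET VAR vcount = 1
WHILE vcount <= 1000 THEN
   *( select & display the row whose indexed integer matches the counter )
   SELECT intcol, txt10, txt25 FROM testtab WHERE intcol = .vcount
   SET VAR vcount = (.vcount + 1)
ENDWHILE
SET VAR vend = .#TIME
*( elapsed time is the difference between vend & vstart )
SHOW VAR vstart
SHOW VAR vend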
I got the following results.
Database on      MULTI setting   Version      Time, sec   Speed factor
---------------  --------------  -----------  ----------  -------------
Local drive C    ON              6.5++ DOS         4.80
Local drive C    OFF             6.5++ DOS         1.21    4x faster
Network          ON              6.5++ DOS        27.85
Network          OFF             6.5++ DOS         1.23    18.6x faster
                 (speed ratio, Local vs Network with MULTI ON = 4.6)
-----------------------------------------------------------------------
Local drive C    ON              6.5++ WIN         3.57
Local drive C    OFF             6.5++ WIN         1.32    2.7x faster
Network          ON              6.5++ WIN        17.14
Network          OFF             6.5++ WIN         1.36    12.6x faster
                 (speed ratio, Local vs Network with MULTI ON = 4.6)
=======================================================================
The times are averages over several runs. I used computers (with approximately
the same processor speed) running both Win98 & Win2000 and got similar results.
The network server is running Win2000.
The 6.5++ WIN version seems to run a little faster with MULTI ON and a little
slower with MULTI OFF than 6.5++ DOS, but the ratio of the MULTI ON speed for
Local vs Network is about 4.6 for both the DOS & WIN versions.
The speed with MULTI OFF does not vary significantly between Local and Network.
Why does SET MULTI ON cause such a huge speed degradation? And why would MULTI
ON make it run so much slower on a network drive than on a local drive, when
MULTI OFF runs at about the same speed on either?
Has anyone else observed this kind of behavior?
Frank Radice
[EMAIL PROTECTED]