Lee,
 
SET SCRATCH OFF causes all temp and sort files to be created on the current drive and directory.  If your
current drive and directory is on the network drive, this will slow the system down considerably, as both
PCs' sort and temp files will be pushed back and forth over the network.  The temp files are often not
that big, but the peer-to-peer "server" does not handle the database file and these temp file handles at the
same time very well.
  Try SET SCRATCH C:\Temp or something similar.
 
Try setting FEEDBACK OFF.  Feedback can slow processing down somewhat as well, and in most cases it is not needed.
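Both changes can be made at the R> prompt or placed in a startup command file.  A minimal sketch, assuming standard R:BASE SET syntax and that the C:\Temp directory already exists on the local drive:

```
SET SCRATCH C:\Temp
SET FEEDBACK OFF
```

If memory serves, the SHOW command will list the current environment settings so you can confirm both took effect.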
 
You do not mention whether your 6.5 is DOS or Windows.  I have not used DOS for a long time, but I seem
to remember that the BUFFERS and FILES settings could cause some speed issues.  Check to make sure they are set large enough.  If you are using the DOS version, try setting the compatibility mode on the icon that calls up the program to an earlier Windows version such as 98.  (Assuming you are running XP.)
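For the DOS version, FILES and BUFFERS live in CONFIG.SYS.  A sketch of the kind of values people used; the exact numbers here are only a guess and should be tuned for your setup:

```
FILES=120
BUFFERS=40
```

Each open database, temp, and command file consumes a file handle, so a multi-file R-Base database can exhaust a low FILES setting quickly.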
 
The last possibility could very well be your router.  Not knowing what type you have: many routers are made for home use and do not
have true built-in switches, but rather a version of the old hub technology.  Your description sounds like it could very well be router-related.  A multi-PC network can only run as fast as the mechanism that directs the data packets between the acting server and the requesting PC.  You could have the fastest PCs, hard drives, and network cards in the world, but if your hub/switch does not handle multiple requests efficiently, you will see unacceptable performance.  Try using a true switch instead of the router; they are inexpensive.
 
Good luck  - Bob
 
-------------- Original message --------------
From: "Lee Bailey" <[EMAIL PROTECTED]>

Hello All-

 

I’ve been monitoring the users group for a long time, and have been impressed with the expertise revealed in exchanged emails.  I am hoping that one of you networking gurus can point me in the right direction in solving a perplexing problem.

 

I have been an avid R-Base user since DOS 2.0.  I am currently running version 6.5, and have been for about 5 years.

 

Here’s the setup—

Machine #1 is a 2.2 GHz 64-bit machine with 2 GB of RAM.  Machine #2 is a 1.9 GHz 32-bit machine with 1 GB of RAM.  Both machines are tied together via a router with a transfer rate of 100 Mbps.  The router also provides access, for both machines, to the Internet via a DSL modem, also attached to the router.

 

The 20 MB database files (RB1, RB2, RB3, and RB4) are on machine #1, with application command files on both machines to speed processing against the common database.  Being an old R-Base guy, much of the programming is done via command files, crunching a lot of data, to achieve the full relational database power.

 

Here’s the problem--

When machine #1 accesses the database as a sole user, the applications, command files, forms, reports, etc. run very quickly.  When machine #2 accesses the database solely, applications also run very quickly.  When both machines are utilizing the database simultaneously, machine #1 still works great (the database is on this machine), but machine #2 drops to a painful crawl.

 

If both computers are utilizing the database and machine #1 exits the database, machine #2 continues to work painfully slowly, even though it has become the sole user of the database at that time.

 

Settings are: staticdb on, fastfk off, scratch off, multi on, ansi off, feedback on, rules off, rowlocks are used, column verify, precedence on, and sort menu on.

 

My thought is that at a transfer rate of 100 Mbps between the machines, a slow response from machine #2 should never happen, no matter what.

 

Any ideas as to what is going on, and what can be done to rectify the situation?

 
Lee
 
Bailey & Associates
E-Mail: [EMAIL PROTECTED]
Phone: 772-597-0040
Fax: 772-597-0043
