Hi Walter,

You raise a lot of interesting points. I hadn't realised TQuery could
be slower than TTable on a database like Access. I normally use a lot
of TTable components in this type of application, but I've been saying
since the app was first designed that a database like SQL Server would
be better, so I used TQuery to make it easier to change in future.

My memory must be worse than I thought: I just checked the size of the
database and it's about 100Mb. :) Larger than I remembered, but still
much smaller than your example. Since the app is only used by one
person at a time, we moved the database from a server to the local
machine several months ago, which helped for a while. I'm not sure of
the PC spec but it could be an issue. It's a fairly recent HP box, but
our IT dept aren't known for spending a lot of cash, so it won't be a
top-spec machine.

The app is used in a manufacturing process, printing rolls of labels
with unique serial numbers. Each roll has 1,000 labels and is created
by a while loop that queries for the next serial number, prints a
label, logs its details in the db, then repeats. I suggested faster
methods than checking against every serial number ever printed before
printing every label, but my bosses insisted we do it this way to be
100% sure we never print a duplicate. The while loop is the problem:
the time taken to print a roll has approximately doubled in just over
a year, and the only change is the size of the table it queries. Other
queries are taking longer too, but the effect there is negligible. The
app prints several different rolls of labels, and the query has two
different where clauses depending on which part is being produced. I
haven't used parameters; I build the query using TQuery.SQL.Add(), and
its two forms are:

select `SerialNo` from LOG
where `LabelPartNo` = '5205-035'
and `Year` = '06'
and `DateCode` = '4Z'
order by `SerialNo`

select `SerialNo` from LOG
where `RevFamily` = '1'
and `Year` = '06'
and `DateCode` = '4Z'
order by `SerialNo`
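
If I do switch to parameters, I imagine the first form would become
something like this (untested sketch; qryLog is just a placeholder
component name):

    qryLog.SQL.Text :=
      'select SerialNo from LOG ' +
      'where LabelPartNo = :PartNo and Year = :Yr and DateCode = :DC ' +
      'order by SerialNo';
    qryLog.Prepare;  // prepared once, instead of rebuilt with SQL.Add()

    // per roll, only the parameter values change
    qryLog.ParamByName('PartNo').AsString := '5205-035';
    qryLog.ParamByName('Yr').AsString     := '06';
    qryLog.ParamByName('DC').AsString     := '4Z';
    qryLog.Open;

with the second form differing only in the where clause (RevFamily
instead of LabelPartNo).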

I'm going to improve the indexes (as per another post) and upgrade the
RAM to begin with. I'll keep the pressure on my boss to upgrade Delphi
so I can switch to ADO, and then think about SQL Server Express.
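
For what it's worth, the kind of faster method I suggested would have
the database return just the highest serial number instead of the
whole list, something like this (sketch only; assumes serial numbers
for a given part/year/date code only ever increase):

    select max(SerialNo) as LastSerial
    from LOG
    where LabelPartNo = '5205-035'
    and Year = '06'
    and DateCode = '4Z'

and then print from LastSerial + 1 onwards, rather than re-checking
every serial number ever printed for every label.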

Thanks,
Dave.

Dave Murray
Delphi-En Moderator
Glasgow, UK
[EMAIL PROTECTED]


--- In [email protected], Walter Prins <[EMAIL PROTECTED]> wrote:
>
> Firstly, for file-based databases you really want to try to use
> components which properly marry up to the underlying access model.
> For file-based databases this means using TTable with indexes and
> SetRange wherever possible, in preference to TQuery, since the BDE
> isn't really a very good "database engine" and doesn't really
> optimise queries.  As a result it will often effectively perform a
> full linear file read / "table scan", whereas TTable with SetRange
> and FindKey/Locate would merely position a cursor on a file handle.
> However, in your case I'm not sure you'll get the full benefit via
> your access path (i.e. BDE to Access), as there might be some unholy
> translation going on.  ADO would probably be a lot better, since
> TADOTable is almost certain to access the underlying Jet engine
> properly.  While I don't recommend that anyone use Paradox anymore,
> due to the status of it and the BDE, it is true that you'd have been
> far happier with Paradox and the BDE than with Access and the BDE.
> You can do some pretty efficient things with Paradox and the BDE,
> even over a slow network, when used carefully.  (But, as I've said,
> stay away from TQuery, and optimise access by using indexes and
> locates on the file yourself...)
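> 
> A rough sketch of that pattern (the table component, index name and
> field values here are purely hypothetical):
> 
>     // Assumes LOG has a secondary index on (LabelPartNo, Year, DateCode)
>     tblLog.IndexName := 'ByPartYearDate';      // hypothetical index name
>     tblLog.SetRange(['5205-035', '06', '4Z'],  // start of range
>                     ['5205-035', '06', '4Z']); // end of range
>     // The BDE just positions a cursor on the index, so only matching
>     // rows are ever read instead of scanning the whole file.
>     while not tblLog.Eof do
>     begin
>       // process tblLog.FieldByName('SerialNo') here...
>       tblLog.Next;
>     end;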
> 
> In actuality, 30Mb is IMO an almost paltry amount of data compared
> to the RAM and resources available on today's PCs and networks.
> What I mean is that if you are getting complaints, it suggests to me
> that the client machines might be reading more data (all of it?)
> than is really necessary, more frequently than is necessary
> (probably due to inefficiencies introduced by the data access
> layer/components themselves).  To put this comment in perspective:
> a few years ago one of the systems we look after was still
> Paradox/file based.  The total database size was at that point
> approximately 800-900Mb.  The largest file/table was probably about
> 400Mb and contained approximately 1.8 million records; there were
> two or three others near a million records, and most of the other
> significant ones were tens of thousands.  About 10-15 concurrent
> users used this database over a 100Mbit LAN, and for the most part
> this worked exceedingly well.  Looking up records in the largest
> files/tables was still nearly instantaneous despite the size, as
> they used suitable indexes together with TTable, FindKey and
> SetRange.  Things bogged down, however, with complicated reporting
> queries where tables had to be related to each other, so you had no
> choice but to read large amounts of information over the network
> from multiple tables to produce reports (since obviously there was
> no centralised database server with the bandwidth to cope with such
> things).  The point is that this is an order of magnitude more
> information than is in your database and it was still quite workable
> (apart from the reporting requirements and a few other irritating
> management/maintenance problems), so I'm fairly sure you can improve
> your situation.
> 
> Having said all that, it might just be worth your while to move to a 
> client server based database sometime soon... (but that might be a 
> lot of work...)
> 
> I guess I should've asked: what type of machines (spec) are running
> this system, how fast is the network, and can you give an idea of
> how you access the data in the database (what type of queries and
> so on)?
> 
> Hope that helps,
> 
> Walter
> 
> 
> 
> Dave Murray wrote:
> 
> >Hi,
> >
> >I've got a Delphi app that uses an Access database via BDE (Delphi 
> >4 so no ADO). At the moment there are about 600,000 records in the 
> >main table and it is growing by about 12,000 - 15,000 records a 
> >week. The database itself is currently over 30Mb in size. The app 
> >runs a lot of queries against the main table and has been slowing 
> >down as it grows. The speed is now getting to the point where the 
> >users are complaining.
> >
> >So am I reaching the limits of what I can do with an Access 
> >database? Would I get a speed increase by switching to Delphi 2006 
> >and using ADO? Is it time to start looking at SQL Server?
> >
> >Thanks,
> >Dave.
> >