I am trying to reply to the thread "Scalability of connection numbers of client-server
solution with Firebird 3.0?".
1. I am not receiving messages from the Yahoo Firebird group (I don't want them
coming into my mail if I can always read them on the web, in the nice threaded
format and with history), but I am not receiving answers to my questions either -
so - I don't know how to respond to them technically. I am really not happy about
mailing lists; forums and GitHub issue forms are a far better way of communicating.
2. I can do, and am doing, proper transaction management with IBX/Firebird and
with Zeos as well; no need for further enhancements. Currently my MON$ATTACHMENTS
lists some 30 connections and my MON$TRANSACTIONS lists some 20-25 transactions,
none older than 20 minutes. I guess that is OK. Generally I open/close a
transaction for each read and save operation, and while the user interacts with
the form I use COMMIT RETAINING, with a final COMMIT when the form is closed
(sketched below). I guess that is right? That should be the proper balance that
avoids long-running transactions and also avoids recreating transactions for
each minor save or read.
Of course, I can migrate to FireDAC, but I don't see it as a silver bullet for
any improvement if the MON$ATTACHMENTS and MON$TRANSACTIONS data look good with
the old IBX/Zeos as well. I cannot see how I could improve it further.
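
For clarity, below is roughly the pattern I mean, as a minimal Delphi/IBX sketch
(not my real code - procedure, table and column names like SaveOneRow, DOCS and
EDITED_AT are made up for illustration, and unit names may differ between
Delphi/IBX versions):

  uses
    DB, IBDatabase, IBQuery;  // or IBX.IBDatabase, IBX.IBQuery in newer Delphi

  { One short transaction per read/save: start, execute, hard commit. }
  procedure SaveOneRow(ADatabase: TIBDatabase; ADocId: Integer);
  var
    Trx: TIBTransaction;
    Qry: TIBQuery;
  begin
    Trx := TIBTransaction.Create(nil);
    Qry := TIBQuery.Create(nil);
    try
      Trx.DefaultDatabase := ADatabase;
      Qry.Database := ADatabase;
      Qry.Transaction := Trx;

      Trx.StartTransaction;
      Qry.SQL.Text :=
        'UPDATE DOCS SET EDITED_AT = CURRENT_TIMESTAMP WHERE ID = :ID';
      Qry.ParamByName('ID').AsInteger := ADocId;
      Qry.ExecSQL;
      Trx.Commit;              // hard commit - nothing is left running
    finally
      Qry.Free;
      Trx.Free;
    end;
  end;

  { While a document form stays open: intermediate saves use COMMIT RETAINING
    so the open datasets survive; closing the form does the final hard COMMIT. }
  procedure SaveFromForm(ATrx: TIBTransaction; AFormIsClosing: Boolean);
  begin
    if AFormIsClosing then
      ATrx.Commit             // final COMMIT on form close
    else
      ATrx.CommitRetaining;   // intermediate save while the user keeps working
  end;

  { The check I run against the monitoring tables (from isql or any SQL tool):

      SELECT COUNT(*)
      FROM MON$TRANSACTIONS
      WHERE MON$TIMESTAMP < DATEADD(-20 MINUTE TO CURRENT_TIMESTAMP);
  }

SaveFromForm would be called from the form's save button and from its OnClose
handler; that is the whole pattern.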
3. And I have never understood (and I still don't understand) what kind of data
caching there can be for ERP/accounting/manufacturing management systems, either
on the client side or on the server side. Well - my application caches (in
detached, memory-only ClientDataSets) some short, common classifiers like the
list of document types, the list of account codes, the list of warehouses, or
the user preferences (all of which are loaded during application startup - see
the sketch further below). But anything beyond that cannot be cached: the catalog
of goods is enormous - more than 20,000 items - how can I cache that? The catalog
of customers is larger still (more than 100,000). The list of documents keeps
growing, and so on. I simply don't understand what miracles FireDAC can do there.
Will it run some AI analytics to decide which chunks of data are commonly used
and rarely mutable? I have great doubts about the possibility of such analytics
and its reliability. The database can do some caching based on data access
patterns, but FireDAC cannot.
OK, I can understand caching for some web pages, web shops or news pages, but
there is little that can be cached (automatically or manually) in ERP
applications with OLTP and dynamic reports.
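
To make the classifier caching in point 3 concrete, this is roughly how one of
those small lists is loaded into a detached, memory-only ClientDataSet at
startup (again only a sketch - DOC_TYPES, the field names and LoadDocTypeCache
are placeholders, and DBClient is Datasnap.DBClient in newer Delphi versions):

  uses
    DB, DBClient, IBDatabase, IBQuery;

  { Load one small classifier into a detached, memory-only TClientDataSet.
    No provider is attached, so the cache lives only in client memory. }
  function LoadDocTypeCache(ADatabase: TIBDatabase): TClientDataSet;
  var
    Trx: TIBTransaction;
    Qry: TIBQuery;
  begin
    Result := TClientDataSet.Create(nil);
    Result.FieldDefs.Add('ID', ftInteger);
    Result.FieldDefs.Add('NAME', ftString, 100);
    Result.CreateDataSet;                     // in-memory dataset, no provider

    Trx := TIBTransaction.Create(nil);
    Qry := TIBQuery.Create(nil);
    try
      Trx.DefaultDatabase := ADatabase;
      Qry.Database := ADatabase;
      Qry.Transaction := Trx;

      Trx.StartTransaction;                   // short transaction, as above
      Qry.SQL.Text := 'SELECT ID, NAME FROM DOC_TYPES ORDER BY ID';
      Qry.Open;
      while not Qry.Eof do
      begin
        Result.AppendRecord([Qry.FieldByName('ID').AsInteger,
                             Qry.FieldByName('NAME').AsString]);
        Qry.Next;
      end;
      Qry.Close;
      Trx.Commit;                             // cache stays usable afterwards
    finally
      Qry.Free;
      Trx.Free;
    end;
  end;

The result is hooked to the lookup combos and grids via a TDataSource and is not
refreshed until the next application start; that works for a few hundred rows of
document types or warehouses, but obviously not for 20,000 goods or 100,000
customers.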
Thx, J.
