Hello to the SAPDB community,
I'd love to build a 300 million row table, or, short of that, the biggest
possible on a desktop machine.
Right now I am testing the three available open source SQL engines --
MySQL, Firebird, and SAPDB -- in order to figure out which one handles a
large DB the fastest on a desktop configuration.
I am logging the time it takes to insert a 48-byte row and maintain three
indices, one of which includes a float column filled with semi-random
data.
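For reference, the benchmark loop looks roughly like the sketch below. It uses Python's built-in sqlite3 module as a stand-in for the ODBC connection to SAP DB (the table name, column names, and 36-byte filler are made up for illustration); the real test drives the same kind of INSERT through ODBC.

```python
import random
import sqlite3
import time

# Stand-in for the ODBC connection to SAP DB; the schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE bench (
        id      INTEGER,      -- part of the ~48-byte row
        payload CHAR(36),     -- filler to reach roughly 48 bytes per row
        val     REAL          -- semi-random float, indexed below
    )
""")
# Three indices, one of which covers the float column.
conn.execute("CREATE INDEX idx_id ON bench (id)")
conn.execute("CREATE INDEX idx_payload ON bench (payload)")
conn.execute("CREATE INDEX idx_val ON bench (val)")

BATCH = 10_000
total = 0
for _ in range(3):                          # a few batches of 10,000 rows
    start = time.monotonic()
    for i in range(BATCH):
        conn.execute(
            "INSERT INTO bench VALUES (?, ?, ?)",
            (total + i, "x" * 36, random.random()),
        )
    conn.commit()
    total += BATCH
    elapsed = time.monotonic() - start
    print(f"{total:>9,} rows  {elapsed:5.1f} s for the last {BATCH:,} inserts")
```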
I am using
- 800 MHz machine with 300+ MB RAM and a Barracuda IDE drive
- Win2K SP3
- Kernel 7.4.3, Build 010-120-035-462
- ODBC
- local connection
- Database instance: SAP DB OLTP
- single log mode set to off.
- 1GB data volume
I am looking for raw INSERT speed. Data security and integrity are
secondary factors.
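Since durability is secondary here, one thing I have been experimenting with is sending each batch through a single prepared statement and committing once per batch rather than once per row, along these lines (again sqlite3 as a stand-in; whether SAP DB's ODBC driver benefits to the same degree is an open question):

```python
import random
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bench (id INTEGER, val REAL)")

# One executemany call per batch instead of one execute per row, and a
# single commit at the end, so the log is flushed far less often.
rows = [(i, random.random()) for i in range(10_000)]
conn.executemany("INSERT INTO bench VALUES (?, ?)", rows)
conn.commit()
```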
First quick comments on the installation process if I may ...
1) Getting started section: if there is one, I missed it. It would be a
welcome addition.
2) Some search capability in the manual would be useful, although it is
well written and well organised.
3) Getting started section:
    a) mention what the log is for. I set the log size to a low value
       and it locked the DB after a few thousand rows.
    b) mention that you need to use the dba account to connect to SQL
       Studio ...
4) CACHE_SIZE: it's not clear what 10,000 represents. 10 MB, I presume.
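If I am reading the docs right, CACHE_SIZE is counted in pages rather than bytes, and I believe SAP DB 7.4 uses 8 KB pages -- which would put 10,000 closer to 80 MB than 10 MB. A quick back-of-the-envelope check (the page size is my assumption, so please correct me):

```python
# Assumption: CACHE_SIZE is counted in pages, and a page is 8 KB (8192 bytes).
PAGE_SIZE_BYTES = 8 * 1024
cache_pages = 10_000

cache_bytes = cache_pages * PAGE_SIZE_BYTES
cache_mb = cache_bytes / (1024 * 1024)
print(f"CACHE_SIZE = {cache_pages:,} pages -> {cache_mb:.1f} MB")
```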
Performance:
Adding 10,000 rows took 50 seconds at first.
I then set single log mode to off.
Insert benchmark: time to insert each batch of 10,000 rows of semi-random
data. The left column is the resulting total number of rows, the right
column the time for the last 10,000 inserts.
Performance was acceptable up to 600,000 rows (about 40 s per 10,000
inserts), then:
800,000 1:02
1,000,000 1:28
1,060,000 2:01
1,200,000 2:41
1,500,000 3:12
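For comparison, the per-10,000 timings above translate into the following insert rates -- a quick conversion of the figures already listed, nothing more. The rate falls from roughly 250 rows/s at 600,000 rows to about 52 rows/s at 1.5 million:

```python
# Convert the "mm:ss per 10,000 inserts" figures above into rows/second.
timings = {
      800_000: "1:02",
    1_000_000: "1:28",
    1_060_000: "2:01",
    1_200_000: "2:41",
    1_500_000: "3:12",
}

rates = {}
for total_rows, mmss in timings.items():
    minutes, seconds = mmss.split(":")
    elapsed = int(minutes) * 60 + int(seconds)
    rates[total_rows] = 10_000 / elapsed
    print(f"{total_rows:>9,} rows  {elapsed:3d} s  {rates[total_rows]:6.0f} rows/s")
```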
If you have any fine-tuning tips or suggestions to improve performance,
please let me know.
Thanks,
_______________________________________________
sapdb.general mailing list
[EMAIL PROTECTED]
http://listserv.sap.com/mailman/listinfo/sapdb.general