In message <[EMAIL PROTECTED]>, Paul McVicker
<[EMAIL PROTECTED]> writes
>I am dealing with a program processing about 60 million records. I am trying
>to load some of my reference files into dynamic arrays to improve the
>processing speed by eliminating physical I/O reads. I am concerned about the
>amount of memory I am using and was trying to get a handle on the actual
>memory usage.
If you want speed, try using static arrays. Do you know how many fields
are supposed to be in the records? DIM an array of the correct size and
MATREAD them.
If you don't know how big the records are, you can read them into a
dynamic array first, then do a DIM and MATPARSE (provided you're in IDEAL
or PI mode).
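Something along these lines, in UniVerse BASIC - a rough sketch only, and
the file name, record ID, and field count here are made up for the example:

    * Known field count: DIM once, then MATREAD straight into the array.
    DIM REF.REC(10)
    OPEN 'REF.FILE' TO F.REF ELSE STOP 'Cannot open REF.FILE'
    MATREAD REF.REC FROM F.REF, REC.ID ELSE MAT REF.REC = ''
    *
    * Unknown field count: READ into a dynamic array, count the fields,
    * then DIM to size and MATPARSE (IDEAL/PI flavour assumed).
    READ DYN.REC FROM F.REF, REC.ID THEN
       NFIELDS = DCOUNT(DYN.REC, @FM)
       DIM REF.REC(NFIELDS)
       MATPARSE REF.REC FROM DYN.REC
    END

Once it's in the static array, REF.REC(N) gets you field N directly,
with no dynamic-array scanning on each access - that's where the speed
comes from.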
Cheers,
Wol
--
Anthony W. Youngman <[EMAIL PROTECTED]>
'Yings, yow graley yin! Suz ae rikt dheu,' said the blue man, taking the
thimble. 'What *is* he?' said Magrat. 'They're gnomes,' said Nanny. The man
lowered the thimble. 'Pictsies!' Carpe Jugulum, Terry Pratchett 1998
Visit the MaVerick web-site - <http://www.maverick-dbms.org> Open Source Pick
-------
u2-users mailing list
[email protected]
To unsubscribe please visit http://listserver.u2ug.org/