Re: Reading and writing large arrays to disk

2020-01-07 Thread Chip Scheide via 4D_Tech
these size differences, compressing the blob, on any computer newer than about 1990 is negligible.
>
> Compressing the BLOB prior to writing slowed the process with times
> of about 20-21 milliseconds.
> However, the file size was 1/3 as big. Shrinking from 2.3MB to 0.74 MB.
> Note that the

Re: Reading and writing large arrays to disk

2020-01-06 Thread Mitchell Shiller via 4D_Tech
So I tested my setup: v17R5, macOS Mojave. My results were slightly different; your mileage may vary. I compared BLOB, JSON Stringify array, and OB SET ARRAY / JSON Stringify. For writing an array that exists in memory, BLOB was fastest, with average times of 6-9 milliseconds. JSA was in the
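A comparison like Mitchell's can be sketched roughly as follows. This is a hedged sketch, not his actual test code: $aStrings (pre-filled text array) and $tPath (scratch document path) are hypothetical names, and only the write side is timed.

```4d
  // Assumes $aStrings already holds the ~200k test elements
  // and $tPath points to a writable scratch location.
C_LONGINT($t0)
C_TEXT($tJSON)
C_BLOB($vBlob)

  // BLOB route: pack the typed array, then write it in one call
$t0:=Milliseconds
VARIABLE TO BLOB($aStrings;$vBlob)
BLOB TO DOCUMENT($tPath+".blob";$vBlob)
ALERT("BLOB: "+String(Milliseconds-$t0)+" ms")

  // JSON route: serialize the array to text, then write the text
$t0:=Milliseconds
$tJSON:=JSON Stringify array($aStrings)
TEXT TO DOCUMENT($tPath+".json";$tJSON)
ALERT("JSON: "+String(Milliseconds-$t0)+" ms")
```

Timing with Milliseconds before and after each route is the simplest way to reproduce the 6-9 ms figures on your own data.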

Re: Reading and writing large arrays to disk

2020-01-06 Thread Chip Scheide via 4D_Tech
Variable to blob, blob to document. Done as fast as possible from 4D.
Chip

> On Jan 6, 2020, at 10:50 AM, Kirk Brooks via 4D_Tech
> <4d_tech@lists.4d.com> wrote:
>
>> I agree with Chuck here - writing a line at a time is slow. It's very
>> secure though. So it's good if you may crash - whatever has

Re: Reading and writing large arrays to disk

2020-01-06 Thread Jim Crate via 4D_Tech
On Jan 6, 2020, at 10:50 AM, Kirk Brooks via 4D_Tech <4d_tech@lists.4d.com> wrote:

> I agree with Chuck here - writing a line at a time is slow. It's very
> secure though. So it's good if you may crash - whatever has already been
> written stays written to disk. But otherwise better to buffer

Re: Reading and writing large arrays to disk

2020-01-06 Thread Kirk Brooks via 4D_Tech
Peter, I agree with Chuck here - writing a line at a time is slow. It's very secure though. So it's good if you may crash - whatever has already been written stays written to disk. But otherwise better to buffer some and then write. After looking at the link Arnaud posted it looks like

Re: Reading and writing large arrays to disk

2020-01-06 Thread Charles Miller via 4D_Tech
The overhead for writing 1k is the same as writing 1 meg to disk. Writing one element at a time will take much longer.
Regards, Chuck

On Mon, Jan 6, 2020 at 4:18 AM Peter Bozek via 4D_Tech <4d_tech@lists.4d.com> wrote:
> Mitch,
>
> If the speed is the issue, you may try to write/read array

Re: Reading and writing large arrays to disk

2020-01-06 Thread Peter Bozek via 4D_Tech
Mitch, if speed is the issue, you may try to write/read array elements to disk using SEND PACKET / RECEIVE PACKET. If you want to speed up reading, insert the number of elements into the first line, like:

  Create document
  SEND PACKET(file;String(Size of array(arrayElement))+"\n")
  loop for all elements in
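Peter's idea of writing the element count first can be fleshed out like this. A hedged sketch, assuming $tPath and $aStrings are names of my own; the original post is truncated, so the loop bodies are my reconstruction, not his code.

```4d
C_TIME($vDoc)
C_LONGINT($i;$n)
C_TEXT($tLine)

  // Write: element count on the first line, then one element per line
$n:=Size of array($aStrings)
$vDoc:=Create document($tPath)
If (OK=1)
	SEND PACKET($vDoc;String($n)+"\n")
	For ($i;1;$n)
		SEND PACKET($vDoc;$aStrings{$i}+"\n")
	End for
	CLOSE DOCUMENT($vDoc)
End if

  // Read: the first line tells us the count, so the array is
  // sized once up front instead of growing element by element
$vDoc:=Open document($tPath)
If (OK=1)
	RECEIVE PACKET($vDoc;$tLine;"\n")
	$n:=Num($tLine)
	ARRAY TEXT($aStrings;$n)
	For ($i;1;$n)
		RECEIVE PACKET($vDoc;$tLine;"\n")
		$aStrings{$i}:=$tLine
	End for
	CLOSE DOCUMENT($vDoc)
End if
```

Pre-sizing the array from the count line avoids repeated reallocation on the read side, which is the point of the first-line trick.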

Re: Reading and writing large arrays to disk

2020-01-06 Thread Arnaud init5 imap via 4D_Tech
> On Jan 6, 2020, at 05:48, Kirk Brooks via 4D_Tech <4d_tech@lists.4d.com> wrote:
>
> Miyako,
> Nice explanation.
>
> More or less - where do you think the inflection point is where simple text
> concatenation becomes less efficient than adding the text to a BLOB?

Hi Kirk, something

Re: Reading and writing large arrays to disk

2020-01-06 Thread Arnaud init5 imap via 4D_Tech
> On Jan 6, 2020, at 05:09, Mitchell Shiller via 4D_Tech <4d_tech@lists.4d.com> wrote:
>
> Hi,
>
> I have large string arrays (about 200 k elements).
> I need to write and read them to disk.
> Speed is the most important criterion.
>
> Options
> 1) Create a text variable (loop with CR

Re: Reading and writing large arrays to disk

2020-01-05 Thread Kirk Brooks via 4D_Tech
Miyako,
Nice explanation.

More or less - where do you think the inflection point is where simple text concatenation becomes less efficient than adding the text to a BLOB?

On Sun, Jan 5, 2020 at 8:20 PM Keisuke Miyako via 4D_Tech <4d_tech@lists.4d.com> wrote:
> concatenation of text is

Re: Reading and writing large arrays to disk

2020-01-05 Thread Keisuke Miyako via 4D_Tech
Concatenation of text is intuitive, but inefficient. Every time you add text to another, a new buffer is created and the whole text is copied there. As the text gets larger, the allocation of memory and copying of data will become slower. Variable to blob is fast as it simply packs the
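Miyako's point about repeated copying can be avoided by appending each element into a growing BLOB at a tracked offset, so only the new bytes are written on each pass. A sketch under my own assumptions ($aStrings is a pre-filled text array; the CR delimiter mirrors option 1 in the original question):

```4d
C_BLOB($vBlob)
C_LONGINT($i;$offset)

  // TEXT TO BLOB with an offset variable writes at that position,
  // resizes the BLOB as needed, and advances the offset for us -
  // no full-buffer copy per element, unlike text concatenation
$offset:=0
For ($i;1;Size of array($aStrings))
	TEXT TO BLOB($aStrings{$i}+Char(Carriage return);$vBlob;UTF8 text without length;$offset)
End for
  // $offset now equals BLOB size($vBlob); write it out in one call
```

The offset parameter is what makes this append-style loop cheap: without it, each iteration would rewrite the whole buffer, which is exactly the cost Miyako describes for text concatenation.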

Reading and writing large arrays to disk

2020-01-05 Thread Mitchell Shiller via 4D_Tech
Hi, I have large string arrays (about 200 k elements). I need to write and read them to disk. Speed is the most important criterion.

Options:
1) Create a text variable (loop with CR delimiter) and then TEXT TO DOCUMENT.
2) VARIABLE TO BLOB, COMPRESS BLOB, BLOB TO DOCUMENT
3) OB SET ARRAY, JSON
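Option 2 is a short round trip in 4D. A hedged sketch, assuming a hypothetical text array $aStrings and document path $tPath (not from the original post):

```4d
C_BLOB($vBlob)

  // Write: pack the typed array, compress, save in one call each
VARIABLE TO BLOB($aStrings;$vBlob)
COMPRESS BLOB($vBlob)
BLOB TO DOCUMENT($tPath;$vBlob)

  // Read: reverse the steps - load, expand, unpack
DOCUMENT TO BLOB($tPath;$vBlob)
EXPAND BLOB($vBlob)
BLOB TO VARIABLE($vBlob;$aStrings)
```

Because VARIABLE TO BLOB / BLOB TO VARIABLE preserve the array's type and element count, no delimiter parsing is needed on the way back in; COMPRESS BLOB is optional and trades a little CPU time for a smaller file, as the thread's timings show.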