Re: Saving and loading large data sets easily and efficiently

2019-10-03 Thread Brett via Digitalmars-d-learn
On Thursday, 3 October 2019 at 14:38:35 UTC, Bastiaan Veelo wrote: On Monday, 30 September 2019 at 20:10:21 UTC, Brett wrote: [...] The way the data is structured is that I have a master array of non-ptr structs. E.g., S[] Data; S*[] OtherStuff; then every pointer points to an element in
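A minimal sketch of the save-side half of that idea, assuming a hypothetical element type S and a hypothetical helper toIndices, and assuming every pointer in OtherStuff really does land inside Data: each pointer can be replaced by its offset into the master array before anything is written out.

    struct S { double x; int id; }          // hypothetical POD element

    // Replace each pointer into Data with its index, so the result
    // holds no addresses and can be stored directly.
    size_t[] toIndices(const S[] data, const S*[] otherStuff)
    {
        auto indices = new size_t[otherStuff.length];
        foreach (i, p; otherStuff)
            indices[i] = cast(size_t)(p - data.ptr);   // element offset
        return indices;
    }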

Re: Saving and loading large data sets easily and efficiently

2019-10-03 Thread Bastiaan Veelo via Digitalmars-d-learn
On Monday, 30 September 2019 at 20:10:21 UTC, Brett wrote: [...] The way the data is structured is that I have a master array of non-ptr structs. E.g., S[] Data; S*[] OtherStuff; then every pointer points to an element in Data. I did not use ints as "pointers" for a specific
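Going the other way after a load is the same idea in reverse; a rough sketch, again with the hypothetical S and a hypothetical helper toPointers, that rebuilds the pointer array from stored indices against the freshly loaded master array:

    // Rebase the stored indices onto the reloaded Data array.
    S*[] toPointers(S[] data, const size_t[] indices)
    {
        auto otherStuff = new S*[indices.length];
        foreach (i, idx; indices)
            otherStuff[i] = &data[idx];   // valid for as long as data lives
        return otherStuff;
    }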

Re: Saving and loading large data sets easily and efficiently

2019-10-01 Thread JN via Digitalmars-d-learn
On Monday, 30 September 2019 at 20:10:21 UTC, Brett wrote: So it's much more difficult than POD but would still be a little more work to write... hoping that there is something already out there that can do this. It should be I'm afraid there's nothing like this available. Out of serialization

Saving and loading large data sets easily and efficiently

2019-09-30 Thread Brett via Digitalmars-d-learn
I have done some large computations where the data set is around 10GB and the run takes several minutes. Rather than rerunning it every time and regenerating the same data, can I simply save it to disk easily? The data is ordered in arrays and structs. It's just numbers/POD except some arrays use
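For the flat part of such data, a minimal sketch of a raw binary dump and reload, assuming a hypothetical POD element type S and hypothetical helpers saveFlat/loadFlat built on std.stdio's rawWrite/rawRead:

    import std.stdio : File;

    struct S { double x; int id; }        // hypothetical flat POD element

    void saveFlat(string path, const S[] data)
    {
        auto f = File(path, "wb");
        ulong len = data.length;
        f.rawWrite((&len)[0 .. 1]);       // element count first
        if (data.length)
            f.rawWrite(data);             // then the raw struct bytes
    }

    S[] loadFlat(string path)
    {
        auto f = File(path, "rb");
        ulong len;
        f.rawRead((&len)[0 .. 1]);
        auto data = new S[cast(size_t) len];
        if (len)
            f.rawRead(data);
        return data;
    }

Such a dump is only readable by builds with the same struct layout and endianness, and any pointer-bearing arrays would need the index treatment discussed above before they can be written this way.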