On Sat, Feb 12, 2022 at 06:41:14PM +0000, Era Scarecrow via Digitalmars-d-learn wrote:
> On Thursday, 10 February 2022 at 01:43:54 UTC, H. S. Teoh wrote:
> > On Thu, Feb 10, 2022 at 01:32:00AM +0000, MichaelBi via
> > Digitalmars-d-learn wrote:
> > > thanks, very helpful! i am using a assocArray now...
> >
> > Are you sure that's what you need?
>
> Depends. If you do say TYPE[long/int] then you have effectively a
> sparse array; if there are a few entries (*say a hundred million or
> something*) it will probably be fine.
>
> Depending on what you're storing, say if it's a few bits per entry,
> you can probably use BitArray to store differing values. The 10^12
> would take up.... 119Gb? That won't work. Wonder how 25Gb was
> calculated.
[...]
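As a sanity check on the figures in the quote (the exact bits-per-entry assumed by the original poster aren't stated, so this is just back-of-envelope arithmetic, not their calculation), packing 10^12 entries at one or two bits apiece works out like this:

```python
# Back-of-envelope check of the quoted sizes (assumption: 10^12 entries,
# bit-packed the way D's std.bitmanip.BitArray would store them).
entries = 10**12

bytes_at_1_bit = entries / 8   # one bit per entry
bytes_at_2_bits = entries / 4  # two bits per entry

print(f"1 bit/entry : {bytes_at_1_bit / 2**30:.1f} GiB")   # ~116.4 GiB
print(f"2 bits/entry: {bytes_at_2_bits / 2**30:.1f} GiB")  # ~232.8 GiB
```

A single bit per entry already lands in the ~116 GiB range, which is in the same ballpark as the "119Gb" guess above; the 25Gb figure would require well under a bit per slot, i.e. some form of sparsity or compression.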
That was not my point. My point was to question whether the OP has discovered the insight that would allow him to accomplish his task with a LOT less space than the naïve approach of storing everything in a gigantic array. Substituting an AA for a gigantic array matters little as long as the basic approach remains the same -- you have only modified the implementation details, but the algorithm is still a (highly) suboptimal one.


--T
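To make the point concrete (a minimal sketch, not the OP's code, and in Python rather than D for brevity): an associative container only wins when most of the key space is never occupied, because you pay per *stored* entry rather than per possible slot. If every one of the 10^12 slots holds data, the AA is strictly worse than a flat array.

```python
# Sketch: a sparse store over a huge index space. Only slots that differ
# from the default value consume memory; the 10^12-slot "array" is implicit.
sparse = {}  # index -> value, for occupied slots only

def store(index, value, default=0):
    if value != default:
        sparse[index] = value        # cost is per occupied entry
    else:
        sparse.pop(index, None)      # storing the default frees the slot

def load(index, default=0):
    return sparse.get(index, default)

store(999_999_999_999, 7)            # far beyond any dense array you'd allocate
print(load(999_999_999_999))         # 7
print(load(123))                     # 0 -- unoccupied slots cost nothing
print(len(sparse))                   # 1 entry actually stored
```

The saving comes from the sparsity assumption, not from the container: if the data isn't sparse, the right move is to rethink the algorithm, which was the point above.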