Re: task parallelize dirEntries

2017-08-11 Thread Arun Chandrasekaran via Digitalmars-d-learn
On Friday, 11 August 2017 at 21:58:20 UTC, Johnson wrote: Just a thought, maybe the GC isn't cleaning up quickly enough? You are allocating an md5 digest each iteration. Possibly, an optimization is to use a collection of md5 hashes and reuse them. e.g., pre-allocate 100 (you probably only ne
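
A minimal sketch of that suggestion (not from the thread itself): one reusable MD5 context per pool worker, selected via taskPool.workerIndex, with each file streamed through byChunk so no whole-file buffer is allocated. The chunk size and work unit size here are illustrative choices, not values from the original post:

    import std.digest.md : MD5, toHexString;
    import std.file : dirEntries, SpanMode;
    import std.parallelism : parallel, taskPool;
    import std.stdio : File, writefln;

    void main()
    {
        // workerIndex is 0 for the main thread and 1 .. taskPool.size
        // for pool threads, so size + 1 contexts cover every thread.
        auto digests = new MD5[taskPool.size + 1];

        foreach (entry; parallel(dirEntries(".", SpanMode.depth), 1))
        {
            if (!entry.isFile)
                continue;
            auto md5 = &digests[taskPool.workerIndex];
            md5.start(); // reset the reused context instead of allocating
            foreach (chunk; File(entry.name).byChunk(64 * 1024))
                md5.put(chunk); // stream the file; no whole-file buffer
            writefln("%s: %s", entry.name, md5.finish().toHexString);
        }
    }

Each worker only ever touches its own context, so no synchronization is needed around the digest state itself.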

Re: task parallelize dirEntries

2017-08-11 Thread Johnson via Digitalmars-d-learn
On Friday, 11 August 2017 at 21:33:51 UTC, Arun Chandrasekaran wrote: I've modified the sample from tour.dlang.org to calculate the md5 digest of the files in a directory using std.parallelism. When I run this on a dir with a huge number of files, I get: core.exception.OutOfMemoryError@src/core/

Re: task parallelize dirEntries

2017-08-11 Thread Arun Chandrasekaran via Digitalmars-d-learn
On Friday, 11 August 2017 at 21:33:51 UTC, Arun Chandrasekaran wrote: I've modified the sample from tour.dlang.org to calculate the [...] RHEL 7.2 64-bit, dmd v2.075.0, ldc 1.1.0

task parallelize dirEntries

2017-08-11 Thread Arun Chandrasekaran via Digitalmars-d-learn
I've modified the sample from tour.dlang.org to calculate the md5 digest of the files in a directory using std.parallelism. When I run this on a dir with a huge number of files, I get: core.exception.OutOfMemoryError@src/core/exception.d(696): Memory allocation failed. Since dirEntries returns
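
The modified sample itself is truncated in this excerpt. An assumed reconstruction of the pattern described (parallel iteration over dirEntries, with a whole-file read and a fresh digest per iteration) might look like this; the directory and output format are illustrative:

    import std.digest.md : md5Of, toHexString;
    import std.file : dirEntries, read, SpanMode;
    import std.parallelism : parallel;
    import std.stdio : writefln;

    void main()
    {
        foreach (entry; parallel(dirEntries(".", SpanMode.depth)))
        {
            if (!entry.isFile)
                continue;
            // Allocates the whole file plus a digest result on every
            // iteration; on a huge tree the GC can fall behind.
            auto hash = md5Of(cast(const(ubyte)[]) read(entry.name));
            writefln("%s: %s", entry.name, hash.toHexString);
        }
    }

Each iteration allocates the entire file contents before hashing it, which is consistent with the OutOfMemoryError once the tree is large enough.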