On Friday, 11 August 2017 at 21:58:20 UTC, Johnson wrote:
Just a thought: maybe the GC isn't cleaning up quickly enough? You are allocating an md5 digest each iteration. A possible optimization is to use a collection of md5 hashes and reuse them, e.g. pre-allocate 100 (you probably only need [...]
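
A minimal sketch of that reuse idea, assuming std.digest.md's MD5 struct and File.byChunk with a caller-supplied buffer (the helper name and buffer handling are illustrative, not from the post):

import std.digest.md : MD5;
import std.stdio : File;

// Reuse one MD5 context and one fixed read buffer across files,
// instead of allocating fresh ones on every iteration.
ubyte[16] hashFile(ref MD5 ctx, ubyte[] buf, string path)
{
    ctx.start();                     // reset the reused context
    auto f = File(path, "rb");
    foreach (chunk; f.byChunk(buf))  // read into the caller's buffer
        ctx.put(chunk);
    return ctx.finish();
}

With std.parallelism, each worker would keep its own context/buffer pair, e.g. via taskPool.workerLocalStorage, so the loop body stops allocating per file.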
On Friday, 11 August 2017 at 21:33:51 UTC, Arun Chandrasekaran wrote:
I've modified the sample from tour.dlang.org to calculate the md5 digest of the files in a directory using std.parallelism. When I run this on a dir with a huge number of files, I get: core.exception.OutOfMemoryError@src/core/exception.d(696): Memory allocation failed
On Friday, 11 August 2017 at 21:33:51 UTC, Arun Chandrasekaran wrote:
I've modified the sample from tour.dlang.org to calculate the [...]
RHEL 7.2 64 bit
dmd v2.075.0
ldc 1.1.0
I've modified the sample from tour.dlang.org to calculate the md5 digest of the files in a directory using std.parallelism. When I run this on a dir with a huge number of files, I get:

core.exception.OutOfMemoryError@src/core/exception.d(696): Memory allocation failed

Since dirEntries returns [...]
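
For context, a minimal sketch of the kind of program described above, assuming the digest is taken over each file's contents; the original code isn't shown here, so the work-unit size and output format are illustrative:

import std.digest.md : md5Of, toHexString;
import std.file : SpanMode, dirEntries, read;
import std.parallelism : parallel;
import std.stdio : writefln;

void main(string[] args)
{
    // Walk the directory and hash every regular file in parallel.
    // read() allocates a fresh GC buffer for each file, which is
    // where the allocation pressure comes from on huge directories.
    foreach (entry; parallel(dirEntries(args[1], SpanMode.depth), 1))
    {
        if (!entry.isFile)
            continue;
        auto hash = md5Of(cast(const(ubyte)[]) read(entry.name));
        writefln("%s  %s", toHexString(hash), entry.name);
    }
}

Every read() call here produces a new GC allocation per file, which matches the allocation pattern Johnson points at above.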