On Saturday, 17 October 2015 at 09:30:47 UTC, Marco Leise wrote:
It is trivial to read into an allocated block when the file size is below a threshold. I would just need a rough size threshold. Are you talking about 4 KiB pages or megabytes? 64 KiB, maybe?
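The threshold idea could be sketched in C roughly as follows. The 64 KiB cutoff is just a guess to be tuned by benchmarking, and the function name is illustrative, not the library's actual API:

```c
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

#define MMAP_THRESHOLD (64 * 1024)  /* 64 KiB -- placeholder, tune by benchmark */

/* Load a file: small files are read() into a malloc'd buffer,
   larger ones are mmap'd. *mapped tells the caller how to free it
   (free() vs. munmap()). Returns NULL on error. */
char *load_file(const char *path, size_t *len, int *mapped)
{
    int fd = open(path, O_RDONLY);
    if (fd < 0)
        return NULL;

    struct stat st;
    if (fstat(fd, &st) < 0) {
        close(fd);
        return NULL;
    }
    *len = (size_t)st.st_size;

    char *buf;
    if (*len < MMAP_THRESHOLD) {
        /* Small file: one allocation, one read, NUL-terminate. */
        buf = malloc(*len + 1);
        if (buf && read(fd, buf, *len) == (ssize_t)*len) {
            buf[*len] = '\0';
        } else {
            free(buf);
            buf = NULL;
        }
        *mapped = 0;
    } else {
        /* Large file: map it read-only and let the kernel page it in. */
        buf = mmap(NULL, *len, PROT_READ, MAP_PRIVATE, fd, 0);
        if (buf == MAP_FAILED)
            buf = NULL;
        *mapped = 1;
    }
    close(fd);
    return buf;
}
```

The split keeps the syscall count low for small files (one `read` into a heap block) while avoiding a full copy for large ones.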

Maybe. I guess you could just focus on what you think are the primary usage patterns for your library and benchmark those with different parameter values.

If you want to test the processing of many small files combined with computationally/memory-intensive tasks, you could construct a simple benchmark that iterates over a memory region (M × the L3 cache size) in N threads using a "realistic" access pattern such as Brownian motion, while repeatedly and concurrently loading JSON files of different sizes. That way the CPU's page-table machinery is stressed by mmap, cache misses, and (possibly) page faults.
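Such a benchmark might be sketched in C along these lines. All sizes, step counts, and round counts below are illustrative placeholders; a real run would derive the arena size from the machine's actual L3 cache and use a realistic mix of JSON file sizes:

```c
#include <fcntl.h>
#include <pthread.h>
#include <stdatomic.h>
#include <stdlib.h>
#include <sys/mman.h>
#include <unistd.h>

/* Placeholder sizes -- a real benchmark would use M * (L3 cache size). */
#define ARENA_SIZE  (8u << 20)   /* 8 MiB arena */
#define WALK_STEPS  100000
#define LOAD_ROUNDS 50

static unsigned char arena[ARENA_SIZE];
static atomic_ulong sink;        /* accumulator so work isn't optimized away */

/* "Brownian motion" walker: each step moves by a small signed random
   offset, giving poor spatial locality and frequent cache misses. */
static void *walker(void *arg)
{
    unsigned seed = (unsigned)(size_t)arg;
    size_t pos = ARENA_SIZE / 2;
    unsigned long acc = 0;
    for (int i = 0; i < WALK_STEPS; i++) {
        long delta = (long)(rand_r(&seed) % 8192) - 4096;
        pos = (pos + ARENA_SIZE + (size_t)delta) % ARENA_SIZE;
        acc += arena[pos]++;
    }
    atomic_fetch_add(&sink, acc);
    return NULL;
}

/* Loader: repeatedly mmap a file and touch one byte per page,
   forcing page faults and TLB pressure alongside the walkers. */
static void *loader(void *arg)
{
    char *path = arg;
    for (int r = 0; r < LOAD_ROUNDS; r++) {
        int fd = open(path, O_RDONLY);
        if (fd < 0)
            break;
        off_t len = lseek(fd, 0, SEEK_END);
        char *p = mmap(NULL, (size_t)len, PROT_READ, MAP_PRIVATE, fd, 0);
        if (p != MAP_FAILED) {
            unsigned long acc = 0;
            for (off_t i = 0; i < len; i += 4096)
                acc += (unsigned char)p[i];
            atomic_fetch_add(&sink, acc);
            munmap(p, (size_t)len);
        }
        close(fd);
    }
    return NULL;
}

/* Run N walker threads concurrently with one file-loader thread. */
static unsigned long run_stress(char *json_path, int nthreads)
{
    pthread_t w[8], l;
    if (nthreads > 8)
        nthreads = 8;
    for (int i = 0; i < nthreads; i++)
        pthread_create(&w[i], NULL, walker, (void *)(size_t)(i + 1));
    pthread_create(&l, NULL, loader, json_path);
    for (int i = 0; i < nthreads; i++)
        pthread_join(w[i], NULL);
    pthread_join(l, NULL);
    return atomic_load(&sink);
}
```

Timing `run_stress` for several (threshold, file size, thread count) combinations would show where mmap starts paying off over plain reads under memory pressure.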
