On Monday, 24 September 2018 at 14:31:45 UTC, Steven
Schveighoffer wrote:
> It's not scanning the blocks. But it is scanning the stack.
> Each time you are increasing the space it must search for a
> given *target*. It also must *collect* any previous items at
> the end of the scan. Note that a collection is going to mark
> every single page and bitset that is contained in the item
> being collected (which gets increasingly larger).

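For context, the pattern I have in mind is roughly the following
(just my assumed sketch of the append loop, with made-up sizes):

```d
import std.stdio : writeln;

void main()
{
    // One ever-growing array: each time the block has to grow, the GC
    // may run a collection that scans the roots and marks the (by now
    // very large) block, which seems to be what dominates the cost.
    byte[] data;
    foreach (i; 0 .. 50_000_000)
        data ~= cast(byte) i;
    writeln(data.length);
}
```
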
Is this because of the (potentially many) slices referencing
this large block?

I assume the GC doesn't scan the `byte` array for pointer values
in this case, but that it does scan `void` arrays and
class/pointer arrays, right?

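In other words, I'd expect something like the check below to show
the difference (assuming `GC.getAttr` reflects the attributes the
runtime actually set on the blocks):

```d
import core.memory : GC;
import std.stdio : writeln;

void main()
{
    // byte[] holds no pointers, so I'd expect its block to carry
    // BlkAttr.NO_SCAN; Object[] holds references, so its block should
    // be scanned for pointers during a collection.
    auto bytes = new byte[](1024);
    auto objs  = new Object[](1024);

    writeln("byte[]   NO_SCAN: ", (GC.getAttr(bytes.ptr) & GC.BlkAttr.NO_SCAN) != 0);
    writeln("Object[] NO_SCAN: ", (GC.getAttr(objs.ptr)  & GC.BlkAttr.NO_SCAN) != 0);
}
```
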
Couldn't that scan be optimized by adding a bitset that indicates
which pages need to be scanned?
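
Something along these lines, purely as a sketch of the idea and not
of how druntime's GC is actually organized (`PageScanBits` is a
hypothetical name):

```d
// One bit per page of a large block: set when the page may still hold
// pointers that need scanning, clear when the page can be skipped.
struct PageScanBits
{
    enum pageSize = 4096;
    enum bitsPerWord = 8 * size_t.sizeof;
    private size_t[] words;

    this(size_t blockSize)
    {
        immutable pages = (blockSize + pageSize - 1) / pageSize;
        words = new size_t[](pages / bitsPerWord + 1);
    }

    void markDirty(size_t page)
    {
        words[page / bitsPerWord] |= size_t(1) << (page % bitsPerWord);
    }

    bool needsScan(size_t page) const
    {
        return ((words[page / bitsPerWord] >> (page % bitsPerWord)) & 1) != 0;
    }
}
```
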
Is it common for GCs to treat large objects in this way?