Hi,
Is there a way with bigforth to reserve (and take advantage of) more
than 2 Gigabytes of memory for the heap?
> free -m
             total       used       free     shared    buffers     cached
Mem:         16052      15962         90          0         44      15708
-/+ buffers/cache:        209      15843
Swap:        15257          2      15254
> bigforth -m 3G
Running out of Memory!
> bigforth -m 4G
Segmentation fault
But "bigforth -m 10G" starts, although it reserves only 2G :)
I understand bigforth is only a 32-bit system, but then 4G of address
space should still be available. Is there a workaround that would let
me use more memory? I am working with particle filters and need to
store tons of them in memory (disk is prohibitively slow) to compute
statistics on them.
Another option seems to be moving to gforth, which should support all
the memory a 64-bit system provides. But is there a simple way to
convert my bigforth-specific library bindings to gforth? Callbacks and
everything... I doubt it :(
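For reference, a simple gforth binding via its libcc C interface
(available since gforth 0.7) looks roughly like the sketch below; the
library name "mathlib" and the Forth name "c-sin" are just illustrative,
and callbacks would need more than this:

```forth
\ Minimal sketch of a gforth libcc binding, assuming gforth 0.7+
\ built with libcc support. "mathlib" / "c-sin" are arbitrary names.
c-library mathlib
\c #include <math.h>
c-function c-sin sin r -- r   \ binds double sin(double)
end-c-library

\ usage:  1e c-sin f.   \ prints sin(1.0)
```

The stack-effect letters (n for cell, r for float, a for address)
describe the C prototype, so each bigforth binding would have to be
restated in this form by hand.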
Thanks!
--
Sergey