It is kind of hard to tell what is going on here, because you're trying
different things and causing yourself different problems in the process.

If you are opening a lot of H2 databases, you may need to reduce the cache
size, since each open database will have its own cache.
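For example, you can request a smaller cache per database directly in the JDBC URL; CACHE_SIZE is given in KB. This is just a sketch (the path "~/data/db1" is only an illustration):

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class CacheSizeExample {
        public static void main(String[] args) throws Exception {
            // CACHE_SIZE is in KB, so 16384 is roughly a 16 MB cache.
            // The path "~/data/db1" is hypothetical.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:h2:~/data/db1;CACHE_SIZE=16384", "sa", "")) {
                // ... run queries as usual ...
            }
        }
    }

The same setting can also be changed on an open database with the SQL statement SET CACHE_SIZE.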

On Wed, 23 Dec 2020 at 23:47, Gerlits András <andras.gerl...@gmail.com>
wrote:

> unless I used "nioMapped". And even then, when I open any one of these
> databases, they are always read into memory fully, even if I put "file" in
> the URL. So, not just in my software, but if I open a split database of 2
> gigs in a SQL console application like squirrel, it grows the amount of
> used memory by 2 gigabytes or throws an out of memory error.
>
>
"nioMapped" means using mmap() to map the file space into the memory space,
so it doesn't actually "grow" the memory usage, but it does use up virtual
memory space. You should ideally be monitoring the working set size, not
the vmsize.
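To illustrate the difference (the paths are hypothetical), only the "nioMapped:" prefix maps the file with mmap(); the plain "file:" prefix, or no prefix at all, goes through regular file reads and does not inflate the virtual size:

    jdbc:h2:nioMapped:~/data/bigdb   (file is mmap()ed; virtual size grows, pages become resident only when touched)
    jdbc:h2:file:~/data/bigdb        (regular file access; no mapping)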

You might be seeing an out of memory error because you are trying to mmap()
a large database into a 32-bit process, and running out of virtual memory.
Try running a 64-bit JRE.
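A quick way to check which JVM you are actually running (a sketch; "sun.arch.data.model" is a HotSpot-specific property, "os.arch" is standard):

    public class JvmBitness {
        public static void main(String[] args) {
            // Prints "64" on a 64-bit HotSpot JVM, "32" on a 32-bit one.
            System.out.println(System.getProperty("sun.arch.data.model"));
            // e.g. "amd64" on a 64-bit JRE vs "x86" on a 32-bit one.
            System.out.println(System.getProperty("os.arch"));
        }
    }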
