Thanks Peter & Paul.

There are only ~5000 rows in my table, so I shouldn't be hitting the
MAX_MEMORY_ROWS limit, but it's good to know about and I'll check it
out anyway.

I'll post the EXPLAIN output early next week, once I have access to
the database again.

Thanks,
Emily


On Aug 26, 2:44 pm, Peter Yuill <[email protected]> wrote:
> > One thing I have noticed, which might be biting you, is that if the
> > select inside the IN clause produces more than 10000 results, the
> > results are stored on disk rather than in memory. In my testing, an
> > IN clause with 9999 elements returned in <1 s, while one with 10000
> > elements took about 50 seconds.
>
> > There is a parameter that you can use to increase that 10000 limit,
> > but I cannot recall what it was, sorry.
>
> Well spotted.
>
>       SET MAX_MEMORY_ROWS
>
> SET MAX_MEMORY_ROWS int <http://www.h2database.com/html/grammar.html#int>
>
> The maximum number of rows in a result set that are kept in-memory. If
> more rows are read, then the rows are buffered to disk. The default
> value is 10000.
>
> Admin rights are required to execute this command, as it affects all
> connections. This command commits an open transaction. This setting is
> persistent. It has no effect for in-memory databases.
>
> Example:
>
> SET MAX_MEMORY_ROWS 1000
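>
> To check the current value before changing it, you can query
> INFORMATION_SCHEMA.SETTINGS; note the column names below assume the
> H2 1.x schema (NAME and VALUE), which may differ in other versions:
>
> SELECT VALUE FROM INFORMATION_SCHEMA.SETTINGS
>   WHERE NAME = 'MAX_MEMORY_ROWS';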
>
> Regards,
> Peter
