Hi,

the company I work for is running some tests on Phoenix with NodeJS. For
simple queries I didn't have any problems, but as soon as I start to use our
app I get "process out of memory" errors on the client when I run queries
that return a large number of rows (e.g. 400k). I think the problem is that
the client tries to buffer all the results in RAM, which kills it. The same
query runs fine when I run it with sqlline.

So, is there a way to tell the client to stream the results (or fetch them
in batches) instead of buffering them all? Is raising the client's memory
limit the only solution?
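In case it helps frame the question: one batching workaround I've been considering is keyset pagination, since Phoenix supports LIMIT and filtering on the primary key. Below is a minimal sketch of the idea; the `runQuery` function is a hypothetical stand-in for the actual driver call (I don't know phoenix-client's exact API), mocked here with an in-memory table so the loop logic is clear.

```javascript
// Sketch of keyset pagination: fetch rows in fixed-size batches instead of
// buffering the whole result set, so only one batch is in memory at a time.

// In-memory stand-in for a table with a monotonically increasing primary key.
const TABLE = Array.from({ length: 10 }, (_, i) => ({ id: i + 1, val: 'row' + (i + 1) }));

// Hypothetical stand-in for the real driver call. It mimics:
//   SELECT id, val FROM t WHERE id > ? ORDER BY id LIMIT ?
function runQuery(lastId, limit) {
  return TABLE.filter(r => r.id > lastId).slice(0, limit);
}

// Generator that yields one batch at a time, resuming after the last seen key.
function* batches(batchSize) {
  let lastId = 0;
  for (;;) {
    const rows = runQuery(lastId, batchSize);
    if (rows.length === 0) return;      // no more rows: stop
    yield rows;
    lastId = rows[rows.length - 1].id;  // keyset: continue past the last key
  }
}

// Process each batch, letting previous batches be garbage-collected.
let count = 0;
for (const batch of batches(3)) {
  count += batch.length;
}
console.log(count);
```

With a real driver the mock would be replaced by an actual parameterized query per batch; the key point is that the client never holds more than `batchSize` rows at once.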

I'm using phoenix-4.3.1 and https://github.com/gaodazhu/phoenix-client as
the NodeJS driver.

Thanks,

Isart Montane
