On Thu, Nov 30, 2023 at 8:41 AM Sverre Aleksandersen <
sverre.aleksander...@gmail.com> wrote:

> I'm running the app in Kubernetes and get OOMKilled because memory
> consumption goes through the roof on high throughput, so it's not premature
> optimisation.
> I want to be able to read the blobs using a buffer so that my container
> does not get killed. I see two solutions: (a) read the blob using a buffer
> and write it to disk, or (b) read the blob using a buffer and stream it out
> of the app over HTTP. I'm currently working on solution (a) since this is
> the simplest and will probably be sufficient.
>
> If I interpret the code in PgResultSet correctly, it seems like it reads
> the entire BLOB into memory regardless?
>

Sorry for the confusion, that's what I meant by "premature optimisation":
pgjdbc simply doesn't work that way. It doesn't implement the standard JDBC
Blob APIs the way one would expect.

Here's a resource on the alternative, pgjdbc-specific API that allows for
accessing BLOBs:
https://jdbc.postgresql.org/documentation/binary-data

I guess you could just pass around pgjdbc's LargeObject (which works like
java.sql.Blob without implementing it...) and use that to stream your data
both ways.
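
Just as a rough sketch of what solution (a) could look like with that API
(table and column names are made up, and I haven't compiled this): you get
the LargeObjectManager from the underlying PGConnection, open the large
object by its oid inside a transaction, and copy its InputStream to a file
in buffered chunks instead of materialising a byte[]:

import org.postgresql.PGConnection;
import org.postgresql.largeobject.LargeObject;
import org.postgresql.largeobject.LargeObjectManager;

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class BlobToDisk {

    // Streams a large object to a file without loading it into memory.
    // "documents", "content_oid", and "id" are placeholder names.
    static void copyBlobToFile(Connection conn, long documentId, Path target)
    throws Exception {

        // Large object access must happen inside a transaction
        conn.setAutoCommit(false);

        LargeObjectManager lobApi =
            conn.unwrap(PGConnection.class).getLargeObjectAPI();

        try (PreparedStatement ps = conn.prepareStatement(
                 "SELECT content_oid FROM documents WHERE id = ?")) {
            ps.setLong(1, documentId);

            try (ResultSet rs = ps.executeQuery()) {
                if (rs.next()) {
                    long oid = rs.getLong(1);

                    // Open read-only and copy in buffered chunks
                    LargeObject lob = lobApi.open(oid, LargeObjectManager.READ);
                    try (InputStream in = lob.getInputStream()) {
                        Files.copy(in, target);
                    }
                    finally {
                        lob.close();
                    }
                }
            }
        }

        conn.commit();
    }
}

The important bits are that autocommit has to be off for the duration of the
large object access, and that the data only ever flows through the
InputStream, so memory usage stays flat regardless of blob size. If you're
on jOOQ, you can get at the Connection via ConnectionProvider or
DSLContext.connection() and do the same thing there.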
