Hi

On Thu, 17 Aug 2023 at 16:48, Karsten Hilbert <karsten.hilb...@gmx.net> wrote:
> > I also used PostgreSQL Large Objects, following this link, to store and
> > retrieve large files (since bytea was not working):
> > https://www.postgresql.org/docs/current/largeobjects.html
> >
> > But even now I am unable to fetch the data in one step from large objects:
> >
> >     SELECT lo_get(oid);
> >
> > Here I get the same error message.
> >
> > But if I use
> >
> >     SELECT data FROM pg_largeobject WHERE loid = 49374;
> >
> > then I can fetch the data, but page-wise (the data is split into rows of
> > 2 KB each).
> >
> > So how can I fetch the data in a single step, rather than page by page,
> > without any error?

SQL values are limited to 1 GB, so you cannot fetch a larger object in one
call. You should use the \lo_import or \lo_export psql commands, or the
dedicated client API:
https://www.postgresql.org/docs/current/lo-interfaces.html

regards

Pavel

> > And I'm just wondering: how do many applications store huge amounts of
> > data, in the range of GBs? I know there is a 1 GB limit per field set by
> > PostgreSQL. If so, how does one deal with these kinds of situations? I
> > would like to know, in order to handle real-world scenarios.
>
> https://github.com/lzlabs/pg_dumpbinary/blob/master/README.md
> might be of help
>
> Karsten
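Since `SELECT data FROM pg_largeobject WHERE loid = ...` returns the object
as 2 KB pages, one pragmatic workaround is to reassemble the pages on the
client. Below is a minimal sketch of that reassembly logic; the function
name `assemble_pages` is my own, and it assumes the default server
`LOBLKSIZE` of 2048 bytes (BLCKSZ/4). Note that pages absent from
`pg_largeobject` are "holes" that read back as zeros, which the sketch
accounts for:

```python
# Sketch: reassemble a large object from pg_largeobject pages client-side.
# Assumes the default LOBLKSIZE of 2048 bytes; pages missing from the
# catalog are holes and must be filled with zero bytes.

LOBLKSIZE = 2048

def assemble_pages(rows):
    """rows: iterable of (pageno, data) tuples, e.g. the result of
    SELECT pageno, data FROM pg_largeobject WHERE loid = %s ORDER BY pageno.
    Returns the object's full contents as bytes."""
    chunks = []
    expected = 0
    for pageno, data in sorted(rows):
        # Fill any hole (absent pages) with zero bytes.
        if pageno > expected:
            chunks.append(b"\x00" * ((pageno - expected) * LOBLKSIZE))
        chunks.append(bytes(data))
        expected = pageno + 1
    return b"".join(chunks)
```

For anything beyond a quick workaround, though, the supported route is the
large-object client API Pavel mentions (lo_open/loread, exposed e.g. as
`connection.lobject()` in psycopg2), which streams the data in chunks of
whatever size you choose and avoids the 1 GB per-value limit entirely.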