Yes, I know it can be done with Json, too. I was particularly interested in whether it can be controlled in the psycopg2 module, because the serialization to an ARRAY[...] string is performed by the module, and in this particular case an explicit specification of types is required, but it seems the module does not provide any control over it... Thanks anyway!
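
For completeness, the element types can still be forced from the SQL side with explicit ::type[] casts, along these lines (a rough sketch only; the connection string is a placeholder, and psycopg2 itself still decides how the lists are serialized to ARRAY[...]):

from datetime import datetime

import psycopg2

# Placeholder connection string -- adjust for your environment.
conn = psycopg2.connect("dbname=test")
cur = conn.cursor()

# psycopg2 serializes each Python list to an ARRAY[...] literal;
# the explicit ::type[] casts in the SQL text supply the element types
# that unnest() cannot infer on its own.
query = """
    select *
    from unnest(%s::timestamp[], %s::varchar[], %s::bigint[]) as t(a, b, c)
"""
cur.execute(query, ([datetime.now(), datetime.now()],
                    ['abc', 'xyz'],
                    [100, 200]))
print(cur.fetchall())
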
--
Regards,
Vladimir

Wed, Dec 4, 2019 at 10:47, Adrian Klaver <adrian.kla...@aklaver.com>:

> On 12/4/19 10:44 AM, Adrian Klaver wrote:
> > On 12/3/19 4:00 PM, Vladimir Ryabtsev wrote:
> >> It does not change much, the error message just says it now expects
> >> 'character varying'.
> >> Anyway bigint problem is the next in the line.
> >
> > Yeah it is related to this:
> >
> > https://www.postgresql.org/docs/11/functions-array.html
> > unnest(anyarray) setof anyelement
> >
> > Dealing with anyelement can be tricky.
> >
> > I tried out an alternate solution:
> >
> > json_query = "select * from json_to_recordset(%s) as t(a timestamp, b
> > varchar, c bigint)"
> >
> > cur.execute(json_query,[Json([{'a': datetime.now().isoformat(), 'b':
> > 'abc', 'c': 100},{'a': datetime.now().isoformat(), 'b': 'xyz', 'c':
> > 200}])])
>
> Should have added that Json comes from:
>
> from psycopg2.extras import Json
>
> http://initd.org/psycopg/docs/extras.html?highlight=json#json-adaptation
>
> >
> > Which results in:
> >
> > test_(postgres)# select * from json_to_recordset(E'[{"a":
> > "2019-12-04T10:40:22.823910", "b": "abc", "c": 100}, {"a":
> > "2019-12-04T10:40:22.823918", "b": "xyz", "c": 200}]') as t(a timestamp,
> > b varchar, c bigint)
> > ;
> >               a              |  b  |  c
> > ----------------------------+-----+-----
> >  2019-12-04 10:40:22.82391  | abc | 100
> >  2019-12-04 10:40:22.823918 | xyz | 200
> > (2 rows)
> >
> >>
> >> --
> >> Regards,
> >> Vladimir
>
> --
> Adrian Klaver
> adrian.kla...@aklaver.com
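
P.S. For the archives, here is the quoted json_to_recordset example pulled together into one self-contained snippet; the connection string is a placeholder, the rest is taken from the messages above:

from datetime import datetime

import psycopg2
from psycopg2.extras import Json

# Placeholder connection string -- adjust for your environment.
conn = psycopg2.connect("dbname=test")
cur = conn.cursor()

# Column types are declared in the AS clause, so no array casts are needed;
# Json() adapts the list of dicts to a single json parameter.
json_query = ("select * from json_to_recordset(%s)"
              " as t(a timestamp, b varchar, c bigint)")

cur.execute(json_query, [Json([
    {'a': datetime.now().isoformat(), 'b': 'abc', 'c': 100},
    {'a': datetime.now().isoformat(), 'b': 'xyz', 'c': 200},
])])
print(cur.fetchall())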