PetarVasiljevic-DB commented on PR #48625:
URL: https://github.com/apache/spark/pull/48625#issuecomment-2469976708

   I see, and it makes sense. Thanks @milastdbx. @yaooqinn, may I suggest that we 
at least fall back to a value of 1 if we read 0 from the metadata? We expect to 
read an ArrayType, so the dimensionality of the array should be at least 1. 
Something like this:
   
   ```scala
   metadata.putLong("arrayDimension", Math.max(1, rs.getLong(1)))
   ```
   
   It still doesn't cover all arrays, but I expect that most users work with 1D 
arrays, so this would cover most cases. It is also stable.
   
   And I see it as a pure improvement, not over the current PG dialect code but 
over the previous one, where we read all PG arrays (both non-CTAS and CTAS) as 
1D. With this change, we would support higher-dimensional arrays for non-CTAS 
tables, while for CTAS tables we would still support only 1D arrays.
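   To illustrate the proposed fallback in isolation, here is a minimal sketch. The class and method names are hypothetical, not the actual PG dialect code; the real change would apply the clamp directly to the value read from the JDBC `ResultSet`, as in the snippet above:

   ```java
   // Hypothetical sketch: clamp an array dimensionality read from metadata
   // to a minimum of 1. PostgreSQL can report 0 when no dimensionality was
   // recorded (e.g. for CTAS-created columns), but an ArrayType column is
   // at least 1-dimensional.
   public class ArrayDimensionFallback {
       static long effectiveDimension(long reportedDims) {
           // 0 means "unknown"; assume a 1D array in that case.
           return Math.max(1L, reportedDims);
       }

       public static void main(String[] args) {
           System.out.println(effectiveDimension(0L)); // unknown -> falls back to 1
           System.out.println(effectiveDimension(2L)); // known value is preserved
       }
   }
   ```

   The point of the `Math.max` is that a known dimensionality greater than 1 passes through unchanged; only the degenerate 0 case is rewritten.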
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

