birschick-bq commented on PR #1858:
URL: https://github.com/apache/arrow-adbc/pull/1858#issuecomment-2109229539

   > Thanks for the change! I don't understand how decimal types are even usable if we don't know the scale of the result. Are there known circumstances where that's true?
   > 
   > How were 10 and 0 chosen as the defaults for precision and scale?
   
   These changes affect AdbcConnection.GetTableSchema, not ExecuteQuery/ExecuteUpdate. Statement results have separate metadata that DOES include the [precision and scale](https://github.com/apache/arrow-adbc/blob/4d203b88951c3e947d3a7df1200759f328747221/csharp/src/Drivers/Apache/Thrift/SchemaParser.cs#L66). (Unfortunately, statement result metadata does not include metadata for complex types.)
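   For context, here is a rough sketch of the two code paths, using the public ADBC C# surface; the table/query names are made up for illustration, and the exact parameter names may differ from the shipped API:
   
```csharp
// Path 1: AdbcConnection.GetTableSchema — the path these changes affect.
// Precision/scale must be inferred (or defaulted) from catalog metadata.
Apache.Arrow.Schema tableSchema = connection.GetTableSchema(null, "default", "my_table");

// Path 2: statement execution — the Thrift result-set metadata already
// carries precision and scale, which SchemaParser.cs maps onto the Arrow type.
using var statement = connection.CreateStatement();
statement.SqlQuery = "SELECT price FROM my_table";
var result = statement.ExecuteQuery();
Apache.Arrow.Schema resultSchema = result.Stream.Schema;
```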
   
   The defaults were chosen from https://docs.databricks.com/en/sql/language-manual/data-types/decimal-type.html#syntax
   
   ```
            // { DECIMAL | DEC | NUMERIC } [ (  p [ , s ] ) ]
            // p: Optional maximum precision (total number of digits) of the number between 1 and 38. The default is 10.
            // s: Optional scale of the number between 0 and p. The number of digits to the right of the decimal point. The default is 0.
   ```
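
   So when the catalog reports only `DECIMAL` with no parameters, the driver falls back to precision 10 and scale 0. A minimal sketch of that fallback (the regex and helper name here are hypothetical, not the driver's actual code):

```csharp
using System.Text.RegularExpressions;
using Apache.Arrow.Types;

// Hypothetical helper: map a Databricks type string to an Arrow decimal type,
// defaulting to DECIMAL(10, 0) per the Databricks documentation above.
static Decimal128Type ParseDecimal(string typeName)
{
    var match = Regex.Match(
        typeName,
        @"(?:DECIMAL|DEC|NUMERIC)\s*\(\s*(\d+)\s*(?:,\s*(\d+)\s*)?\)",
        RegexOptions.IgnoreCase);

    int precision = match.Success ? int.Parse(match.Groups[1].Value) : 10;                        // default precision
    int scale = match.Success && match.Groups[2].Success ? int.Parse(match.Groups[2].Value) : 0;  // default scale
    return new Decimal128Type(precision, scale);
}

// ParseDecimal("DECIMAL(38, 6)") -> Decimal128Type(38, 6)
// ParseDecimal("DECIMAL")        -> Decimal128Type(10, 0)
```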

