CurtHagenlocher commented on code in PR #2152:
URL: https://github.com/apache/arrow-adbc/pull/2152#discussion_r1775698569
##########
csharp/test/Drivers/Apache/Spark/NumericValueTests.cs:
##########
@@ -261,7 +262,8 @@ public async Task TestFloatValuesInsertSelectDelete(float value)
         string valueString = ConvertFloatToString(value);
         await InsertSingleValueAsync(table.TableName, columnName, valueString);
         object doubleValue = (double)value;
-        object floatValue = TestEnvironment.GetValueForProtocolVersion(doubleValue, value)!;
+        // Spark over HTTP returns float as double whereas Spark on Databricks returns float.
Review Comment:
On MSSQL and many other relational databases, a 32-bit floating-point value is REAL and a 64-bit floating-point value is FLOAT. I don't know how Spark defines it, but if a Spark FLOAT is a 32-bit floating-point value, then I think we should do the conversion from double -> float.
(This can obviously be a followup.)
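
For illustration, here is a minimal sketch of the suggested follow-up, assuming the HTTP path delivers a Spark FLOAT column as a CLR double. The helper name `SparkFloatHelper.NarrowToFloat` is hypothetical and not part of the driver or this PR:

```csharp
// Illustrative only: a hypothetical helper showing the proposed double -> float
// narrowing suggested in this review; not part of the driver or the test code.
internal static class SparkFloatHelper
{
    // If the server delivered a FLOAT column as a CLR double, narrow it back to
    // float so the test compares 32-bit values; otherwise pass the value through.
    public static object NarrowToFloat(object value) =>
        value is double d ? (float)d : value;
}
```

Since the double originated from a 32-bit float, the narrowing round-trips without loss, so the test comparison would use the same 32-bit value regardless of which transport the server used.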