s1ck commented on a change in pull request #24107: [SPARK-27174][SQL] Add support for casting integer types to binary
URL: https://github.com/apache/spark/pull/24107#discussion_r268116123
##########
File path: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/NumberConverterSuite.scala
##########
@@ -37,4 +40,47 @@ class NumberConverterSuite extends SparkFunSuite {
checkConv("11abc", 10, 16, "B")
}
+ test("byte to binary") {
+ checkToBinary(0.toByte)
+ checkToBinary(1.toByte)
+ checkToBinary(-1.toByte)
+ checkToBinary(Byte.MaxValue)
+ checkToBinary(Byte.MinValue)
+ }
+
+ test("short to binary") {
+ checkToBinary(0.toShort)
+ checkToBinary(1.toShort)
+ checkToBinary(-1.toShort)
+ checkToBinary(Short.MaxValue)
+ checkToBinary(Short.MinValue)
+ }
+
+ test("integer to binary") {
+ checkToBinary(0)
+ checkToBinary(1)
+ checkToBinary(-1)
+ checkToBinary(Int.MaxValue)
+ checkToBinary(Int.MinValue)
+ }
+
+ test("long to binary") {
+ checkToBinary(0L)
+ checkToBinary(1L)
+ checkToBinary(-1L)
+ checkToBinary(Long.MaxValue)
+ checkToBinary(Long.MinValue)
+ }
+
+ def checkToBinary[T](in: T): Unit = in match {
+ case b: Byte =>
+ assert(toBinary(b) ===
ByteBuffer.allocate(1).order(BIG_ENDIAN).put(b).array())
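(The quoted diff cuts off after the Byte case. The remaining match arms
presumably mirror it with the corresponding buffer widths; a hedged
reconstruction for readability, not the verbatim PR code:

    def checkToBinary[T](in: T): Unit = in match {
      case b: Byte =>
        assert(toBinary(b) ===
          ByteBuffer.allocate(1).order(BIG_ENDIAN).put(b).array())
      case s: Short =>
        assert(toBinary(s) ===
          ByteBuffer.allocate(2).order(BIG_ENDIAN).putShort(s).array())
      case i: Int =>
        assert(toBinary(i) ===
          ByteBuffer.allocate(4).order(BIG_ENDIAN).putInt(i).array())
      case l: Long =>
        assert(toBinary(l) ===
          ByteBuffer.allocate(8).order(BIG_ENDIAN).putLong(l).array())
    }
)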
Review comment:
As mentioned above, casting to binary is not specified in the SQL
standard, and I couldn't figure out from the vendor docs how other
databases handle it. Imho, going big-endian here (the Java and network
byte-order standard) is fine for an unspecified feature.
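
For concreteness, here is what the two byte orders look like for an Int
via java.nio.ByteBuffer (an illustrative sketch only; the actual
NumberConverter implementation may produce the bytes differently):

    import java.nio.ByteBuffer
    import java.nio.ByteOrder.{BIG_ENDIAN, LITTLE_ENDIAN}

    // Big endian (network byte order): most significant byte first.
    ByteBuffer.allocate(4).order(BIG_ENDIAN).putInt(1).array()
    // => Array(0, 0, 0, 1)

    // Little endian flips the layout, which is why pinning the order
    // explicitly in both the implementation and the test matters.
    ByteBuffer.allocate(4).order(LITTLE_ENDIAN).putInt(1).array()
    // => Array(1, 0, 0, 0)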