meszibalu commented on a change in pull request #39: HBASE-22711 Spark connector doesn't use the given mapping when inserting data
URL: https://github.com/apache/hbase-connectors/pull/39#discussion_r305807875
##########
File path:
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/datasources/Utils.scala
##########
@@ -73,28 +74,19 @@ object Utils {
val record = field.catalystToAvro(input)
AvroSerdes.serialize(record, field.schema.get)
} else {
- input match {
- case data: Boolean => Bytes.toBytes(data)
- case data: Byte => Array(data)
- case data: Array[Byte] => data
- case data: Double => Bytes.toBytes(data)
- case data: Float => Bytes.toBytes(data)
- case data: Int => Bytes.toBytes(data)
- case data: Long => Bytes.toBytes(data)
- case data: Short => Bytes.toBytes(data)
- case data: UTF8String => data.getBytes
- case data: String => Bytes.toBytes(data)
- // TODO: add more data type support
+ field.dt match {
+ case BooleanType => Bytes.toBytes(input.asInstanceOf[Boolean])
+ case ByteType => Array(input.asInstanceOf[Number].byteValue)
+ case ShortType => Bytes.toBytes(input.asInstanceOf[Number].shortValue)
+ case IntegerType => Bytes.toBytes(input.asInstanceOf[Number].intValue)
+ case LongType => Bytes.toBytes(input.asInstanceOf[Number].longValue)
+ case FloatType => Bytes.toBytes(input.asInstanceOf[Number].floatValue)
+ case DoubleType => Bytes.toBytes(input.asInstanceOf[Number].doubleValue)
+ case DateType | TimestampType => Bytes.toBytes(input.asInstanceOf[java.util.Date].getTime)
Review comment:
In `java.sql` there are two different types: `Date` and `Timestamp`. Both of
them are subclasses of `java.util.Date`, so they can be treated the same during
serialization, but they should be deserialized differently.
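
Not part of the patch, just a minimal Scala sketch of the point above, assuming the value is written as the epoch-millis long returned by `getTime`; the helper names and the `DataType`-based dispatch are hypothetical, not the connector's actual API:

```scala
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.sql.types.{DataType, DateType, TimestampType}

// Serialization: java.sql.Date and java.sql.Timestamp both extend
// java.util.Date, so a single branch can write the epoch millis.
def serializeDateLike(input: Any): Array[Byte] =
  Bytes.toBytes(input.asInstanceOf[java.util.Date].getTime)

// Deserialization: the stored long has to come back as the concrete
// type the catalog declares, so the two cases must differ.
def deserializeDateLike(dt: DataType, bytes: Array[Byte]): Any = dt match {
  case DateType      => new java.sql.Date(Bytes.toLong(bytes))
  case TimestampType => new java.sql.Timestamp(Bytes.toLong(bytes))
}
```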
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services