Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22037#discussion_r209129230
  
    --- Diff: external/avro/src/main/scala/org/apache/spark/sql/avro/AvroDeserializer.scala ---
    @@ -138,10 +142,21 @@ class AvroDeserializer(rootAvroType: Schema, rootCatalystType: DataType) {
                 bytes
               case b: Array[Byte] => b
              case other => throw new RuntimeException(s"$other is not a valid avro binary.")
    -
             }
             updater.set(ordinal, bytes)
     
    +      case (FIXED, d: DecimalType) => (updater, ordinal, value) =>
     +        val bigDecimal = decimalConversions.fromFixed(value.asInstanceOf[GenericFixed], avroType,
    +          LogicalTypes.decimal(d.precision, d.scale))
    --- End diff --
    
    Parquet can convert binary to an unscaled long directly. Shall we follow the same approach here?
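    
    For reference, a minimal sketch of that approach (illustrative only, not the actual Spark code;
    the only APIs assumed are `GenericFixed.bytes()` and `Decimal(unscaled, precision, scale)`):
    read the big-endian two's-complement bytes of the FIXED value into an unscaled Long and build
    the Decimal from it, skipping the intermediate java.math.BigDecimal. This only works while the
    unscaled value fits in a Long (Parquet takes this path for precision <= 18).
    
        import org.apache.avro.generic.GenericFixed
        import org.apache.spark.sql.types.Decimal
        
        // Decode big-endian two's-complement bytes into an unscaled Long.
        def binaryToUnscaledLong(bytes: Array[Byte]): Long = {
          var unscaled = 0L
          var i = 0
          while (i < bytes.length) {
            unscaled = (unscaled << 8) | (bytes(i) & 0xff)
            i += 1
          }
          // Sign-extend from the top bit of the fixed-width value.
          val bits = 8 * bytes.length
          (unscaled << (64 - bits)) >> (64 - bits)
        }
        
        // Hypothetical use in the deserializer (d: DecimalType, value: GenericFixed):
        // val unscaled = binaryToUnscaledLong(value.asInstanceOf[GenericFixed].bytes())
        // updater.set(ordinal, Decimal(unscaled, d.precision, d.scale))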


---
