Github user gengliangwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22037#discussion_r208591866
  
    --- Diff: external/avro/src/main/scala/org/apache/spark/sql/avro/SchemaConverters.scala ---
    @@ -139,7 +142,16 @@ object SchemaConverters {
     
           case FloatType => builder.floatType()
           case DoubleType => builder.doubleType()
    -      case _: DecimalType | StringType => builder.stringType()
    +      case StringType => builder.stringType()
    +      case d: DecimalType =>
    +        val avroType = LogicalTypes.decimal(d.precision, d.scale)
    +        if (nullable) {
    +          val schema = avroType.addToSchema(SchemaBuilder.builder().bytesType())
    +          builder.`type`(schema)
    --- End diff --
    
    For decimal, Avro requires setting three keys in the schema props: `logicalType`, `precision`, and `scale`.
    So here I chose to build the schema via `addToSchema`, instead of setting a single property as is done for the logical Timestamp/Date types.
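
    To illustrate (a sketch, not part of the patch): per the Avro spec, a decimal logical type is an annotated `bytes` (or `fixed`) schema carrying all three keys, and a nullable column maps to a union with `"null"`. The precision/scale values below are illustrative assumptions.

    ```python
    import json

    # Avro schema JSON for a decimal logical type backed by bytes;
    # all three keys must be present for readers to recognize it as decimal.
    decimal_schema = {
        "type": "bytes",
        "logicalType": "decimal",
        "precision": 10,  # assumed DecimalType(10, 2) for illustration
        "scale": 2,
    }

    # A nullable field becomes a union of "null" and the decimal schema.
    nullable_field = ["null", decimal_schema]

    print(json.dumps(nullable_field))
    ```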


---
