Github user gatorsmile commented on the issue:
https://github.com/apache/spark/pull/19231
```
val metadata = new MetadataBuilder()
.putLong("scale", fieldScale)
val columnType =
dialect.getCatalystType(dataType, typeName, fieldSize,
metadata).getOrElse(
getCatalystType(dataType, fieldSize, fieldScale, isSigned))
fields(i) = StructField(columnName, columnType, nullable,
metadata.build())
```
->
```
val metadata = new MetadataBuilder()
.putLong("scale", fieldScale)
val columnType =
dialect.getCatalystType(dataType, typeName, fieldSize,
metadata).getOrElse(
getCatalystType(dataType, fieldSize, fieldScale, isSigned))
fields(i) = StructField(columnName, columnType, nullable)
```
`dialect.getCatalystType` is called before we attach the metadata to the `StructField`. Why do we still need `scale`?
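For context, a minimal sketch of why `metadata` is still populated before the dialect call: a dialect receives the `MetadataBuilder` and may read the `"scale"` entry to choose a Catalyst type, independently of whether the final `StructField` carries that metadata. `ExampleDialect` below is hypothetical, not a dialect that exists in Spark.

```scala
import java.sql.Types
import org.apache.spark.sql.jdbc.JdbcDialect
import org.apache.spark.sql.types._

// Hypothetical dialect: consults the "scale" entry recorded on the
// MetadataBuilder before the StructField is ever constructed.
private object ExampleDialect extends JdbcDialect {
  override def canHandle(url: String): Boolean = url.startsWith("jdbc:example")

  override def getCatalystType(
      sqlType: Int,
      typeName: String,
      size: Int,
      md: MetadataBuilder): Option[DataType] = sqlType match {
    case Types.NUMERIC =>
      // Read the scale that JdbcUtils put on the builder.
      val scale = md.build().getLong("scale").toInt
      Some(DecimalType(size, scale))
    case _ => None
  }
}
```

Under this reading, dropping `metadata.build()` from the `StructField` only changes the resulting schema, not what the dialect can see.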