Github user dmateusp commented on the issue:
https://github.com/apache/spark/pull/21766
Just checked out the PR:
```scala
scala> spark.sql("SELECT CAST(1 as NUMERIC)")
res0: org.apache.spark.sql.DataFrame = [CAST(1 AS DECIMAL(10,0)): decimal(10,0)]
scala> spark.sql("SELECT NUMERIC(1)")
org.apache.spark.sql.AnalysisException: Undefined function: 'NUMERIC'. This function is neither a registered temporary function nor a permanent function registered in the database 'default'.; line 1 pos 7
```
I imagine some tests could be added here:

- `sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/DataTypeParserSuite.scala`
- `sql/core/src/test/resources/sql-tests/inputs/`
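
For the golden-file suite, a new input file under `sql-tests/inputs/` could exercise the alias directly. A minimal sketch (file name and second query are hypothetical; the first query and its expected schema come from the REPL session above):

```sql
-- numeric.sql (hypothetical file name)
-- NUMERIC should resolve as an alias for DECIMAL(10,0)
SELECT CAST(1 AS NUMERIC);
-- NUMERIC in a column definition context, assuming the alias is accepted wherever DECIMAL is
SELECT CAST('2.5' AS NUMERIC);
```

Running the suite once with `SPARK_GENERATE_GOLDEN_FILES=1` would then record the expected output file.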
Do you think it's worth having a separate DataType, or should it just be an alias?