Github user dongjoon-hyun commented on the issue:
https://github.com/apache/spark/pull/20846
@liutang123. Did you test this with the latest Apache Spark 2.3? Apache
Spark 2.3 handles your example without any problem.
```scala
scala> sql("create table test_par(a string) PARTITIONED BY (b bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.orc.OrcSerde' STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat' OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'")
res0: org.apache.spark.sql.DataFrame = []

hive> ALTER TABLE test_par CHANGE a a bigint restrict;
OK
Time taken: 1.358 seconds

scala> sql("select * from test_par").show
18/03/16 17:33:52 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
18/03/16 17:33:53 WARN HiveExternalCatalog: The table schema given by Hive metastore(struct<a:bigint,b:bigint>) is different from the schema when this table was created by Spark SQL(struct<a:string,b:bigint>). We have to fall back to the table schema from Hive metastore which is not case preserving.
18/03/16 17:33:54 WARN HiveExternalCatalog: The table schema given by Hive metastore(struct<a:bigint,b:bigint>) is different from the schema when this table was created by Spark SQL(struct<a:string,b:bigint>). We have to fall back to the table schema from Hive metastore which is not case preserving.
+---+---+
| a| b|
+---+---+
+---+---+
scala> sc.version
res1: String = 2.3.0
```
So, please include a test case in this PR. Inserting some data first may
help illustrate your issue.
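For reference, the reproduction with data could look roughly like this (a minimal sketch against a Hive-enabled `spark-shell`; the inserted values and partition are illustrative, and it still requires running the `ALTER TABLE` step from the `hive` CLI in between):

```scala
// Sketch: same table as above, but with a row inserted before the
// Hive-side column-type change, so the read path actually touches data.
// Requires a SparkSession with Hive support; not runnable standalone.
sql("INSERT INTO test_par PARTITION (b = 1) VALUES ('1')")

// hive> ALTER TABLE test_par CHANGE a a bigint restrict;

// After the metastore change, this read should either return the row
// with the new type or surface the schema-mismatch problem under test.
sql("SELECT * FROM test_par").show()
```

A test in the PR built along these lines would make it clear whether the problem only appears once the table is non-empty.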