Github user liancheng commented on the pull request:
https://github.com/apache/spark/pull/7700#issuecomment-125273115
Removed the "HOTFIX" tag, since this is actually not a newly introduced
issue. Spark 1.4 behaves exactly the same. With a table created in Hive via:
```sql
CREATE TABLE x STORED AS PARQUET AS SELECT 1 AS key;
```
We have the following PySpark result in Spark 1.4:
```
In [5]: sqlContext.setConf('spark.sql.hive.convertMetastoreParquet', 'true')
In [6]: sqlContext.sql('desc extended x').show()
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
+--------+---------+-------+
|col_name|data_type|comment|
+--------+---------+-------+
| key| int| |
+--------+---------+-------+
In [7]: sqlContext.setConf('spark.sql.hive.convertMetastoreParquet', 'false')
In [8]: sqlContext.sql('desc extended x').show()
+--------------------+--------------------+-------+
| col_name| data_type|comment|
+--------------------+--------------------+-------+
| key| int| null|
|Detailed Table In...|Table(tableName:x...| |
+--------------------+--------------------+-------+
```
So I'm pretty puzzled about why this test case only fails occasionally. A
possible explanation is that some test cases set
`spark.sql.hive.convertMetastoreParquet` to `false` without properly restoring
the original value. When such a test case is executed before "CTAS with serde",
no failure occurs.
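For what it's worth, a test could scope the override so the previous value is
always put back. This is just a sketch using the PySpark `SQLContext`
`getConf`/`setConf` API; `with_sql_conf` is a hypothetical helper, not
something in the codebase:
```python
from contextlib import contextmanager

@contextmanager
def with_sql_conf(sql_context, key, value):
    """Temporarily override a SQL conf value, restoring it on exit."""
    # Remember the current setting before overriding it.
    original = sql_context.getConf(key, None)
    sql_context.setConf(key, value)
    try:
        yield
    finally:
        # Restore the previous value so later tests see a clean state.
        # (If the key was unset before, setConf can't unset it again, so
        # only explicitly-set values can be restored this way.)
        if original is not None:
            sql_context.setConf(key, original)

# Usage: the override only applies within the block.
with with_sql_conf(sqlContext, 'spark.sql.hive.convertMetastoreParquet', 'false'):
    sqlContext.sql('desc extended x').show()
```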
cc @viirya