Yin Huai created SPARK-18464:
--------------------------------
Summary: Spark SQL fails to load tables created without providing a schema
Key: SPARK-18464
URL: https://issues.apache.org/jira/browse/SPARK-18464
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 2.1.0
Reporter: Yin Huai
Priority: Blocker
I have an old table that was created without providing a schema. Branch
2.1 seems to fail to load it and reports that the schema is corrupt.
With {{spark.sql.debug}} enabled, I retrieved the table metadata using
{{describe formatted}}:
{code}
[col,array<string>,from deserializer]
[,,]
[# Detailed Table Information,,]
[Database:,mydb,]
[Owner:,root,]
[Create Time:,Fri Jun 17 11:55:07 UTC 2016,]
[Last Access Time:,Thu Jan 01 00:00:00 UTC 1970,]
[Location:,mylocation,]
[Table Type:,EXTERNAL,]
[Table Parameters:,,]
[ transient_lastDdlTime,1466164507,]
[ spark.sql.sources.provider,parquet,]
[,,]
[# Storage Information,,]
[SerDe Library:,org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe,]
[InputFormat:,org.apache.hadoop.mapred.SequenceFileInputFormat,]
[OutputFormat:,org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat,]
[Compressed:,No,]
[Storage Desc Parameters:,,]
[ path,/myPatch,]
[ serialization.format,1,]
{code}
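For context, a hedged sketch of how a metadata dump like the one above can be obtained (the database/table name below is a placeholder, not taken from this report; note that {{spark.sql.debug}} is an internal flag and needs to be set when the session starts, e.g. via {{--conf}}):

{code}
# Hypothetical reproduction sketch; "mydb.mytable" is a placeholder.
# spark.sql.debug is passed at startup rather than set at runtime.
spark-shell --conf spark.sql.debug=true

scala> spark.sql("DESCRIBE FORMATTED mydb.mytable").collect().foreach(println)
{code}

With the debug flag enabled, the raw catalog entries (including the {{spark.sql.sources.provider}} table parameter shown above) are surfaced instead of being hidden or rejected.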
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)