Huaxin Gao created SPARK-16946:
----------------------------------
Summary: saveAsTable[append] with different number of columns
should throw Exception
Key: SPARK-16946
URL: https://issues.apache.org/jira/browse/SPARK-16946
Project: Spark
Issue Type: Bug
Components: SQL
Reporter: Huaxin Gao
Priority: Minor
In HiveContext, if saveAsTable in append mode is given a DataFrame whose number
of columns differs from the existing table's, Spark will throw an Exception.
e.g.
{code}
test("saveAsTable[append]: too many columns") {
  withTable("saveAsTable_too_many_columns") {
    Seq((1, 2)).toDF("i", "j").write.saveAsTable("saveAsTable_too_many_columns")
    val e = intercept[AnalysisException] {
      Seq((3, 4, 5)).toDF("i", "j", "k")
        .write.mode("append").saveAsTable("saveAsTable_too_many_columns")
    }
    assert(e.getMessage.contains("doesn't match"))
  }
}
{code}
However, in SparkSession or SQLContext, if the same code is run, the extra
column in the appended data is dropped silently, without any warning or
Exception. The table becomes
i j
3 4
1 2
We may want to follow the HiveContext behavior and throw an Exception.
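For comparison, a minimal sketch of the silent-drop case outside HiveContext
(the table name is illustrative, and `spark` is assumed to be an existing
SparkSession with implicits imported):
{code}
import spark.implicits._

// Create a two-column table.
Seq((1, 2)).toDF("i", "j").write.saveAsTable("append_column_mismatch")

// Append a three-column DataFrame. Instead of failing, the extra
// column "k" is dropped silently.
Seq((3, 4, 5)).toDF("i", "j", "k")
  .write.mode("append").saveAsTable("append_column_mismatch")

// The resulting table has only columns i and j, with rows (3, 4) and (1, 2).
spark.table("append_column_mismatch").show()
{code}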
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]