NarekDW commented on code in PR #41018:
URL: https://github.com/apache/spark/pull/41018#discussion_r1183876429
##########
core/src/main/resources/error/error-classes.json:
##########
@@ -235,6 +235,11 @@
],
"sqlState" : "22018"
},
+ "CORRUPTED_TABLE_PROPERTY" : {
Review Comment:
It is being raised in `CatalogTable.readLargeTableProp` function
```scala
def readLargeTableProp(props: Map[String, String], key: String): Option[String] = {
  props.get(key).orElse {
    if (props.exists { case (mapKey, _) => mapKey.startsWith(key) }) {
      props.get(s"$key.numParts") match {
        case None =>
          throw QueryCompilationErrors.cannotReadCorruptedTablePropertyError(key)
        case Some(numParts) =>
          val parts = (0 until numParts.toInt).map { index =>
            props.getOrElse(s"$key.part.$index", {
              throw QueryCompilationErrors.cannotReadCorruptedTablePropertyError(
                key, Some(s"Missing part $index, $numParts parts are expected."))
            })
          }
          Some(parts.mkString)
      }
    } else {
      None
    }
  }
}
```
If the key is not found but another key with the same prefix exists, the
function assumes the property was split into parts, and it raises an exception
either when the `key.numParts` property is absent or when `key.numParts`
exists but one of the individual `key.part.N` properties is missing. Strictly
speaking that does mean the table properties are corrupted, but the error
class could be named more precisely, e.g. `INSUFFICIENT_TABLE_PROPERTIES` or
`TABLE_PROPERTY_PARTS_MISSING` or something else...
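To make the two failure modes concrete, here is a self-contained sketch of the same logic. It mirrors `CatalogTable.readLargeTableProp`, but the Spark error helper is replaced with a plain `IllegalStateException`, and the property key `spark.sql.schema` is used purely for illustration:

```scala
// Sketch of reading a "large" table property stored split into parts:
//   key.numParts = N, key.part.0 ... key.part.(N-1)
object LargePropDemo {
  def readLargeTableProp(props: Map[String, String], key: String): Option[String] = {
    props.get(key).orElse {
      if (props.exists { case (mapKey, _) => mapKey.startsWith(key) }) {
        props.get(s"$key.numParts") match {
          case None =>
            // Failure mode 1: a key with the prefix exists, but numParts is absent.
            throw new IllegalStateException(s"Cannot read corrupted property $key")
          case Some(numParts) =>
            val parts = (0 until numParts.toInt).map { index =>
              // Failure mode 2: numParts exists, but an individual part is missing.
              props.getOrElse(s"$key.part.$index",
                throw new IllegalStateException(
                  s"Missing part $index, $numParts parts are expected."))
            }
            Some(parts.mkString)
        }
      } else {
        None
      }
    }
  }

  def main(args: Array[String]): Unit = {
    // Well-formed: the value is reassembled from its two parts.
    val ok = Map(
      "spark.sql.schema.numParts" -> "2",
      "spark.sql.schema.part.0" -> "abc",
      "spark.sql.schema.part.1" -> "def")
    assert(readLargeTableProp(ok, "spark.sql.schema").contains("abcdef"))

    // No key with the prefix at all: simply None, no error.
    assert(readLargeTableProp(Map("other" -> "v"), "spark.sql.schema").isEmpty)

    // Corrupted: part 1 was dropped, so an exception is raised.
    val corrupted = ok - "spark.sql.schema.part.1"
    try {
      readLargeTableProp(corrupted, "spark.sql.schema")
      assert(false, "expected an exception for the missing part")
    } catch {
      case _: IllegalStateException => ()
    }
  }
}
```

Both exceptions fire only on the "prefix exists but parts are inconsistent" path, which is what makes a name focused on missing/insufficient parts arguably clearer than a generic "corrupted" one.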
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]