yaooqinn commented on code in PR #47017:
URL: https://github.com/apache/spark/pull/47017#discussion_r1650655648
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##########
@@ -5010,6 +5010,14 @@ object SQLConf {
.booleanConf
.createWithDefault(false)
+  val LEGACY_CODING_ERROR_ACTION = buildConf("spark.sql.legacy.codingErrorAction")
+    .internal()
+    .doc("When set to true, encode/decode functions replace unmappable characters with " +
+      "mojibake instead of reporting coding errors.")
+    .version("4.0.0")
+    .booleanConf
+    .createWithDefault(false)
Review Comment:
The reasons I'd like to make it independent of ANSI are:
- Part of the implication of ANSI mode is Hive incompatibility.
- Hive also reports coding errors, so it was a mistake when we ported this behavior from Hive.
- These functions are not ANSI-defined.
- The error behaviors are also not found in the ANSI standard.

The reasons above indicate that this behavior is more of a legacy trait of Spark itself.
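For context, the two behaviors the flag switches between map onto `java.nio.charset.CodingErrorAction`: the legacy path corresponds to `REPLACE` (unmappable characters silently become a substitution byte, producing mojibake), while the new default corresponds to `REPORT` (a `CharacterCodingException` is raised). A minimal Java sketch of that contrast, independent of Spark (the class and method names here are illustrative, not Spark's actual implementation):

```java
import java.nio.ByteBuffer;
import java.nio.CharBuffer;
import java.nio.charset.CharacterCodingException;
import java.nio.charset.Charset;
import java.nio.charset.CharsetEncoder;
import java.nio.charset.CodingErrorAction;
import java.nio.charset.StandardCharsets;

public class CodingErrorDemo {
    // Encode a string with an explicit policy for unmappable characters.
    static byte[] encode(String s, Charset cs, CodingErrorAction action)
            throws CharacterCodingException {
        CharsetEncoder enc = cs.newEncoder()
                .onMalformedInput(action)
                .onUnmappableCharacter(action);
        ByteBuffer bb = enc.encode(CharBuffer.wrap(s));
        byte[] out = new byte[bb.remaining()];
        bb.get(out);
        return out;
    }

    public static void main(String[] args) throws Exception {
        String s = "café";  // 'é' is unmappable in US-ASCII

        // Legacy-style behavior: REPLACE substitutes the encoder's
        // replacement byte ('?' for US-ASCII), i.e. silent mojibake.
        byte[] replaced = encode(s, StandardCharsets.US_ASCII, CodingErrorAction.REPLACE);
        System.out.println(new String(replaced, StandardCharsets.US_ASCII)); // caf?

        // Error-reporting behavior: REPORT throws instead of corrupting data.
        try {
            encode(s, StandardCharsets.US_ASCII, CodingErrorAction.REPORT);
        } catch (CharacterCodingException e) {
            System.out.println("coding error: " + e.getClass().getSimpleName());
        }
    }
}
```

Note that `Charset.encode`/`String.getBytes` default to `REPLACE`, which is why reporting errors requires building an explicit `CharsetEncoder` as above.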
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]