cstavr commented on code in PR #47976:
URL: https://github.com/apache/spark/pull/47976#discussion_r1745688319
##########
sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala:
##########
@@ -671,12 +672,9 @@ class CatalogImpl(sparkSession: SparkSession) extends Catalog {
} else {
CatalogTableType.MANAGED
}
- val location = if (storage.locationUri.isDefined) {
- val locationStr = storage.locationUri.get.toString
- Some(locationStr)
- } else {
- None
- }
+
+ // The location in UnresolvedTableSpec should be the original user-provided path string.
Review Comment:
I thought about this. I did not add a config because the change in behaviour
only affects new tables; existing tables should continue working. I also
suspect that there is no real usage of this API for tables with special
chars. But I don't have a strong opinion. WDYT?
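For context (not part of the diff): a minimal JDK sketch of why round-tripping a path through a URI loses the original user-provided string. The path `/tmp/test%prefix` is purely illustrative; the point is that the URI encoder escapes `%` to `%25`, so `uri.toString()` (what the removed code stored) differs from the raw path, while `getPath()` decodes it back.

```java
import java.io.File;
import java.net.URI;

public class UriRoundTrip {
    public static void main(String[] args) {
        // Illustrative path containing a character that is special in URIs.
        String userPath = "/tmp/test%prefix";

        // File.toURI() percent-encodes '%' to "%25", so toString() is no
        // longer the user-provided string ...
        URI uri = new File(userPath).toURI();
        System.out.println(uri.toString());   // file:/tmp/test%25prefix

        // ... while getPath() decodes back to the original path.
        System.out.println(uri.getPath());    // /tmp/test%prefix
    }
}
```

This is why storing `locationUri.get.toString` in UnresolvedTableSpec mangles paths with URI-special characters, and why the PR keeps the original user-provided string instead.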
##########
sql/core/src/test/scala/org/apache/spark/sql/internal/CatalogSuite.scala:
##########
@@ -700,7 +700,7 @@ class CatalogSuite extends SharedSparkSession with AnalysisTest with BeforeAndAf
val description = "this is a test table"
withTable("t") {
- withTempDir { dir =>
+ withTempDir(prefix = "test%prefix") { dir =>
Review Comment:
That would also work, but I think it is useful to have it, since we should
expand testing of paths with special chars. In the future we could also
change the default value, so this helper will be needed to avoid the special
chars.
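As a plain-JDK analogue of the suite's `withTempDir(prefix = ...)` helper (the helper itself is Scala test code from the diff; this sketch only shows the underlying idea): creating a temp directory whose name contains a URI-special character gives tests a path that exercises the encoding round-trip above.

```java
import java.nio.file.Files;
import java.nio.file.Path;

public class TempDirWithPrefix {
    public static void main(String[] args) throws Exception {
        // Temp directory whose name starts with a prefix containing '%',
        // mirroring prefix = "test%prefix" from the test change.
        Path dir = Files.createTempDirectory("test%prefix");
        try {
            // The directory name begins with the literal prefix,
            // e.g. test%prefix<random-suffix>.
            System.out.println(dir.getFileName());
        } finally {
            Files.delete(dir);
        }
    }
}
```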
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
For additional commands, e-mail: [email protected]