Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17149#discussion_r109892051
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
    @@ -285,7 +285,7 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
             // compatible format, which means the data source is file-based and must have a `path`.
            require(table.storage.locationUri.isDefined,
              "External file-based data source table must have a `path` entry in storage properties.")
    -        Some(new Path(table.location).toUri.toString)
    --- End diff --
    
    I see, it seems the problem itself existed before this change.
    
    I found this while running related tests on Windows, and it looks related to this PR (the special-character handling), so I think this is the proper JIRA to file a follow-up against, with a PR to note this.
    
    It seems we already have tests that use URIs, whether that was mistakenly supported or not. Should we ask the dev mailing list? I think this is an important decision.
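    
    For context on the special-character part, here is a minimal standalone sketch (my own, not code from this PR; it assumes hadoop-common on the classpath, and `PathUriProbe` is just a hypothetical name) that only prints what `new Path(...).toUri.toString` yields for a raw path versus an already-encoded URI string:
    
        import org.apache.hadoop.fs.Path
        
        // Standalone probe (hypothetical, not part of this PR): shows what
        // `new Path(...).toUri.toString` produces for locations containing
        // special characters, which is the behaviour discussed above.
        object PathUriProbe {
          def main(args: Array[String]): Unit = {
            // A raw local path containing a space.
            val rawLocation = "/tmp/spark warehouse/tbl"
            // The same location already written as a percent-encoded URI string.
            val encodedLocation = "file:/tmp/spark%20warehouse/tbl"
        
            // Depending on how Hadoop's Path and java.net.URI treat the input,
            // the raw path may come back percent-encoded while the already-encoded
            // string may be encoded again, so the two results need not agree.
            println(new Path(rawLocation).toUri.toString)
            println(new Path(encodedLocation).toUri.toString)
          }
        }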

