Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19799#discussion_r152796192

--- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/client/VersionsSuite.scala ---

```
@@ -862,17 +859,17 @@ class VersionsSuite extends SparkFunSuite with Logging {
       |      "logicalType": "decimal"
       |    }
       |  ]
-      | } ]
+      | }]
       |}
     """.stripMargin
-    val schemaUrl = s"""$schemaPath${File.separator}avroDecimal.avsc"""
-    val schemaFile = new File(schemaPath, "avroDecimal.avsc")
+    val schemaFile = new File(dir, "avroDecimal.avsc")
     val writer = new PrintWriter(schemaFile)
     writer.write(avroSchema)
     writer.close()
+    val schemaPath = schemaFile.getCanonicalPath
```

--- End diff --

For example, the `/` in `file:/C:/a/b/c` is fine, but the problem seems to be the `\` in `C:\a\b\c`. I believe there are a few issues open about this. For example, on Windows:

```
scala> sql("select '\\\\'").show()
17/11/23 05:04:37 WARN SizeEstimator: Failed to check whether UseCompressedOops is set; assuming yes
+---+
|  \|
+---+
|  \|
+---+

scala> sql("select '\\'").show()
org.apache.spark.sql.catalyst.parser.ParseException:
no viable alternative at input 'select ''(line 1, pos 7)

== SQL ==
select '\'
-------^^^

  at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:239)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:115)
  at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:48)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:69)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:638)
  ... 49 elided

scala> sql("select '/'").show()
+---+
|  /|
+---+
|  /|
+---+

scala> sql("select '//'").show()
+---+
| //|
+---+
| //|
+---+
```

So ... I was thinking of using `toURI`:

```scala
scala> new File("C:\\a\\b\\c").toURI.toString
res6: String = file:/C:/a/b/c
```
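As the shell session above shows, `'\\\\'` parses while `'\\'` does not, so doubling each backslash is the other way to get a Windows path through a SQL string literal. A minimal sketch of that workaround (in Java, since `java.io.File` is the same API; `escapeBackslashes` is a hypothetical helper, not part of the PR):

```java
public class PathInSql {
    // Hypothetical helper: double each backslash so a Windows path survives
    // SQL string-literal parsing, where a lone '\' starts an escape sequence.
    static String escapeBackslashes(String path) {
        return path.replace("\\", "\\\\");
    }

    public static void main(String[] args) {
        // "C:\a\b\c" becomes "C:\\a\\b\\c"
        System.out.println(escapeBackslashes("C:\\a\\b\\c"));

        // The toURI approach suggested in the comment sidesteps escaping
        // entirely: on Windows, new File("C:\\a\\b\\c").toURI() yields
        // file:/C:/a/b/c, which contains no backslashes at all.
    }
}
```

The `toURI` route is the simpler fix here, since a `file:` URI never contains `\` regardless of platform.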