Vijay created SPARK-21182:
-----------------------------

             Summary: Structured streaming on spark-shell on Windows
                 Key: SPARK-21182
                 URL: https://issues.apache.org/jira/browse/SPARK-21182
             Project: Spark
          Issue Type: Bug
          Components: Structured Streaming
    Affects Versions: 2.1.1
         Environment: Windows 10
spark-2.1.1-bin-hadoop2.7
            Reporter: Vijay
            Priority: Minor
The structured streaming output operation fails in spark-shell on Windows. As the error message shows, the checkpoint path is being prefixed with a Unix-style file separator ("/"), turning the Windows path into "/C:/..." and triggering the IllegalArgumentException. The error message follows:

{code}
scala> val query = wordCounts.writeStream
  .outputMode("complete")
  .format("console")
  .start()

java.lang.IllegalArgumentException: Pathname /C:/Users/Vijay/AppData/Local/Temp/temporary-081b482c-98a4-494e-8cfb-22d966c2da01/offsets from C:/Users/Vijay/AppData/Local/Temp/temporary-081b482c-98a4-494e-8cfb-22d966c2da01/offsets is not a valid DFS filename.
  at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:197)
  at org.apache.hadoop.hdfs.DistributedFileSystem.access$000(DistributedFileSystem.java:106)
  at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
  at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
  at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
  at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
  at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1426)
  at org.apache.spark.sql.streaming.StreamingQueryManager.createQuery(StreamingQueryManager.scala:222)
  at org.apache.spark.sql.streaming.StreamingQueryManager.startQuery(StreamingQueryManager.scala:280)
  at org.apache.spark.sql.streaming.DataStreamWriter.start(DataStreamWriter.scala:268)
  ... 52 elided
{code}

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
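To illustrate why the prefixed path is rejected, here is a simplified, hypothetical sketch of an HDFS-style pathname check. This is *not* Hadoop's actual implementation (the real check happens inside `DistributedFileSystem.getPathName`); it only mimics one relevant rule, namely that DFS path components may not contain a colon. Once "/" is prepended to a Windows path, the drive component "C:" violates that rule. The shortened paths below are illustrative, not the exact ones from the stack trace.

```java
// Hypothetical, simplified stand-in for Hadoop's DFS pathname validation.
// Real logic lives in org.apache.hadoop.hdfs; this sketch only shows why
// a "/"-prefixed Windows path such as "/C:/Users/..." is considered invalid.
public class DfsPathCheck {
    static boolean isValidDfsName(String src) {
        // DFS paths must be absolute (start with "/").
        if (!src.startsWith("/")) {
            return false;
        }
        // No path component may contain ':' or be a relative marker.
        for (String component : src.split("/")) {
            if (component.contains(":")
                    || component.equals("..")
                    || component.equals(".")) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // Windows path with a leading "/" prepended: the "C:" component
        // contains ':', so the name is rejected.
        System.out.println(isValidDfsName("/C:/Users/Vijay/AppData/Local/Temp/offsets"));
        // A Unix-style temporary path passes the same check.
        System.out.println(isValidDfsName("/tmp/temporary-1234/offsets"));
    }
}
```

Running this prints `false` for the "/"-prefixed Windows path and `true` for the Unix-style one, matching the IllegalArgumentException seen in the shell.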