Github user jose-torres commented on a diff in the pull request:
https://github.com/apache/spark/pull/20097#discussion_r159493843
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/streaming/DataStreamReader.scala ---
@@ -167,6 +167,24 @@ final class DataStreamReader private[sql](sparkSession: SparkSession) extends Lo
       className = source,
       options = extraOptions.toMap)
     ds match {
+      case s: MicroBatchReadSupport =>
+        val tempReader = s.createMicroBatchReader(
+          java.util.Optional.ofNullable(userSpecifiedSchema.orNull),
+          Utils.createTempDir(namePrefix = s"temporaryReader").getCanonicalPath,
+          options)
+        // Generate the V1 node to catch errors thrown within generation.
+        try {
+          StreamingRelation(v1DataSource)
+        } catch {
+          case e: UnsupportedOperationException
--- End diff ---
https://github.com/apache/spark/blob/9a2b65a3c0c36316aae0a53aa0f61c5044c2ceff/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSource.scala#L266
I agree that it would be nice to change this exception, but I don't know whether we can.
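
For context, the control flow the diff is aiming for seems to be: eagerly build the V1 node so that genuine errors thrown during its generation still surface, and only fall back to the V2 micro-batch relation when the V1 path throws the `UnsupportedOperationException` from the linked DataSource code. A minimal self-contained sketch of that pattern (`SourceProvider`, `V1Relation`, and `V2Relation` are placeholder names for illustration, not Spark classes):

```scala
// Hypothetical stand-ins for the real Spark classes; only the control flow mirrors the diff.
sealed trait LogicalNode
case class V1Relation(name: String) extends LogicalNode
case class V2Relation(name: String) extends LogicalNode

trait SourceProvider {
  def name: String
  // A provider with no V1 streaming implementation throws, mirroring DataSource.scala#L266.
  def buildV1(): V1Relation = throw new UnsupportedOperationException(
    s"Data source $name does not support streamed reading")
  def supportsMicroBatch: Boolean = false
}

object FallbackExample {
  /** Prefer the V1 node; fall back to a V2 node only when V1 construction is unsupported. */
  def resolve(provider: SourceProvider): LogicalNode = {
    if (provider.supportsMicroBatch) {
      // Generate the V1 node first so real errors propagate; only the
      // "unsupported" case falls through to the V2 relation.
      try {
        provider.buildV1()
      } catch {
        case _: UnsupportedOperationException => V2Relation(provider.name)
      }
    } else {
      provider.buildV1()
    }
  }

  def main(args: Array[String]): Unit = {
    val v2Only = new SourceProvider {
      val name = "rate-v2"
      override val supportsMicroBatch = true
    }
    println(resolve(v2Only)) // V2Relation(rate-v2)
  }
}
```

As written, the fallback can only distinguish "unsupported" from a genuine failure by the exception type coming out of DataSource, which is presumably why changing that exception would be attractive if it's possible.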
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]