Github user jose-torres commented on a diff in the pull request:
https://github.com/apache/spark/pull/20369#discussion_r164036304
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
@@ -1110,6 +1110,13 @@ object SQLConf {
       .timeConf(TimeUnit.MILLISECONDS)
       .createWithDefault(100)
 
+  val DISABLED_V2_STREAMING_WRITERS = buildConf("spark.sql.streaming.disabledV2Writers")
+    .internal()
+    .doc("A comma-separated list of fully qualified data source register class names for which" +
+      " StreamWriteSupport is disabled. Writes to these sources will fall back to the V1 Sink.")
--- End diff ---
DataStreamWriter will call DataSource.createSink(), which will notice that the provider class doesn't have a (V1) sink implementation and throw "Data source $className does not support streamed writing".
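To illustrate the fallback behavior described above, here is a minimal, self-contained sketch (not the actual Spark implementation) of how a writer could choose between the V2 StreamWriteSupport path and the V1 Sink fallback based on the comma-separated `spark.sql.streaming.disabledV2Writers` list. The object name `V2WriterFallbackSketch`, the method `chooseWritePath`, and the stand-in traits are hypothetical; only the config semantics and the error message come from the thread.

```scala
// Simplified sketch of the V2-writer disable/fallback logic.
// The traits below are stand-ins for the real Spark provider interfaces.
object V2WriterFallbackSketch {
  trait StreamWriteSupport   // hypothetical stand-in for the V2 write interface
  trait StreamSinkProvider   // hypothetical stand-in for the V1 sink interface

  /** Returns "v2" or "v1", or throws if neither path is available. */
  def chooseWritePath(
      providerClassName: String,
      provider: AnyRef,
      disabledV2Writers: String): String = {
    // Parse the comma-separated list of disabled V2 writer class names.
    val disabled = disabledV2Writers.split(",").map(_.trim).toSet
    provider match {
      case _: StreamWriteSupport if !disabled.contains(providerClassName) =>
        "v2"
      case _: StreamSinkProvider =>
        "v1" // fall back to the V1 Sink (DataSource.createSink())
      case _ =>
        // Mirrors the error described in the comment above.
        throw new UnsupportedOperationException(
          s"Data source $providerClassName does not support streamed writing")
    }
  }
}
```

A source that implements only the V2 interface and is listed in the disabled conf ends up in the last case, which is the failure mode the comment describes.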
---