HeartSaVioR commented on a change in pull request #29767:
URL: https://github.com/apache/spark/pull/29767#discussion_r490178331
##########
File path:
sql/core/src/main/scala/org/apache/spark/sql/streaming/DataStreamWriter.scala
##########
@@ -300,54 +301,44 @@ final class DataStreamWriter[T] private[sql](ds: Dataset[T]) {
"write files of Hive data source directly.")
}
- if (source == "memory") {
+ if (source == SOURCE_NAME_TABLE) {
+ assertNotPartitioned("table")
+
+ import df.sparkSession.sessionState.analyzer.CatalogAndIdentifier
+
+ import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+    val CatalogAndIdentifier(catalog, identifier) = df.sparkSession.sessionState.sqlParser
Review comment:
I just checked it roughly, and it looks like a temporary view is not loaded
by `loadTable`: it throws `NoSuchTableException` in `V2SessionCatalog`.
```
test("write to temporary view shouldn't be allowed") {
  val tableIdentifier = "table_name"
  val tempViewIdentifier = "temp_view"

  spark.sql(s"CREATE TABLE $tableIdentifier (id bigint, data string) USING parquet")
  checkAnswer(spark.table(tableIdentifier), Seq.empty)

  spark.sql(s"SELECT id, data FROM $tableIdentifier")
    .createOrReplaceTempView(tempViewIdentifier)
  // spark.sql(s"CREATE TEMPORARY VIEW $tempViewIdentifier AS SELECT id, data FROM $tableIdentifier")

  withTempDir { checkpointDir =>
    val exc = intercept[AnalysisException] {
      runStreamQueryAppendMode("default." + tempViewIdentifier, checkpointDir, Seq.empty, Seq.empty)
    }
    assert(exc.getMessage.contains("doesn't support streaming write"))
  }
}
```
fails with `"Table default.temp_view not found;" did not contain "doesn't
support streaming write"`.
I think this is the desired behavior, since it's a view. Even if it could
load the (temp) view, its capabilities shouldn't include write-related flags.