zifeif2 commented on code in PR #53104:
URL: https://github.com/apache/spark/pull/53104#discussion_r2582799523


##########
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/state/StateDataSource.scala:
##########
@@ -407,6 +418,7 @@ object StateSourceOptions extends DataSourceOptions {
   val STATE_VAR_NAME = newOption("stateVarName")
   val READ_REGISTERED_TIMERS = newOption("readRegisteredTimers")
   val FLATTEN_COLLECTION_TYPES = newOption("flattenCollectionTypes")
+  val INTERNAL_ONLY_READ_ALL_COLUMN_FAMILIES = newOption("internalOnlyReadAllColumnFamilies")

Review Comment:
   Could you explain a bit more about exposing an interface at the partition reader level?
   I am thinking of the interface for using StatePartitionAllColumnFamiliesReader as follows:
   ```scala
   spark.read
     .format("statestore")
     .option(StateSourceOptions.PATH, tempDir.getAbsolutePath)
     .option(StateSourceOptions.INTERNAL_ONLY_READ_ALL_COLUMN_FAMILIES, "true")
     .load()
     .collect()
   ```
   Therefore, we need to expose this option in StateDataSource.
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

