HeartSaVioR commented on code in PR #43425:
URL: https://github.com/apache/spark/pull/43425#discussion_r1388820123
##########
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/StateSchemaCompatibilityChecker.scala:
##########
@@ -88,8 +88,7 @@ class StateSchemaCompatibilityChecker(
private def schemasCompatible(storedSchema: StructType, schema: StructType): Boolean =
DataType.equalsIgnoreNameAndCompatibleNullability(schema, storedSchema)
- // Visible for testing
- private[sql] def readSchemaFile(): (StructType, StructType) = {
Review Comment:
It's not about scoping; it's about signaling that we intentionally expose these methods to others. The check method could be private[sql] as well, but it isn't, because it is the entry point. We are adding more entry points here, hence making them public as well.
(Although someone might argue it would be cleaner to split the methods that read and write the schema file out into a separate class.)
Technically we could make the entire class package-private, but we don't, because the package itself is already considered non-public.
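To illustrate what I mean, here is a minimal sketch of that convention, not the actual Spark code: the class name SchemaFileHandlerSketch and the method bodies are placeholders. Because the class sits in an internal package, leaving the entry-point methods public does not make them user-facing API; it just marks which methods are meant to be called from outside the class.

```scala
// Minimal sketch only: the class name and method bodies are placeholders.
package org.apache.spark.sql.execution.streaming.state

import org.apache.spark.sql.types.StructType

class SchemaFileHandlerSketch {
  // Entry point: left public (no private[sql]) to signal that callers outside
  // this class are expected to use it, mirroring the existing check method.
  def readSchemaFile(): (StructType, StructType) = {
    // Reading and parsing the schema file is omitted in this sketch.
    ???
  }

  // Internal helper: plain private, since nothing outside this class needs it.
  private def schemasCompatible(storedSchema: StructType, schema: StructType): Boolean =
    storedSchema == schema // placeholder for the real compatibility check
}
```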