EugeneYushin commented on PR #52759:
URL: https://github.com/apache/spark/pull/52759#issuecomment-3474039875

   > the fix LGTM, can we add a test in 
`org.apache.spark.sql.connector.catalog.CatalogSuite`?
   
   I was trying to reuse SQLConf's "public" (actually `private[sql]`) API in order to reuse the `ConfigReader` that ships with [SQLConf](https://github.com/EugeneYushin/spark/blob/036f591448ecd466a018794c2e6cc7a049486200/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala#L6622), but that approach is not stable: it fails if the property has already been registered.
   
https://github.com/apache/spark/pull/52759/commits/adcf06bf5cf47709f58eea07eb6f783531409400
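
   The failure mode can be illustrated with a minimal, self-contained analogue (the names below are illustrative, not Spark's actual API): a singleton registry that rejects duplicate keys behaves fine on the first registration but throws on any re-registration, which is exactly what makes repeated test runs against shared state unstable.

   ```scala
   // Hypothetical minimal analogue of a config-entry registry that rejects
   // duplicate keys, mirroring the instability described above. This is an
   // illustrative sketch, not Spark's SQLConf implementation.
   object ConfigRegistry {
     private val entries = scala.collection.mutable.Map.empty[String, String]

     // Throws IllegalArgumentException if the key was already registered,
     // which is what makes re-registering against a shared singleton unsafe.
     def register(key: String, default: String): Unit = {
       require(!entries.contains(key), s"Config entry $key is already registered")
       entries(key) = default
     }

     // Registers a key once, then tries again; returns true if the second
     // registration fails, demonstrating the duplicate-registration problem.
     def duplicateRegistrationFails(key: String): Boolean = {
       register(key, "default")
       try { register(key, "other"); false }
       catch { case _: IllegalArgumentException => true }
     }
   }
   ```

   Under this model, the first registration succeeds and every subsequent one throws, so any test suite that re-imports or re-initializes the shared registry hits the error.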
   
   I'm not quite sure how to propagate/mock the environment up to the `Catalogs` singleton object during unit tests. I've been thinking about reusing the `docker-integration-tests` suite for this, but I have doubts that it justifies the added cost of integration tests.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

