aokolnychyi commented on PR #53143: URL: https://github.com/apache/spark/pull/53143#issuecomment-3576538884
@cloud-fan, correct. The question is: what if the refresh succeeds BUT reanalyzing/reoptimizing the refreshed plan fails? Should this throw a top-level exception and mark the write as failed, or should it be ignored, just like the inability to refresh due to schema changes? Previously, Spark would throw an exception and mark the write as failed if reanalyzing/reoptimizing failed. This PR switches to a warning.
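A minimal sketch of the policy change being discussed (all names here are illustrative, not Spark's actual internals): a failure while reanalyzing the refreshed plan is caught and downgraded to a warning, so the already-successful write is not retroactively marked as failed.

```java
import java.util.logging.Logger;

public class RefreshPolicy {
    private static final Logger LOG = Logger.getLogger("RefreshPolicy");

    /** Hypothetical stand-in for the post-refresh reanalysis step. */
    interface Reanalyzer {
        void reanalyze() throws Exception;
    }

    /**
     * Runs reanalysis after a successful refresh. Instead of rethrowing on
     * failure (the previous behavior, which marked the write as failed),
     * it logs a warning and reports the outcome to the caller.
     */
    static boolean reanalyzeRefreshedPlan(Reanalyzer reanalyzer) {
        try {
            reanalyzer.reanalyze();
            return true;
        } catch (Exception e) {
            // Warning-only path: the write itself is still considered successful.
            LOG.warning("Failed to reanalyze refreshed plan: " + e.getMessage());
            return false;
        }
    }
}
```

Under this sketch, a reanalysis error is observable in the logs but never propagates as a top-level exception, mirroring how a refresh skipped due to schema changes is already tolerated.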
