dchristle commented on pull request #32688: URL: https://github.com/apache/spark/pull/32688#issuecomment-854296497
@dongjoon-hyun Sure - the error triggers right after the job jar is added, i.e. no stages start, and it happens so early that the Spark UI is not accessible. I tried rebuilding my Docker image from scratch and the error persists. I am using the GCS and Snowflake connectors for this job, which are technically only supported on 3.1. I am not sure how to debug whether there is something specific about my build (i.e. a problem local to my environment) or whether there truly is a dependency conflict in the `master` branch.
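
In case it helps narrow things down, here is a minimal sketch of one way I could check which jar a suspect class is actually loaded from (e.g. in `spark-shell`), to separate a local build problem from a genuine dependency conflict. The class name below is only a placeholder, not the actual conflicting dependency:

```scala
// Hedged sketch: print the jar a given class was loaded from.
// Replace the placeholder class name with whichever dependency
// shows up in the stack trace of the failure.
val cls = Class.forName("com.fasterxml.jackson.databind.ObjectMapper")
println(cls.getProtectionDomain.getCodeSource.getLocation)
```

If the printed location points at a jar bundled with my job or the connectors rather than the Spark distribution, that would suggest the conflict is local to my build.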
