jayantdb commented on PR #52237: URL: https://github.com/apache/spark/pull/52237#issuecomment-3257088205
> Can you also paste the new output for the progress metrics with your change?

> @jayantdb - please look into CI failures here - https://github.com/jayantdb/spark/actions/runs/17472105263/job/49622983276 ?

@anishshri-db The CI pipeline is failing at the **Scala linter** with this message:

```
Scalastyle checks passed.
The scalafmt check failed on sql/connect or sql/connect at following occurrences:
org.apache.maven.plugin.MojoExecutionException: Scalafmt: Unformatted files found
Error: Failed to execute goal org.antipathy:mvn-scalafmt_2.13:1.1.1713302731.c3d0074:format (default-cli) on project spark-sql-api_2.13: Error formatting Scala files: Scalafmt: Unformatted files found -> [Help 1]
Before submitting your change, please make sure to format your code using the following command:
./build/mvn scalafmt:format -Dscalafmt.skip=false -Dscalafmt.validateOnly=false -Dscalafmt.changedOnly=false -pl sql/api -pl sql/connect/common -pl sql/connect/server -pl sql/connect/shims -pl sql/connect/client/jvm
Error: Process completed with exit code 1.
```

The failure appears to be caused by formatting in the `sql/connect` packages. When I run the following validate-only check, 1000+ files are reported as unformatted:

```
./build/mvn scalafmt:format \
  -Dscalafmt.skip=false \
  -Dscalafmt.validateOnly=true \
  -Dscalafmt.changedOnly=false \
  -pl sql/core
```

I didn't touch any of these thousands of files myself. Could you confirm whether I should run the formatter across them or not?
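One option I could try, as a rough sketch: limit formatting to only the files changed on my branch by flipping `-Dscalafmt.changedOnly` to `true` (I'm assuming this property restricts the plugin to changed files, as its name suggests), scoped to the modules listed in the CI error, so the pre-existing unformatted files stay untouched:

```
# Sketch (assumes -Dscalafmt.changedOnly=true formats only files changed on the branch):
./build/mvn scalafmt:format \
  -Dscalafmt.skip=false \
  -Dscalafmt.validateOnly=false \
  -Dscalafmt.changedOnly=true \
  -pl sql/api \
  -pl sql/connect/common \
  -pl sql/connect/server \
  -pl sql/connect/shims \
  -pl sql/connect/client/jvm
```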