jerryshao opened a new pull request, #9284: URL: https://github.com/apache/gravitino/pull/9284
### What changes were proposed in this pull request?

This PR improves the documentation for Spark job templates to clarify that the `className` field is required for Java/Scala Spark applications but optional for PySpark applications.

### Why are the changes needed?

Issue #8896 identified that:

1. The `className` field could be null or empty when registering or updating Spark job templates
2. It was unclear whether this field should be validated as required
3. Documentation did not clearly state that `className` is only required for Java/Scala jobs, not PySpark jobs

### Does this PR introduce _any_ user-facing change?

Yes, this PR:

- **Improves documentation** in the Java API, Python API, and OpenAPI specs to clarify `className` requirements
- **Fixes validation bugs** in the Python client that incorrectly required `className` for all Spark jobs
- **Fixes the OpenAPI schema**, which had an incorrect field name (`mainClass` → `className`) and incorrect requirements

### How was this patch tested?

- Existing tests pass: `./gradlew :common:test --tests TestJobTemplateDTO`
- All changes are documentation and validation fixes with no functional code changes
- The existing test at line 388 of TestJobTemplateDTO already validates that a null `className` is accepted

### Changes:

- Updated Java API and DTO Javadocs to clarify `className` optionality
- Updated Python API and DTO docstrings with the same clarifications
- Fixed Python validation bugs that incorrectly required `className`
- Fixed OpenAPI documentation: corrected the field name, requirements, and descriptions
- Removed a useless `validate()` method override in the Python DTO to fix linting

Closes #8896
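To illustrate the validation rule this PR documents, here is a minimal sketch of conditional `className` validation. This is a hypothetical example, not Gravitino's actual client code: the `SparkJobTemplate` dataclass, the `validate` helper, and the `.py`-suffix heuristic for detecting PySpark jobs are all assumptions made for illustration.

```python
# Hypothetical sketch (not Gravitino's actual code) of validating
# className only for Java/Scala Spark job templates, per this PR.
from dataclasses import dataclass
from typing import Optional


@dataclass
class SparkJobTemplate:
    name: str
    executable: str                    # path to the application jar or .py file
    class_name: Optional[str] = None   # entry-point class; Java/Scala only


def validate(template: SparkJobTemplate) -> None:
    # Assumption: a PySpark job is identified by a .py executable.
    is_pyspark = template.executable.endswith(".py")
    if not is_pyspark and not template.class_name:
        # Java/Scala applications need a main class to launch.
        raise ValueError("className is required for Java/Scala Spark jobs")
    # PySpark jobs: className may be None or empty, so nothing to check.


# A PySpark template without className passes; a jar without one fails.
validate(SparkJobTemplate("etl", "jobs/etl.py"))
validate(SparkJobTemplate("batch", "jobs/batch.jar", "com.example.Main"))
```

The key point, which the buggy Python validation missed, is that the requirement is conditional on the application type rather than applying uniformly to every Spark job template.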
