Bowrna opened a new issue, #34838:
URL: https://github.com/apache/airflow/issues/34838
### Description
The spark-submit command accepts a `--properties-file` option for passing a properties file as an argument. Instead of loading multiple key/value pairs via the `--conf` option, this makes it possible to load extra properties from a file path. While SparkSubmitOperator supports most of the arguments of the spark-submit command, `--properties-file` is missing. Could it be included?
```
[root@airflow ~]# spark-submit --help
Usage: spark-submit [options] <app jar | python file | R file> [app arguments]
Usage: spark-submit --kill [submission ID] --master [spark://...]
Usage: spark-submit --status [submission ID] --master [spark://...]
Usage: spark-submit run-example [options] example-class [example args]

Options:
  --conf, -c PROP=VALUE       Arbitrary Spark configuration property.
  --properties-file FILE      Path to a file from which to load extra
                              properties. If not specified, this will look
                              for conf/spark-defaults.conf.
```
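For context, a properties file is just the spark-defaults format: one whitespace-separated `key value` pair per line. A minimal Python sketch of the equivalence between repeated `--conf` flags and a single properties file (the helper name is illustrative, not anything from Spark or Airflow):

```python
def to_properties_file(conf):
    """Render a dict of Spark properties in spark-defaults.conf format:
    one whitespace-separated "key value" pair per line."""
    return "".join(f"{key} {value}\n" for key, value in conf.items())

# The same settings could be passed as --conf spark.executor.memory=4g
# --conf spark.executor.cores=2, or written once to a file and passed
# via --properties-file:
conf = {"spark.executor.memory": "4g", "spark.executor.cores": "2"}
print(to_properties_file(conf))
```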
### Use case/motivation
Add `--properties-file` as one of the options that can be passed to SparkSubmitOperator, so that extra configuration properties can be loaded from a file.
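A rough sketch of what the change could look like at the command-building level. The function and parameter names below are hypothetical stand-ins, not the actual SparkSubmitHook internals:

```python
def build_spark_submit_command(application, conf=None, properties_file=None):
    # Hypothetical helper mirroring how a hook might assemble the CLI call.
    cmd = ["spark-submit"]
    for key, value in (conf or {}).items():
        cmd += ["--conf", f"{key}={value}"]
    if properties_file:  # the proposed new option
        cmd += ["--properties-file", properties_file]
    cmd.append(application)
    return cmd
```

For example, `build_spark_submit_command("job.py", properties_file="/opt/spark/job.conf")` would place `--properties-file /opt/spark/job.conf` before the application, matching where spark-submit expects its options.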
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)