alexandraabbas opened a new issue #9027: URL: https://github.com/apache/airflow/issues/9027
**Apache Airflow version**: 1.10.10

**Cloud provider or hardware configuration**: Local

**OS** (e.g. from /etc/os-release): macOS High Sierra 10.13.6

**Kernel** (e.g. `uname -a`): Darwin Kernel Version 17.7.0

**Install tools**: pip, conda

**Others**:

**What happened**:

Cannot upload a local PySpark file when using DataProcPySparkOperator. I get the following error when `DataProcPySparkOperator.main` is set to a local file path:

```python
TypeError: upload() got an unexpected keyword argument 'bucket_name'
```

I think `DataProcPySparkOperator._upload_file_temp()` uses the wrong argument names when calling `GoogleCloudStorageHook.upload()`. `GoogleCloudStorageHook.upload()` expects `bucket` and `object`, but `DataProcPySparkOperator._upload_file_temp()` passes `bucket_name` and `object_name`.

**What you expected to happen**:

I expected DataProcPySparkOperator to upload my local Python file to GCS.

**How to reproduce it**:

Set `main` to a local file path on a DataProcPySparkOperator.

**Anything else we need to know**:

This happens every time I run DataProcPySparkOperator with `main` set to a local file path.
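Until this is fixed upstream, a minimal workaround sketch is to subclass the operator and override `_upload_file_temp()` so the hook is called with the 1.10.x argument names. This is not the upstream implementation: the class name `FixedDataProcPySparkOperator`, the error message, and the temporary object-naming scheme are illustrative, and it assumes the operator exposes `self.gcp_conn_id` and that the hook signature is `upload(bucket, object, filename, ...)` as described above.

```python
import os
import time
import uuid

from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook
from airflow.contrib.operators.dataproc_operator import DataProcPySparkOperator
from airflow.exceptions import AirflowException


class FixedDataProcPySparkOperator(DataProcPySparkOperator):
    """Workaround: call GoogleCloudStorageHook.upload() with the keyword
    names the 1.10.x hook actually accepts (`bucket`/`object`) instead of
    `bucket_name`/`object_name`."""

    def _upload_file_temp(self, bucket, local_file):
        if not bucket:
            raise AirflowException(
                "A temporary bucket is required to upload a local main file."
            )
        # Build a unique object name per run; the upstream naming scheme may
        # differ, this one only needs to avoid collisions.
        temp_filename = "{}/{}_{}".format(
            time.strftime("%Y%m%d%H%M%S"),
            uuid.uuid4().hex,
            os.path.basename(local_file),
        )
        GoogleCloudStorageHook(
            google_cloud_storage_conn_id=self.gcp_conn_id
        ).upload(
            bucket=bucket,           # 1.10.x name, not bucket_name
            object=temp_filename,    # 1.10.x name, not object_name
            filename=local_file,
        )
        return "gs://{}/{}".format(bucket, temp_filename)
```

Dropping this subclass in place of DataProcPySparkOperator, with `main` set to a local path, should avoid the `TypeError` by exercising the corrected keyword names.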
