gabFirmaway commented on issue #29423:
URL: https://github.com/apache/airflow/issues/29423#issuecomment-1422932899
Oh! I was stuck on this error today too!
I tested triggering existing Glue jobs on my development machine with an
older version of Airflow and everything worked flawlessly.
Today, when I created an EC2 instance, built a custom Airflow Docker
container, and triggered the job, the operator kept asking for s3_bucket,
and if you provide one, it overrides the configuration you created in the
Glue Editor (for example, for Python-only scripts, it changes the job to a
Glue Spark 3 script with no language defined).
A temporary workaround is to add the following to the task definition:
create_job_kwargs={"GlueVersion": "3.0", "DefaultArguments":
{"--job-language": "python"}}
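For reference, here is a minimal sketch of how that workaround might look in a task. Only the create_job_kwargs dict comes from the comment above; the operator import, task_id, and job name are assumptions based on the AWS provider's GlueJobOperator and will need adjusting to your setup:

```python
# Hedged sketch of the workaround: pass create_job_kwargs so the job that
# Airflow creates/updates keeps a Python-only (non-Spark) configuration
# instead of being rewritten as a Spark 3 job with no language defined.
create_job_kwargs = {
    "GlueVersion": "3.0",                              # pin the Glue version
    "DefaultArguments": {"--job-language": "python"},  # keep the job Python-only
}

# In a DAG this would be used roughly as follows (untested sketch; the
# operator path and all names besides create_job_kwargs are assumptions):
#
# from airflow.providers.amazon.aws.operators.glue import GlueJobOperator
#
# run_glue_job = GlueJobOperator(
#     task_id="run_glue_job",
#     job_name="my_existing_glue_job",   # hypothetical job name
#     create_job_kwargs=create_job_kwargs,
# )
```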
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]