raynorelyp commented on issue #56689:
URL: https://github.com/apache/airflow/issues/56689#issuecomment-3423750434

   This is sample code of what I'm doing. Pardon any typos, since I'm 
writing this by hand on my personal computer. The important part for 
reproducing this bug is to make sure there is no AWS credentials file and 
no AWS environment variables are set.
   
   ```python
   from airflow.decorators import dag
   from airflow.providers.amazon.aws.operators.eks import EksPodOperator
   
   @dag(
     dag_id="test"
   )
   def generate_dag():
     run_thing = EksPodOperator(
       task_id="run_thing",
       region="us-east-1",  # not what the docs say, but what the actual param is
       namespace="test-namespace",
       cluster_name="test-cluster",
       pod_name="test-pod",
       service_account_name="test-san",
       image="public.ecr.aws/aws-cli/aws-cli:latest",
       env_vars={"TEST": "test"},
       cmds=["sh", "-c", "echo 'hello'"],
       on_finish_action="delete_succeeded_pod",
       get_logs=True,
       do_xcom_push=True,
       aws_conn_id="test_id",  # set this up in the Connections settings page; does not work unless you add environment variables
       in_cluster=False
     )
   
   generate_dag()
   ```
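   For context, the workaround referenced in the `aws_conn_id` comment above is to export the standard AWS credential environment variables in the worker environment so boto3's default credential chain can find them. A minimal sketch of that workaround (all values below are placeholders, not real credentials):

   ```python
   import os

   # Workaround sketch: put the standard AWS credential variables into the
   # worker environment. boto3 (used by the Amazon provider under the hood)
   # picks these up automatically via its default credential chain.
   # Placeholder values only -- substitute your own credentials.
   os.environ["AWS_ACCESS_KEY_ID"] = "AKIAEXAMPLE"
   os.environ["AWS_SECRET_ACCESS_KEY"] = "example-secret-key"
   os.environ["AWS_DEFAULT_REGION"] = "us-east-1"
   ```

   With these set, the operator runs even though the `aws_conn_id` connection alone should have been sufficient, which is the behavior this issue is about.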

