moomindani commented on PR #63622:
URL: https://github.com/apache/airflow/pull/63622#issuecomment-4063106646
You're right, sorry about that. Here's the actual DAG I tested:
```python
from datetime import datetime

from airflow.models.dag import DAG
from airflow.providers.amazon.aws.operators.s3 import (
    S3CreateBucketOperator,
    S3DeleteBucketOperator,
)

ACCOUNT_ID = "xxxxxxxxxxxx"
REGION = "us-east-1"
BUCKET_NAME = f"airflow-dag-ns-test-{ACCOUNT_ID}-{REGION}-an"

with DAG(
    dag_id="test_s3_account_regional_namespace",
    start_date=datetime(2026, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    create_bucket = S3CreateBucketOperator(
        task_id="create_bucket",
        bucket_name=BUCKET_NAME,
        region_name=REGION,
        bucket_namespace="account-regional",
        aws_conn_id=None,
    )
    delete_bucket = S3DeleteBucketOperator(
        task_id="delete_bucket",
        bucket_name=BUCKET_NAME,
        force_delete=False,
        aws_conn_id=None,
    )
    create_bucket >> delete_bucket
```
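Since the bucket name interpolates the account ID and region, it's worth sanity-checking that the result stays within S3's general-purpose bucket naming rules (3-63 characters; lowercase letters, digits, dots, and hyphens; starting and ending with a letter or digit). A minimal sketch, not part of this PR — the helper name and the sample account ID are mine:

```python
import re

def is_valid_s3_bucket_name(name: str) -> bool:
    """Check the common S3 general-purpose bucket naming rules:
    3-63 chars, lowercase letters/digits/dots/hyphens, and the name
    must start and end with a letter or digit."""
    if not 3 <= len(name) <= 63:
        return False
    return re.fullmatch(r"[a-z0-9][a-z0-9.-]*[a-z0-9]", name) is not None

# Hypothetical 12-digit account ID standing in for the redacted one above.
ACCOUNT_ID = "123456789012"
REGION = "us-east-1"
BUCKET_NAME = f"airflow-dag-ns-test-{ACCOUNT_ID}-{REGION}-an"

assert is_valid_s3_bucket_name(BUCKET_NAME)
```

With a 12-digit account ID the name comes out at 45 characters, comfortably under the 63-character limit.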
I tested the full DAG against real S3 using `airflow dags test`:
```
$ airflow dags test test_s3_account_regional_namespace 2026-01-01
Created bucket with name: airflow-dag-ns-test-xxxxxxxxxxxx-us-east-1-an
Deleted bucket with name: airflow-dag-ns-test-xxxxxxxxxxxx-us-east-1-an
DagRun Finished: state=success
```
Note: Requires `botocore>=1.42.0` at runtime.
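Since an older botocore would only fail once the task actually calls S3, one option is to check the installed version up front. A sketch of a plain version comparison, assuming the `1.42.0` floor above; the helper name is mine, and pre-release suffixes are deliberately ignored:

```python
def meets_min_version(installed: str, required: str = "1.42.0") -> bool:
    """Compare dotted version strings numerically on their first three
    components (e.g. "1.42.0"); pre-release tags are not handled."""
    def parse(version: str) -> tuple[int, ...]:
        return tuple(int(part) for part in version.split(".")[:3])
    return parse(installed) >= parse(required)

# Example: fail fast if the running botocore is too old.
# import botocore
# if not meets_min_version(botocore.__version__):
#     raise RuntimeError("bucket_namespace support needs botocore>=1.42.0")
```

A simple tuple comparison avoids pulling in `packaging` just for this check; if you already depend on `packaging`, its `Version` class handles pre-releases correctly and would be the more robust choice.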
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]