TJaniF commented on issue #29238:
URL: https://github.com/apache/airflow/issues/29238#issuecomment-1408849155

   Hi @Taragolis 
   
   Yes, using the `S3Hook` directly, as in the "Mixing TaskFlow and classic operators" example, works as expected and creates 6 separate files:
   
   ```python
   from airflow.decorators import dag, task_group, task
   from airflow.providers.amazon.aws.hooks.s3 import S3Hook
   from pendulum import datetime
   import io
   
   MY_S3_BUCKET = "s3://mytxtbucket"
   AWS_CONN_ID = "aws_conn"
   
   @dag(
       start_date=datetime(2022, 12, 1),
       schedule=None,
       catchup=False,
   )
   def mixed():
   
       @task_group()
       def create_s3_files(num):
   
           @task
           def create_file(aws_conn_id, bucket, num):
               hook = S3Hook(aws_conn_id=aws_conn_id)
   
               return hook.load_file_obj(
                   file_obj=io.BytesIO(b"something"),
                   key=f"{bucket}/x/{num}.txt",
               )

           create_file("aws_conn", MY_S3_BUCKET, num)
   
       tg_object = create_s3_files.expand(num=[0,1,2,3,4,5])
   
   mixed()
   ```
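   
   A side note on why the `key` above works: as far as I can tell, `S3Hook.load_file_obj` also accepts a full `s3://` URL as the `key` and parses the bucket name out of it, which is why baking `MY_S3_BUCKET` into the key string is fine. A minimal sketch of the more explicit equivalent (bucket and connection names reused from the DAG above, so treat them as placeholders):
   
   ```python
   import io
   
   from airflow.providers.amazon.aws.hooks.s3 import S3Hook
   
   # Sketch: the same upload, but with an explicit bucket_name instead of a
   # full s3:// URL baked into the key.
   hook = S3Hook(aws_conn_id="aws_conn")
   hook.load_file_obj(
       file_obj=io.BytesIO(b"something"),
       key="x/0.txt",               # object key relative to the bucket
       bucket_name="mytxtbucket",   # bucket name, without the s3:// prefix
       replace=True,                # overwrite the object if it already exists
   )
   ```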
   
   Gives me:
   
   <img width="1233" alt="Screenshot 2023-01-30 at 16 40 27" src="https://user-images.githubusercontent.com/90063506/215522862-d1b0ab2e-48fe-468f-afdf-fdd0c86bd50d.png">
   

