o-nikolas commented on code in PR #22758:
URL: https://github.com/apache/airflow/pull/22758#discussion_r849992018
##########
airflow/providers/amazon/aws/operators/s3.py:
##########
@@ -318,6 +318,94 @@ def execute(self, context: 'Context'):
)
+class S3CreateObjectOperator(BaseOperator):
+ """
+ Creates a new object from a given string or bytes.
Review Comment:
I'm not sure if you saw my reply @eladkal, but in those three cases I
linked, such an HTTP-to-S3 operator would not help.
Another example, not of the hardcoded variety, that I've run into
recently was with the Step Functions operators. There is an operator to run a
Step Functions workflow and another to fetch the JSON result it returns; it
would be convenient to have an S3 operator that consumes that output from XCom
and writes it to S3, instead of having to use the PythonOperator and write
boilerplate code that calls the hook directly to create the object, or create
a tempfile and use the local-to-S3 operator.
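For concreteness, here is roughly the boilerplate I mean (a minimal sketch; the DAG, task ids, bucket, and key are all placeholders), followed by what the same task could look like with the operator from this PR (those parameter names are just my guess, not the final API):
```python
import json

import pendulum
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

with DAG(
    dag_id="sfn_output_to_s3_example",  # hypothetical example DAG
    start_date=pendulum.datetime(2022, 1, 1, tz="UTC"),
    schedule_interval=None,
    catchup=False,
) as dag:

    def _write_sfn_output_to_s3(ti):
        # Pull whatever StepFunctionGetExecutionOutputOperator pushed to XCom
        # ("get_execution_output" is a placeholder task id).
        execution_output = ti.xcom_pull(task_ids="get_execution_output")
        # Call the hook directly just to write one object to S3.
        S3Hook().load_string(
            string_data=json.dumps(execution_output),
            key="step_functions/output.json",  # placeholder key
            bucket_name="my-example-bucket",   # placeholder bucket
            replace=True,
        )

    write_output = PythonOperator(
        task_id="write_output_to_s3",
        python_callable=_write_sfn_output_to_s3,
    )

    # With the operator proposed in this PR, the same task could collapse to
    # something like the following (parameter names are my assumption only):
    #
    # write_output = S3CreateObjectOperator(
    #     task_id="write_output_to_s3",
    #     s3_bucket="my-example-bucket",
    #     s3_key="step_functions/output.json",
    #     data="{{ ti.xcom_pull(task_ids='get_execution_output') }}",
    # )
```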
I'm not sure what the grave concern is about adding this operator, but I'd
sure like to have it, and I would make use of it.
Maybe we can flip the question around and hear what concerns, risks,
oversights, and pitfalls you see with including this operator? Thanks!