potiuk commented on a change in pull request #7688: [AIRFLOW-6794] Allow AWS Operator RedshiftToS3Transfer To Run a Custom Query
URL: https://github.com/apache/airflow/pull/7688#discussion_r390751106
 
 

 ##########
 File path: UPDATING.md
 ##########
 @@ -1047,6 +1047,20 @@ If the DAG relies on tasks with other trigger rules (i.e. `all_done`) being skip
 
 The goal of this change is to achieve a more consistent and configurable cascading behaviour based on the `BaseBranchOperator` (see [AIRFLOW-2923](https://jira.apache.org/jira/browse/AIRFLOW-2923) and [AIRFLOW-1784](https://jira.apache.org/jira/browse/AIRFLOW-1784)).
 
+
+### `RedshiftToS3Transfer` signature changed
+
+Previous versions of the `RedshiftToS3Transfer` operator required `schema` and `table` as the first two
+positional arguments. This signature was changed in 2.0 so that
+`s3_bucket` and `s3_key` are now the first two positional arguments.
+
+To use this operator:
+```python
+result = RedshiftToS3Transfer('schema', 'table')  # Pre-2.0 call
+...
+result = RedshiftToS3Transfer('s3_bucket', 's3_key')  # Post-2.0 call
+```
 Review comment:
   I think the examples are not correct and they might be misleading: the first one should also have `s3_bucket` and `s3_key` (they were not optional), and the second example is not valid because it requires either the `schema` + `table` combination or a custom query.
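For reference, a minimal sketch of what corrected examples might look like, following the comment above. The pre-2.0 call passes `schema`, `table`, `s3_bucket`, and `s3_key` positionally; in the post-2.0 call, `schema` and `table` are passed as keywords, since their exact positional order after `s3_bucket` and `s3_key` is an assumption here, and the custom-query alternative is only noted in a comment because its parameter name is not given in this thread:

```python
# Sketch only: argument order beyond s3_bucket/s3_key coming first is assumed.
# A real DAG task would also need the usual operator kwargs (e.g. task_id).
result = RedshiftToS3Transfer('schema', 'table', 's3_bucket', 's3_key')  # Pre-2.0 call
...
# Post-2.0: s3_bucket and s3_key come first; schema + table (or the custom
# query this PR introduces) must still be supplied.
result = RedshiftToS3Transfer('s3_bucket', 's3_key', schema='schema', table='table')
```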
