[ https://issues.apache.org/jira/browse/AIRFLOW-5115?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16973822#comment-16973822 ]

Juan Ramos Fuentes commented on AIRFLOW-5115:
---------------------------------------------

After testing locally, I have confirmed that moving the input validation logic 
to the `poke` method solves the issue. Do you have a PR open for this 
[~dsynkov]? Let me know if I can help in any way. This will unblock me and my 
team.
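
For reference, here is a rough sketch of the kind of change being discussed: the 
validation moves out of the constructor and into `poke`, which only runs after 
`template_fields` have been rendered. The class name and the exact parsing logic 
below are illustrative only, not the actual patch:

{code:python}
# Sketch only (not the actual PR): validation is deferred from __init__,
# where a templated bucket_key is still an unrendered Jinja string, to
# poke(), which runs after template rendering.
from urllib.parse import urlparse

from airflow.exceptions import AirflowException
# Import path differs between Airflow versions; 1.9 ships sensors in
# airflow.operators.sensors.
from airflow.operators.sensors import BaseSensorOperator


class S3KeySensorSketch(BaseSensorOperator):
    template_fields = ('bucket_key', 'bucket_name')

    def __init__(self, bucket_key, bucket_name=None, **kwargs):
        super().__init__(**kwargs)
        # No validation here: bucket_key may still be "{{ var.value.my_s3_key }}".
        self.bucket_key = bucket_key
        self.bucket_name = bucket_name

    def poke(self, context):
        # template_fields are rendered before poke() is called, so the
        # s3:// check is now safe even for Jinja-templated values.
        if self.bucket_name is None:
            parsed_url = urlparse(self.bucket_key)
            if parsed_url.scheme != 's3' or not parsed_url.netloc:
                raise AirflowException('Please provide a bucket_name')
            self.bucket_name = parsed_url.netloc
            self.bucket_key = parsed_url.path.lstrip('/')
        # ...the existing S3Hook lookup for the key would follow here...
        return True
{code}

The important bit is only that the check happens after rendering; the real PR 
may split it into a helper or structure the parsing differently.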

> S3KeySensor template_fields for bucket_name & bucket_key do not support Jinja 
> variables
> ---------------------------------------------------------------------------------------
>
>                 Key: AIRFLOW-5115
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-5115
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: aws
>    Affects Versions: 1.9.0
>            Reporter: Dmitriy Synkov
>            Assignee: Dmitriy Synkov
>            Priority: Minor
>              Labels: easyfix, patch
>             Fix For: 2.0.0
>
>   Original Estimate: 4h
>  Remaining Estimate: 4h
>
> In all Airflow operators (which inherit from {{BaseOperator}}) there is a 
> {{template_fields}} attribute defined as ["which fields will get 
> jinjafied"|https://github.com/apache/airflow/blob/master/airflow/models/baseoperator.py#L218-L219].
> For the {{S3KeySensor}} specifically, these are {{template_fields = 
> ('bucket_key', 'bucket_name')}}.
> The {{bucket_key}} kwarg, however, has some input validation: when no 
> {{bucket_name}} is supplied, {{bucket_key}} must be a full URL starting with 
> the S3 scheme {{s3://}}, and this check is performed in the 
> [constructor|https://github.com/apache/airflow/blob/master/airflow/sensors/s3_key_sensor.py#L71-L74].
> This makes it impossible to pass a Jinja string as {{bucket_key}}, since 
> templates are not rendered when the DAG {{*.py}} file itself is parsed, only 
> later at task execution. Below is an example; I'm using Airflow 1.9.0 with 
> Python 3.5.3:
> Given the DAG code below, where the Airflow Variable {{my_s3_key}} is set to 
> {{s3://bucket/prefix/object.txt}}:
> {code:python}
> from datetime import datetime
>
> from airflow import DAG
> from airflow.models import Variable
> from airflow.operators.sensors import S3KeySensor  # import path as of Airflow 1.9
>
> dag = DAG('sample_dag', start_date=datetime(2019, 8, 1, 12, 15))
>
> # Works: the Variable is resolved while the DAG file is parsed.
> s3_variable_sensor = S3KeySensor(
>     task_id='s3_variable_sensor',
>     bucket_key=Variable.get('my_s3_key'),
>     dag=dag
> )
>
> # Fails: the Jinja string is still unrendered when the constructor runs.
> s3_jinja_sensor = S3KeySensor(
>     task_id='s3_jinja_sensor',
>     bucket_key="{{ var.value.my_s3_key }}",
>     dag=dag
> )
> {code}
> The first task will run just fine, while the second throws the following 
> exception as soon as the sensor is constructed:
> {code:java}
> airflow.exceptions.AirflowException: Please provide a bucket_name.
> {code}
> This ticket proposes moving the input validation out of the constructor so 
> that Jinja-templated strings can be passed to both {{bucket_name}} and 
> {{bucket_key}}.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
