magdagultekin opened a new issue, #29781:
URL: https://github.com/apache/airflow/issues/29781
### Apache Airflow Provider(s)
sftp
### Versions of Apache Airflow Providers
4.2.3
### Apache Airflow version
2.5.1
### Operating System
macOS Ventura 13.2.1
### Deployment
Astronomer
### Deployment details
_No response_
### What happened
I wanted to use `file_pattern` and `newer_than` in `SFTPSensor` to find only
the files that landed on the SFTP server after the data interval of the prior
successful DAG run (`{{ prev_data_interval_end_success }}`).
I have four text files (`file.txt`, `file1.txt`, `file2.txt` and
`file3.txt`) but only `file3.txt` has the last modification date after the data
interval of the prior successful DAG run. I use the following file pattern:
`"*.txt"`.
As soon as the first matched file (`file.txt`) failed the modification-date
check, the task was set to `up_for_reschedule` without the remaining matches
being checked.
### What you think should happen instead
The other files matching the pattern should be checked as well.
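A minimal sketch of the expected behavior, using a hypothetical `files_newer_than` helper and an in-memory directory listing in place of the provider's actual SFTP calls: every file matching the pattern should be tested against the `newer_than` cutoff, not just the first match.

```python
import fnmatch
from datetime import datetime, timezone

def files_newer_than(entries, pattern, newer_than):
    """Return every file matching `pattern` modified after `newer_than`.

    `entries` maps filename -> modification datetime; it is a hypothetical
    stand-in for an SFTP directory listing.
    """
    return [
        name
        for name, mtime in entries.items()
        if fnmatch.fnmatch(name, pattern) and mtime > newer_than
    ]

# Scenario from the report: four files, only file3.txt modified after the cutoff.
cutoff = datetime(2023, 2, 15, tzinfo=timezone.utc)
listing = {
    "file.txt": datetime(2023, 2, 1, tzinfo=timezone.utc),
    "file1.txt": datetime(2023, 2, 2, tzinfo=timezone.utc),
    "file2.txt": datetime(2023, 2, 3, tzinfo=timezone.utc),
    "file3.txt": datetime(2023, 2, 20, tzinfo=timezone.utc),
}
print(files_newer_than(listing, "*.txt", cutoff))  # → ['file3.txt']
```

With this logic the sensor would succeed because `file3.txt` satisfies both conditions, instead of rescheduling after the first non-qualifying match.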
### How to reproduce
```python
import pendulum

from airflow import DAG
from airflow.providers.sftp.sensors.sftp import SFTPSensor

with DAG(
    dag_id="sftp_test",
    start_date=pendulum.datetime(2023, 2, 1, tz="UTC"),
    schedule="@once",
    render_template_as_native_obj=True,
):
    wait_for_file = SFTPSensor(
        task_id="wait_for_file",
        sftp_conn_id="sftp_default",
        path="/upload/",
        file_pattern="*.txt",
        newer_than="{{ prev_data_interval_end_success }}",
    )
```
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)