frankbreetz opened a new issue #20725: URL: https://github.com/apache/airflow/issues/20725
### Apache Airflow version

2.2.2

### What happened

The following REST call returns a body with 100 task instances, the default for the `page_limit`/`limit` value (I am not sure which name is correct; the documentation mentions both, and neither causes an error):

```python
import requests
import json

url = "http://localhost:8080/api/v1/dags/~/dagRuns/~/taskInstances"

payload = json.dumps({
    "page_limit": 1,
    "limit": 1
})
headers = {
    'Authorization': 'Basic XXXXXXXXX',
    'Content-Type': 'application/json',
    'Cookie': 'session=XXXXXXX'
}

response = requests.request("GET", url, headers=headers, data=payload)

print(response.text)
```

### What you expected to happen

A single task instance entry should be returned in the response.

### How to reproduce

Run the script above and inspect the results (your DB should have 100+ task instances). You will get 100 task instances in the response when you should get just 1.

### Operating System

Debian GNU/Linux 11 (bullseye)

### Versions of Apache Airflow Providers

_No response_

### Deployment

Astronomer

### Deployment details

_No response_

### Anything else

Perhaps the best fix would be to remove this REST call and add `limit`/`offset` to the batch POST call instead: https://airflow.apache.org/docs/apache-airflow/stable/stable-rest-api-ref.html#operation/get_task_instances_batch. Fetching all task instances in a single REST call will overload some systems, so `limit`/`offset` lets clients load them incrementally.

### Are you willing to submit PR?

- [X] Yes I am willing to submit a PR!

### Code of Conduct

- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
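For comparison, a likely cause of the observed behavior is that the script above sends the pagination values in a JSON request body, while this GET endpoint reads `limit` and `offset` from the query string (per the stable REST API reference). A minimal sketch of the difference, using `requests.Request(...).prepare()` so no live Airflow server is needed (the localhost URL is the one from the report; credentials are omitted):

```python
import requests

# Build (but do not send) the request, to show where pagination belongs.
# For GET /dags/~/dagRuns/~/taskInstances, limit/offset are query
# parameters, not body fields, so they must go in `params`, not `data`.
req = requests.Request(
    "GET",
    "http://localhost:8080/api/v1/dags/~/dagRuns/~/taskInstances",
    params={"limit": 1, "offset": 0},
).prepare()

# The prepared URL now carries the pagination in the query string,
# e.g. ...?limit=1&offset=0, which is what the server actually parses.
print(req.url)
```

If the server honors `limit=1` here but ignores the JSON-body version, the bug report may reduce to a documentation gap rather than a broken endpoint, though that is a guess on my part.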
