sjyangkevin opened a new pull request, #60680:
URL: https://github.com/apache/airflow/pull/60680

   <!--
   Thank you for contributing!
   
   Please provide above a brief description of the changes made in this pull 
request.
   Write a good git commit message following this guide: 
http://chris.beams.io/posts/git-commit/
   
   Please make sure that your code changes are covered with tests.
   And in case of new features or big changes remember to adjust the 
documentation.
   
   Feel free to ping (in general) for the review if you do not see reaction for 
a few days
   (72 Hours is the minimum reaction time you can expect from volunteers) - we 
sometimes miss notifications.
   
   In case of an existing issue, reference it using one of the following:
   
   * closes: #ISSUE
   * related: #ISSUE
   -->
   
   closes: #58381 
   
   ### Summary
   
   The audit log does not seem to capture events for task lifecycle state transitions such as `running` and `success`. According to the [documentation](https://airflow.apache.org/docs/apache-airflow/stable/security/audit_logs.html#scope-of-audit-logging), it should capture system-generated events including "task lifecycle state transitions (queued, running, success, failed)". As noted in the issue, task-level `running` and `success` events were logged in Airflow 2.10.x.
   
   In Airflow 3, the task state is set to `running` through the execution API endpoint `/{task_instance_id}/run` (`ti_run`), and subsequent task state updates are handled by the endpoint `/{task_instance_id}/state` (`ti_update_state`). The logic to insert an audit log entry appears to be missing when these endpoints update the task instance state.
   
   To log the state transition events, we need to write log entries to the database while these two functions process the state update. However, if we also want to cover the `queued` event, similar logic needs to be added to the function below:
   
https://github.com/apache/airflow/blob/06fbabb26e826fe2fc79399333b8bff503b458c1/airflow-core/src/airflow/jobs/scheduler_job_runner.py#L412-L426
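
   For the `queued` event, the addition would look roughly like the sketch below. This is only a sketch: the helper name is hypothetical, `queued_tis` is a placeholder for the iterable of `TaskInstance` objects the linked scheduler code path is about to hand to the executor, and it assumes the `Log` model still accepts a `task_instance` argument the way it does in Airflow 2.

```python
from collections.abc import Iterable

from sqlalchemy.orm import Session

from airflow.models.log import Log
from airflow.models.taskinstance import TaskInstance
from airflow.utils.state import TaskInstanceState


def _log_queued_tis(session: Session, queued_tis: Iterable[TaskInstance]) -> None:
    """Sketch: record an audit log row for each TI the scheduler is about to queue."""
    for ti in queued_tis:
        # Log(...) copies dag_id/task_id/run_id/map_index/try_number from the TI; the
        # owner could come from ti.task when the serialized task is attached.
        session.add(Log(event=TaskInstanceState.QUEUED, task_instance=ti))
```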
   
   ### Current (Basic) Approach
   
   Since both `ti_run` and `ti_update_state` already execute a SELECT query at the start to fetch task metadata, the audit log entry is constructed through a `TaskInstanceKey` by copying the data from the query result together with the updated state. Because the insert is done through the same `session`, if I understand correctly, the operation is atomic and keeps the actual task state and the audit log consistent.
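
   Concretely, the insert in `ti_update_state` looks roughly like the sketch below. The helper name and argument names are placeholders mirroring what the existing SELECT returns, and it assumes the `Log` model can consume a `TaskInstanceKey` through its `task_instance` argument; if it cannot, the individual columns would be set explicitly instead.

```python
from sqlalchemy.orm import Session

from airflow.models.log import Log
from airflow.models.taskinstancekey import TaskInstanceKey


def _log_ti_state_transition(
    session: Session,
    *,
    dag_id: str,
    task_id: str,
    run_id: str,
    try_number: int,
    map_index: int,
    new_state: str,
) -> None:
    """Sketch: insert an audit log row for a state transition inside ti_update_state.

    The arguments mirror what the existing SELECT in the endpoint already returns.
    """
    ti_key = TaskInstanceKey(
        dag_id=dag_id,
        task_id=task_id,
        run_id=run_id,
        try_number=try_number,
        map_index=map_index,
    )
    # Added to the same `session` as the state UPDATE, so the audit log entry and the
    # state change are committed (or rolled back) in the same transaction.
    session.add(Log(event=new_state, task_instance=ti_key))
```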
   
   ### Some Issues
   
   1. There are some inconsistencies in the audit logs between Airflow 2 and Airflow 3 when implemented this way. For example, the `logical_date` field is empty; in `ti_run` it could probably be collected from the DAG run query, but in `ti_update_state` an extra query is required. The `owner` field is also empty unless it is explicitly passed, and the `extra` field needs to be constructed explicitly, yet only `hostname` is available.
   2. In the scheduler job, a `TaskInstance` object is available that carries information such as `owner` and `logical_date`. This is more ideal, since the resulting log is closer to Airflow 2's audit log, although the `extra` field still needs to be constructed. Making the endpoint logging match what the scheduler can log requires extra queries to gather that information.
   
   The logging behavior currently differs depending on where the insert is implemented. I am thinking about a way to make this implementation more unified and to ensure consistency between the actual task state and the log entry; a rough sketch of one option follows.
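
   One option would be a small shared helper (name and signature below are hypothetical) that both execution API endpoints and the scheduler call, so the columns and the `extra` payload are built in a single place. The column names mirror the Airflow 2.10 log table and may need adjusting against the current `Log` model:

```python
from __future__ import annotations

import json

from sqlalchemy.orm import Session

from airflow.models.log import Log


def _record_ti_state_change(
    session: Session,
    *,
    state: str,
    dag_id: str,
    task_id: str,
    run_id: str,
    map_index: int = -1,
    try_number: int | None = None,
    owner: str | None = None,
    logical_date=None,
    **extra_fields,
) -> None:
    """Hypothetical shared helper: insert one audit log row for a TI state transition.

    It is called with the caller's own session so that the audit row commits
    (or rolls back) together with the state change itself.
    """
    log = Log(
        event=state,
        owner=owner,
        extra=json.dumps(extra_fields) if extra_fields else None,
    )
    # Set the mapped columns directly; callers pass whatever they already have
    # (the endpoints pass the SELECT result, the scheduler passes TI attributes).
    log.dag_id = dag_id
    log.task_id = task_id
    log.run_id = run_id
    log.map_index = map_index
    log.try_number = try_number
    log.logical_date = logical_date
    session.add(log)
```

   The endpoints would call it with the fields returned by their existing SELECT, and the scheduler would call it with the attributes already on the `TaskInstance`, which keeps the `owner` and `extra` handling in one place.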
   
   ---
   
   ##### Was generative AI tooling used to co-author this PR?
   
   <!--
   If generative AI tooling has been used in the process of authoring this PR, 
please
   change below checkbox to `[X]` followed by the name of the tool, uncomment 
the "Generated-by".
   -->
   
   - [ ] Yes (please specify the tool below)
   
   <!--
   Generated-by: [Tool Name] following [the 
guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#gen-ai-assisted-contributions)
   -->
   
   ---
   
   * Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)**
 for more information. Note: commit author/co-author name and email in commits 
become permanently public when merged.
   * For fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   * When adding dependency, check compliance with the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   * For significant user-facing changes create newsfragment: 
`{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in 
[airflow-core/newsfragments](https://github.com/apache/airflow/tree/main/airflow-core/newsfragments).
   

