amoghrajesh opened a new pull request, #60826:
URL: https://github.com/apache/airflow/pull/60826

   
   ---
   
   ##### Was generative AI tooling used to co-author this PR?
   
   
   - [x] Yes (please specify the tool below)
   Used Cursor with Claude Sonnet 4.5, mostly for tests
   
   ---
   
   ### Problem
   
   task-sdk had several imports from airflow-core for remote logging, for instance:
   - `from airflow.logging_config import RemoteLogIO, get_remote_task_log, get_default_remote_conn_id`
   - `from airflow.configuration import conf` (used at least 4 times in `log.py`)

   This coupling makes the client-server separation difficult, since the task sdk shouldn't import from core airflow.
   
   ### Proposal
   
   I am adding utilities containing the remote logging protocols and discovery logic to the existing shared library (`logging`). Both core and sdk now use this shared code, but each injects its own configuration source (`conf`).
   
   For context on what's being moved:
   
   - The `RemoteLogIO` and `RemoteLogStreamIO` protocols define the interface 
that S3, GCS, CloudWatch and other remote log handlers implement. These 
protocols are now in the shared library where both core and sdk (and providers) 
can reference them.
   
   - The discovery logic responsible for locating and loading the remote log handler from the logging config module is extracted into a helper `discover_remote_log_handler()` function. This function takes the config paths and import function as parameters, so core and sdk can each inject their own config. A rough sketch of both pieces follows this list.
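
   For illustration, here is a minimal sketch of the shape these two pieces take (showing just `RemoteLogIO`; the method names, parameters, and the `REMOTE_TASK_LOG` attribute are illustrative assumptions, not the exact code in this PR):

   ```python
   from __future__ import annotations

   from typing import Any, Callable, Protocol


   class RemoteLogIO(Protocol):
       """Interface that S3, GCS, CloudWatch and other remote handlers implement."""

       def upload(self, path: str, ti: Any) -> None:
           """Push a local task log file to remote storage."""

       def read(self, relative_path: str, ti: Any) -> Any:
           """Fetch task log content back from remote storage."""


   def discover_remote_log_handler(
       logging_config_path: str,
       import_fn: Callable[[str], Any],
   ) -> RemoteLogIO | None:
       """Load the remote log handler from the configured logging module.

       The caller supplies the config path and the import function, which is
       what lets core and the sdk each inject their own configuration source.
       """
       module = import_fn(logging_config_path)
       return getattr(module, "REMOTE_TASK_LOG", None)
   ```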
   
   ### Summary of changes
   
   In Airflow core, `airflow/logging/remote.py` is now just a backcompat shim that re-exports from the shared library. The `logging_config.py` module uses the shared discovery function instead of its earlier logic.
   
   For task-sdk, `sdk/log.py` gets its own `_ActiveLoggingConfig` class and discovery logic exports. All imports from core are replaced with sdk or shared imports, roughly as sketched below. The earlier TODOs are removed too.
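
   To make the injection point concrete, usage on each side would look roughly like this (the config key and import paths here are illustrative, not the exact code):

   ```python
   # In airflow-core (sketch): inject core's conf and its import helper.
   from airflow.configuration import conf
   from airflow.utils.module_loading import import_string

   handler = discover_remote_log_handler(
       conf.get("logging", "logging_config_class"),
       import_string,
   )

   # In task-sdk (sketch): the same shared helper, fed from the sdk's own
   # configuration source instead of airflow-core's conf.
   ```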
   
   
   ### Impact on remote logging
   
   No breaking changes.

   - Provider remote log handlers work unchanged: same configuration mechanism, same connection ID options.
   - The difference is internal - task processes now use the sdk's config instead of core's.
   
   ### For confidence, the entire testing flow is here:
   
   1. Run breeze with localstack integration after setting these env vars:
   
   ```shell
   export AIRFLOW__LOGGING__REMOTE_LOGGING=true
   export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=s3://test-airflow-logs
   export AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID=aws_default
   export AIRFLOW__LOGGING__DELETE_LOCAL_LOGS=false
   export AIRFLOW_CONN_AWS_DEFAULT='aws://test:test@?endpoint_url=http://localstack:4566&region_name=us-east-1'
   ```
   
   `breeze start-airflow --integration localstack`
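
   (If the bucket doesn't exist yet in LocalStack, it may need to be created first, e.g. with `awslocal s3 mb s3://test-airflow-logs` inside the localstack container.)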
   
   2. Create a simple dag like this one:
   ```python
   from datetime import datetime
   from airflow.sdk import DAG
   from airflow.providers.standard.operators.python import PythonOperator
   
   
   def test_logging():
       import logging
       logger = logging.getLogger(__name__)
   
       logger.info("Testing remote logging with LocalStack")
       logger.warning("This should appear in S3")
   
       for i in range(5):
           logger.info(f"Log message {i}")
   
       print("Print statement test")
   
   
   with DAG(
       "test_remote_logging",
       start_date=datetime(2024, 1, 1),
       schedule=None,
       catchup=False,
   ) as dag:
       PythonOperator(
           task_id="log_test",
           python_callable=test_logging,
       )
   
   ```
   
   3. Trigger the dag and observe the logs:
   <img width="2559" height="1189" alt="image" src="https://github.com/user-attachments/assets/f94b9896-7490-47ca-8bcb-3480c8d70507" />
   
   4. Check the localstack logs for requests:
   
   <img width="1722" height="1018" alt="image" 
src="https://github.com/user-attachments/assets/d8de3dc4-57f1-422e-b0ee-8a49eea3131d";
 />
   
   
   5. Check the logs on the localstack container using `awslocal` too:
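
   For example, something like `awslocal s3 ls s3://test-airflow-logs --recursive` run inside the container should list the uploaded task log objects.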
   
   <img width="1722" height="1018" alt="image" 
src="https://github.com/user-attachments/assets/95e249b5-3823-4396-a719-f6a7ff7ce694";
 />
   
   
   
   

