qaziashikin opened a new pull request, #62240:
URL: https://github.com/apache/airflow/pull/62240

   
   ## Description
   
   This pull request adds a new AWS operator for managing notebook schedules within Amazon SageMaker Unified Studio, allowing users to specify frequencies, compute environment settings, and notebook parameters.
   
   [SageMaker Unified Studio](https://aws.amazon.com/sagemaker/unified-studio/) (SMUS) supports development of Airflow DAGs (called "workflows" within the product) that run on an MWAA cluster managed by the project. These workflows can orchestrate the execution of Unified Studio artifacts that connect to data assets stored in a SMUS project.
   
   The operator executes notebooks through the documented DataZone APIs, which also enable API-based or event-triggered execution outside of SageMaker Unified Studio. [TODO: Add links to these APIs]
   
   ## Components
   
   - `SageMakerUnifiedStudioNotebookOperator`: this operator allows users to 
execute Unified Studio artifacts within the context of their project
   - `SageMakerUnifiedStudioNotebookHook`: this hook provides a wrapper around 
the notebook execution API calls
   - `SageMakerUnifiedStudioNotebookSensor`: this sensor polls the status of the notebook execution until it reaches a terminal state
   - `SageMakerUnifiedStudioNotebookJobTrigger`: this trigger fires when the notebook execution completes, allowing the operator to run in deferrable mode
   
   ## Usage
   
   ```python
   # Import path assumed to match the provider's existing SageMaker Unified Studio
   # module; confirm against the files added in this PR.
   from airflow.providers.amazon.aws.operators.sagemaker_unified_studio import (
       SageMakerUnifiedStudioNotebookOperator,
   )

   run_notebook = SageMakerUnifiedStudioNotebookOperator(
       task_id="notebook-task",
       # The notebook asset identifier from within the SageMaker Unified Studio domain.
       notebook_id=notebook_id,
       domain_id=domain_id,
       project_id=project_id,
       client_token="unique-idempotency-token",  # optional
       notebook_parameters={
           "param1": "value1",
           "param2": "value2",
       },  # optional
       compute_configuration={"instance_type": "ml.m5.large"},  # optional
       timeout_configuration={"run_timeout_in_minutes": 1440},  # optional
       wait_for_completion=True,  # optional
       waiter_delay=30,  # optional
       deferrable=False,  # optional
   )
   ```
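
   For non-blocking DAGs, the operator can also hand off polling to the sensor. The sketch below is illustrative only: the sensor's module path and parameter names (`execution_id`, `domain_id`, `project_id`) are assumptions, and it presumes the operator pushes the execution id to XCom; check the files added in this PR for the actual signatures.

   ```python
   from airflow.providers.amazon.aws.operators.sagemaker_unified_studio import (
       SageMakerUnifiedStudioNotebookOperator,
   )

   # Hypothetical import path and parameters; see this PR's sensor implementation.
   from airflow.providers.amazon.aws.sensors.sagemaker_unified_studio import (
       SageMakerUnifiedStudioNotebookSensor,
   )

   # Start the notebook run without blocking the worker on completion.
   start_notebook = SageMakerUnifiedStudioNotebookOperator(
       task_id="start-notebook",
       notebook_id=notebook_id,
       domain_id=domain_id,
       project_id=project_id,
       wait_for_completion=False,  # return immediately; the sensor polls instead
   )

   # Poll the execution until it reaches a terminal state.
   wait_for_notebook = SageMakerUnifiedStudioNotebookSensor(
       task_id="wait-for-notebook",
       execution_id=start_notebook.output,  # assumes the operator returns the execution id
       domain_id=domain_id,
       project_id=project_id,
   )

   start_notebook >> wait_for_notebook
   ```

   Alternatively, setting `deferrable=True` on the operator releases the worker slot while `SageMakerUnifiedStudioNotebookJobTrigger` waits for the run to complete.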
   
   ## Testing
   
   ### Unit Tests
   
   ```
   breeze testing core-tests -p 3.11 -b postgres \
     providers/amazon/tests/provider_tests/amazon/aws/*/test_sagemaker_unified_studio_notebook.py
   ```

   ### System Tests
   
   Ensure a SageMaker Unified Studio domain and project are properly configured as indicated in the `example_sagemaker_unified_studio_notebook.py` file. Ensure that AWS credentials are populated and up to date, and then run:
   
   ```
   breeze shell --forward-credentials -b postgres -p 3.11
   ```
   
   Then, populate the `DOMAIN_ID`, `PROJECT_ID`, and `NOTEBOOK_ID` environment variables:
   
   ```
   export DOMAIN_ID="dzd-4sse0ajynr9skn"
   export PROJECT_ID="3ha87cu1y4urs7"
   export NOTEBOOK_ID="b8ipdh41dp2skn"
   ```
   
   Run the system test:
   
   ```
   pytest --system -v --setup-timeout=3600 --execution-timeout=3600 --teardown-timeout=3600 \
     providers/amazon/tests/system/amazon/aws/example_sagemaker_unified_studio_notebook.py
   ```
   
   ---
   
   ##### Was generative AI tooling used to co-author this PR?
   
   
   - [X] Yes (please specify the tool below)
   
   Generated-by: Claude Opus 4.6, following [the guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#gen-ai-assisted-contributions)
   
   ---
   
   * Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)**
 for more information. Note: commit author/co-author name and email in commits 
become permanently public when merged.
   * For fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   * When adding dependency, check compliance with the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   * For significant user-facing changes create newsfragment: 
`{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in 
[airflow-core/newsfragments](https://github.com/apache/airflow/tree/main/airflow-core/newsfragments).
   

