jscheffl commented on code in PR #42216:
URL: https://github.com/apache/airflow/pull/42216#discussion_r1760095770


##########
contributing-docs/17_adding_api_endpoints.rst:
##########
@@ -18,122 +18,78 @@
 Adding a New API Endpoint in Apache Airflow
 ===========================================
 
-This documentation outlines the steps required to add a new API endpoint in Apache Airflow. It includes defining the endpoint in the OpenAPI specification, implementing the logic, running pre-commit checks, and documenting the changes.
+This documentation outlines the steps required to add a new API endpoint in Apache Airflow. It includes implementing the logic, running pre-commit checks, and documenting the changes.
 
 **The outline for this document in GitHub is available at top-right corner button (with 3-dots and 3 lines).**
 
-Step 1: Define the Endpoint in ``v1.yaml``
-----------------------------------------
-1. Navigate to the ``v1.yaml`` file, which contains the OpenAPI specifications.
-2. Add a new path for your endpoint, specifying the URL path, HTTP method (GET, POST, etc.), and a brief summary.
-3. Define the parameters required for the endpoint, including types, required/optional status, and default values.
-4. Describe the responses, including status codes, content types, and schema details.
-
-Example:
-
-.. code-block:: yaml
-
-   paths:
-     /example/endpoint:
-       get:
-         summary: Example API endpoint
-         description: This endpoint provides an example response.
-         parameters:
-           - name: example_param
-             in: query
-             required: true
-             schema:
-               type: string
-         responses:
-           "200":
-             description: Successful response
-             content:
-               application/json:
-                 schema:
-                   type: object
-                   properties:
-                     message:
-                       type: string
-           "400":
-             $ref: "#/components/responses/BadRequest"
-           "404":
-             $ref: "#/components/responses/NotFound"
-
-
-Step 2: Implement the Endpoint Logic
+Step 1: Implement the Endpoint Logic
 ------------------------------------
-1. In the appropriate Python file, implement the endpoint's logic.
-2. Ensure proper parameter handling and validation.
-3. Implement the core logic, such as data retrieval or processing.
-4. Add error handling for potential issues like missing parameters or invalid data.
+1. Navigate to the ``views`` directory.
+2. Register a new route for your endpoint with the appropriate HTTP method, query params, permissions, body type, etc.
+3. Specify the appropriate Pydantic type in the return type annotation.
 
 Example:
 
 .. code-block:: python
 
-   @security.requires_access_dag("GET", DagAccessEntity.TASK_LOGS)
-   @provide_session
-   @unify_bucket_name_and_key
-   @provide_bucket_name
-   def get_example(
-       *,
-       example_param: str,
-       session: Session = NEW_SESSION,
-   ) -> APIResponse:
-       # Implementation details here
-       pass
+  @dags_router.get("/dags")  # permissions go in the dependencies parameter here
+  async def get_dags(
+      *,
+      limit: int = 100,
+      offset: int = 0,
+      tags: Annotated[list[str] | None, Query()] = None,
+      dag_id_pattern: str | None = None,
+      only_active: bool = True,
+      paused: bool | None = None,
+      order_by: str = "dag_id",
+      session: Annotated[Session, Depends(get_session)],
+  ) -> DAGCollectionResponse:
+      pass
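+
+For reference, routes like this are registered on a FastAPI ``APIRouter`` that is mounted on the application. The snippet below is a minimal sketch of that wiring; the router arguments and names are illustrative rather than Airflow's actual ones:
+
+.. code-block:: python
+
+    from fastapi import APIRouter, FastAPI
+
+    # Router grouping the DAG-related endpoints (name and tags are assumptions).
+    dags_router = APIRouter(tags=["DAG"])
+
+    app = FastAPI()
+    # Including the router attaches its routes to the application and to the
+    # generated OpenAPI documentation.
+    app.include_router(dags_router)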
+
+
+Step 2: Add Tests for Your Endpoints
+------------------------------------
+1. Verify manually with a local API that the endpoint behaves as expected.
+2. Go to the test folder and initialize new tests.
+3. Implement extensive tests to validate query params, permissions, error handling, etc. (see the sketch below).
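+
+A minimal sketch of such a test, assuming the suite provides a ``test_client`` fixture wrapping FastAPI's ``TestClient`` (the fixture name and the response keys checked here are assumptions):
+
+.. code-block:: python
+
+    def test_get_dags_respects_limit(test_client):
+        # Call the endpoint with an explicit limit and check both the status
+        # code and the shape of the response body.
+        response = test_client.get("/dags", params={"limit": 1})
+        assert response.status_code == 200
+        body = response.json()
+        assert len(body["dags"]) <= 1
+        assert "total_entries" in body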
+
+
+Step 3: Documentation
+---------------------
+Documentation is built automatically by FastAPI and our pre-commit hooks. Verify by going to ``/docs`` that the documentation is clear and appears as expected (body and return types, query params, validation).
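+
+One way to sanity-check the generated documentation is to assert that the new path appears in the OpenAPI schema; a sketch (the import path of ``app`` is hypothetical, adjust it to the project layout):
+
+.. code-block:: python
+
+    from fastapi.testclient import TestClient
+
+    from airflow.api_fastapi.app import app  # hypothetical import path
+
+    client = TestClient(app)
+    schema = client.get("/openapi.json").json()
+    # The new route should show up among the generated OpenAPI paths.
+    assert "/dags" in schema["paths"]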
 
 
-Step 3: Run Pre-commit Hooks
+Step 4: Run Pre-commit Hooks
 -----------------------------
 1. Ensure all code meets the project's quality standards by running pre-commit hooks.
 2. Pre-commit hooks include static code checks, formatting, and other validations.
-3. One specific pre-commit hook to note is the ``update-common-sql-api-stubs`` hook. This hook automatically updates the common SQL API stubs whenever it recognizes changes in the API. This ensures that any modifications to the API are accurately reflected in the stubs, maintaining consistency between the implementation and documentation.
+3. The persisted OpenAPI spec is located in ``v1-generated.yaml``; a pre-commit hook takes care of updating it based on your new endpoint, so you only need to stage and commit the change.
 4. Run the following command to execute all pre-commit hooks:
 
 .. code-block:: bash
 
    pre-commit run --all-files
 
 
-Optional: Adding Schemas
------------------------------
-In some cases, you may need to define additional schemas for new data structures. For example, if you are adding an endpoint that involves new data objects or collections, you may define a schema in a Python file. Here's an example:
+Optional: Adding Pydantic Model
+-------------------------------
+In some cases, you may need to define additional models for new data structures. For example, if you are adding an endpoint that involves new data objects or collections, you may define a model in a Python file. The model will be used to validate and serialize/deserialize objects. Here's an example:
 
 .. code-block:: python
 
-   class TaskLogPageSchema(Schema):
-       """Schema for task log pagination details."""
-
-       total_pages = fields.Int(description="Total number of pages for task logs.")
-       current_page = fields.Int(description="Current page number.")
-       page_size = fields.Int(description="Number of logs per page.")
-
-These schemas are defined to structure and validate the data handled by the API. Once defined, you can include these schemas in the OpenAPI specification file (e.g., v1.yaml) and reference them in the API endpoint definitions.
-
-For example, in v1.yaml, you might add:
-
-.. code-block:: yaml
+    from datetime import datetime
+
+    from pydantic import BaseModel
+
+
+    class DAGModelResponse(BaseModel):
+        """DAG serializer for responses."""
 
-   components:
-     schemas:
-       TaskLogPage:
-         type: object
-         properties:
-           total_pages:
-             type: integer
-             description: Total number of pages for task logs.
-           current_page:
-             type: integer
-             description: Current page number.
-           page_size:
-             type: integer
-             description: Number of logs per page.
+        dag_id: str
+        dag_display_name: str
+        is_paused: bool
+        is_active: bool
+        last_parsed_time: datetime | None
 
-Including schemas helps in automatically generating API documentation and ensures consistent data structures across the API.
+These models are defined to structure and validate the data handled by the API. Once defined, they are automatically added to the OpenAPI spec file as long as they are actually used by at least one endpoint.
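+
+For instance, the ``DAGCollectionResponse`` used as the return type in Step 1 could be a thin wrapper around the model above, along these lines (a sketch; the exact fields are assumptions, check the actual source):
+
+.. code-block:: python
+
+    class DAGCollectionResponse(BaseModel):
+        """DAG collection serializer for responses."""
+
+        dags: list[DAGModelResponse]
+        total_entries: int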
 
-After adding or modifying schemas, make sure to run the pre-commit hooks again to update any generated files.
+After adding or modifying Pydantic models, make sure to run the pre-commit hooks again to update any generated files.
 
-Note on the New FastAPI API
----------------------------
+Situational: Legacy Endpoint Migration

Review Comment:
   If we migrate the public API, should we also make sure that existing pytests 
are working 1:1 with the new endpoint?


