Lee-W commented on code in PR #58702:
URL: https://github.com/apache/airflow/pull/58702#discussion_r2563048840


##########
contributing-docs/MPROCS_QUICK_REFERENCE.md:
##########
@@ -0,0 +1,99 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements.  See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership.  The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied.  See the License for the
+ specific language governing permissions and limitations
+ under the License.
+ -->
+
+# Quick Reference: mprocs Support in Breeze
+
+## Basic Command
+
+```bash
+breeze start-airflow --use-mprocs
+```
+
+## Common Usage Patterns
+
+| Command                                                       | Description                     |
+|---------------------------------------------------------------|---------------------------------|
+| `breeze start-airflow --use-mprocs`                           | Start Airflow with mprocs       |
+| `breeze start-airflow --use-mprocs --debug scheduler`         | Debug scheduler with mprocs     |
+| `breeze start-airflow --use-mprocs --executor CeleryExecutor` | Use mprocs with Celery          |
+| `breeze start-airflow --use-mprocs --dev-mode`                | Use mprocs in dev mode          |
+| `breeze start-airflow --use-tmux`                             | Explicitly use tmux (default)   |
+
+
+## mprocs Keyboard Shortcuts
+
+| Key     | Action                       |
+|---------|------------------------------|
+| `↑↓`    | Navigate between processes   |
+| `Space` | Show/hide process output     |
+| `r`     | Restart selected process     |
+| `k`     | Kill selected process        |
+| `s`     | Start selected process       |
+| `a`     | Toggle showing all processes |
+| `q`     | Quit mprocs                  |
+| `?`     | Show help                    |
+
+## Components Managed
+
+- **scheduler** - Airflow scheduler
+- **api_server** (3.x+) / **webserver** (2.x) - Web interface
+- **triggerer** - Handles deferred tasks
+- **dag_processor** - Standalone DAG processor (when enabled)
+- **celery_worker** - Celery worker (with CeleryExecutor)
+- **flower** - Celery monitoring (when enabled)
+- **edge_worker** - Edge worker (with EdgeExecutor)
+
+## Environment Variables
+
+| Variable                   | Purpose                         |
+|----------------------------|---------------------------------|
+| `USE_MPROCS`               | Enable mprocs mode              |
+| `INTEGRATION_CELERY`       | Enable Celery components        |
+| `CELERY_FLOWER`            | Enable Flower UI                |
+| `STANDALONE_DAG_PROCESSOR` | Enable standalone DAG processor |
+| `BREEZE_DEBUG_*`           | Enable component debugging      |
+| `DEV_MODE`                 | Enable development mode         |
+
+## Debug Ports (when debugging enabled)
+
+| Component     | Port   |
+|---------------|--------|
+| Scheduler     | 50231  |
+| DAG Processor | 50232  |

Review Comment:
   ```suggestion
   | Dag Processor | 50232  |
   ```
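
The environment variable and debug port tables in the quoted quick reference follow one pattern, which the generator script later in this review implements: a `BREEZE_DEBUG_<COMPONENT>` flag wraps that component's command in debugpy on a configurable port. A minimal sketch of the pattern for the scheduler (not part of the PR diff; the default port mirrors the one visible in `generate_mprocs_config.py`):

```python
import os


def scheduler_command() -> str:
    """Build the scheduler command, wrapping it in debugpy when debugging is enabled."""
    if os.environ.get("BREEZE_DEBUG_SCHEDULER", "false").lower() == "true":
        # Default mirrors BREEZE_DEBUG_SCHEDULER_PORT's fallback in generate_mprocs_config.py.
        port = os.environ.get("BREEZE_DEBUG_SCHEDULER_PORT", "5678")
        return f"debugpy --listen 0.0.0.0:{port} --wait-for-client -m airflow scheduler"
    return "airflow scheduler"
```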



##########
scripts/mprocs/basic.yaml:
##########
@@ -0,0 +1,54 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+# Basic mprocs configuration for Airflow
+# This is a static example configuration. For dynamic configuration based on
+# your environment, use the generate_mprocs_config.py script from ../in_container/bin.
+---
+procs:
+  scheduler:
+    shell: airflow scheduler
+    restart: always
+    scrollback: 100000
+
+  api_server:
+    shell: airflow api-server
+    restart: always
+    scrollback: 100000
+
+  triggerer:
+    shell: airflow triggerer
+    restart: always
+    scrollback: 100000
+
+    # Optional: Uncomment to enable DAG processor

Review Comment:
   ```suggestion
       # Optional: Uncomment to enable Dag processor
   ```
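
For reference, the `basic.yaml` quoted above is plain YAML: a `procs` mapping whose entries each define `shell`, `restart`, and `scrollback`. A small sketch (not part of the PR, and assuming PyYAML is available) that loads such a file and lists the configured processes:

```python
from pathlib import Path

import yaml  # PyYAML, assumed to be available in the Breeze environment

config = yaml.safe_load(Path("scripts/mprocs/basic.yaml").read_text())
for name, proc in config["procs"].items():
    print(f"{name}: {proc['shell']} (restart={proc['restart']})")
```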



##########
scripts/mprocs/README.md:
##########
@@ -0,0 +1,123 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements.  See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership.  The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied.  See the License for the
+ specific language governing permissions and limitations
+ under the License.
+ -->
+
+i# mprocs Configuration for Airflow
+
+This directory contains mprocs configuration files for running Airflow components.
+
+## Overview
+
+mprocs is a modern alternative to tmux for managing multiple processes. It provides:
+
+- Intuitive TUI with better visual feedback
+- Easy process management (start, stop, restart)
+- Mouse and keyboard navigation
+- Process status indicators
+- Better cross-platform support
+
+## Files
+
+- `basic.yaml` - Static mprocs configuration example
+- `generate_mprocs_config.py` - Script to generate dynamic mprocs configuration (located in `scripts/in_container/bin/`)
+- `run_mprocs` - Entry point script for starting Airflow with mprocs (located in `scripts/in_container/bin/`)
+
+## Usage
+
+### Using Breeze
+
+The easiest way to use mprocs with Airflow is through the Breeze command:
+
+```bash
+breeze start-airflow --use-mprocs
+```
+
+This will:
+
+1. Start the Breeze container
+2. Dynamically generate an mprocs configuration based on your environment
+3. Launch mprocs with all configured Airflow components
+
+### With Debug Support
+
+You can combine mprocs with debugging:
+
+```bash
+breeze start-airflow --use-mprocs --debug scheduler --debug triggerer
+```
+
+### Manual Usage
+
+Inside the Breeze container, you can manually generate and use mprocs configuration:
+
+```bash
+# Generate configuration
+python3 /opt/airflow/scripts/in_container/bin/generate_mprocs_config.py /tmp/airflow-mprocs.yaml
+
+# Start mprocs
+mprocs --config /tmp/airflow-mprocs.yaml
+```
+
+## Dynamic Configuration
+
+The `generate_mprocs_config.py` script reads environment variables to determine which components to run:
+
+- `INTEGRATION_CELERY` - Enables Celery worker
+- `CELERY_FLOWER` - Enables Flower UI
+- `STANDALONE_DAG_PROCESSOR` - Enables standalone DAG processor
+- `AIRFLOW__CORE__EXECUTOR` - Determines if Edge worker should run
+- `USE_AIRFLOW_VERSION` - Determines if API server (3.x+) or webserver (2.x) should run
+- `BREEZE_DEBUG_*` - Enables debug mode for specific components
+- `DEV_MODE` - Enables development mode features
+
+## Components
+
+The following Airflow components can be managed by mprocs:
+
+1. **scheduler** - Main Airflow scheduler
+2. **api_server** (Airflow 3.x+) or **webserver** (Airflow 2.x)
+3. **triggerer** - Handles deferred tasks
+4. **dag_processor** - Standalone DAG processor (optional)

Review Comment:
   ```suggestion
   4. **dag_processor** - Standalone Dag processor (optional)
   ```
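
The "Manual Usage" steps in the README hunk above can also be scripted; a minimal sketch (not part of the PR) that chains the two documented commands:

```python
import subprocess

# Paths and arguments are taken verbatim from the README's "Manual Usage" section.
config_path = "/tmp/airflow-mprocs.yaml"

# 1. Generate the mprocs configuration for the current environment.
subprocess.run(
    ["python3", "/opt/airflow/scripts/in_container/bin/generate_mprocs_config.py", config_path],
    check=True,
)
# 2. Launch mprocs with the generated configuration.
subprocess.run(["mprocs", "--config", config_path], check=True)
```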



##########
scripts/mprocs/README.md:
##########
@@ -0,0 +1,123 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements.  See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership.  The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied.  See the License for the
+ specific language governing permissions and limitations
+ under the License.
+ -->
+
+i# mprocs Configuration for Airflow

Review Comment:
   ```suggestion
   # mprocs Configuration for Airflow
   ```
   
   
   looks like vim insert mode 👀



##########
scripts/in_container/bin/generate_mprocs_config.py:
##########
@@ -0,0 +1,216 @@
+#!/usr/bin/env python3
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""Generate mprocs configuration dynamically based on environment variables."""
+
+from __future__ import annotations
+
+import os
+import sys
+from pathlib import Path
+
+
+def get_env_bool(var_name: str, default: str = "false") -> bool:
+    """Get environment variable as boolean."""
+    return os.environ.get(var_name, default).lower() == "true"
+
+
+def get_env(var_name: str, default: str = "") -> str:
+    """Get environment variable with default."""
+    return os.environ.get(var_name, default)
+
+
+def generate_mprocs_config() -> str:
+    """Generate mprocs YAML configuration based on environment variables."""
+    procs = {}
+
+    # Scheduler
+    scheduler_cmd = "airflow scheduler"
+    if get_env_bool("BREEZE_DEBUG_SCHEDULER"):
+        port = get_env("BREEZE_DEBUG_SCHEDULER_PORT", "5678")
+        scheduler_cmd = f"debugpy --listen 0.0.0.0:{port} --wait-for-client -m 
airflow scheduler"
+
+    procs["scheduler"] = {
+        "shell": scheduler_cmd,
+        "restart": "always",
+        "scrollback": 100000,
+    }
+
+    # API Server or Webserver (depending on Airflow version)
+    use_airflow_version = get_env("USE_AIRFLOW_VERSION", "")
+    if not use_airflow_version.startswith("2."):
+        # API Server (Airflow 3.x+)
+        if get_env_bool("BREEZE_DEBUG_APISERVER"):
+            port = get_env("BREEZE_DEBUG_APISERVER_PORT", "5679")
+            api_cmd = f"debugpy --listen 0.0.0.0:{port} --wait-for-client -m 
airflow api-server -d"
+        else:
+            dev_mode = get_env_bool("DEV_MODE")
+            api_cmd = "airflow api-server -d" if dev_mode else "airflow 
api-server"
+
+        procs["api_server"] = {
+            "shell": api_cmd,
+            "restart": "always",
+            "scrollback": 100000,
+        }
+    else:
+        # Webserver (Airflow 2.x)
+        if get_env_bool("BREEZE_DEBUG_WEBSERVER"):
+            port = get_env("BREEZE_DEBUG_WEBSERVER_PORT", "5680")
+            web_cmd = f"debugpy --listen 0.0.0.0:{port} --wait-for-client -m 
airflow webserver"
+        else:
+            dev_mode = get_env_bool("DEV_MODE")
+            web_cmd = "airflow webserver -d" if dev_mode else "airflow 
webserver"
+
+        procs["webserver"] = {
+            "shell": web_cmd,
+            "restart": "always",
+            "scrollback": 100000,
+        }
+
+    # Triggerer
+    triggerer_cmd = "airflow triggerer"
+    if get_env_bool("BREEZE_DEBUG_TRIGGERER"):
+        port = get_env("BREEZE_DEBUG_TRIGGERER_PORT", "5681")
+        triggerer_cmd = f"debugpy --listen 0.0.0.0:{port} --wait-for-client -m 
airflow triggerer"
+
+    procs["triggerer"] = {
+        "shell": triggerer_cmd,
+        "restart": "always",
+        "scrollback": 100000,
+    }
+
+    # Celery Worker (conditional)
+    if get_env_bool("INTEGRATION_CELERY"):
+        if get_env_bool("BREEZE_DEBUG_CELERY_WORKER"):
+            port = get_env("BREEZE_DEBUG_CELERY_WORKER_PORT", "5682")
+            celery_cmd = f"debugpy --listen 0.0.0.0:{port} --wait-for-client 
-m airflow celery worker"
+        else:
+            celery_cmd = "airflow celery worker"
+
+        procs["celery_worker"] = {
+            "shell": celery_cmd,
+            "restart": "always",
+            "scrollback": 100000,
+        }
+
+    # Flower (conditional)
+    if get_env_bool("INTEGRATION_CELERY") and get_env_bool("CELERY_FLOWER"):
+        if get_env_bool("BREEZE_DEBUG_FLOWER"):
+            port = get_env("BREEZE_DEBUG_FLOWER_PORT", "5683")
+            flower_cmd = f"debugpy --listen 0.0.0.0:{port} --wait-for-client 
-m airflow celery flower"
+        else:
+            flower_cmd = "airflow celery flower"
+
+        procs["flower"] = {
+            "shell": flower_cmd,
+            "restart": "always",
+            "scrollback": 100000,
+        }
+
+    # Edge Worker (conditional)
+    executor = get_env("AIRFLOW__CORE__EXECUTOR", "")
+    if executor == "airflow.providers.edge3.executors.edge_executor.EdgeExecutor":
+        if get_env_bool("BREEZE_DEBUG_EDGE"):
+            port = get_env("BREEZE_DEBUG_EDGE_PORT", "5684")
+            edge_cmd = f"debugpy --listen 0.0.0.0:{port} --wait-for-client -m 
airflow edge worker --edge-hostname breeze --queues default"
+        else:
+            # Build command with environment cleanup
+            edge_cmd_parts = [
+                "unset AIRFLOW__DATABASE__SQL_ALCHEMY_CONN || true",
+                "unset AIRFLOW__CELERY__RESULT_BACKEND || true",
+                "unset POSTGRES_HOST_PORT || true",
+                "unset BACKEND || true",
+                "unset POSTGRES_VERSION || true",
+                "export AIRFLOW__LOGGING__BASE_LOG_FOLDER=edge_logs",
+                "airflow edge worker --edge-hostname breeze --queues default",
+            ]
+            edge_cmd = " && ".join(edge_cmd_parts)
+
+        procs["edge_worker"] = {
+            "shell": edge_cmd,
+            "restart": "always",
+            "scrollback": 100000,
+        }
+
+    # DAG Processor (conditional)

Review Comment:
   ```suggestion
       # Dag Processor (conditional)
   ```
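
Since `generate_mprocs_config()` is driven entirely by environment variables, its component selection can be exercised directly. A hypothetical smoke test (not part of the PR; the import path and the assumption that the function returns the YAML text, per its `-> str` signature, are mine):

```python
import os

# Enable the optional Celery components before calling the generator.
os.environ["INTEGRATION_CELERY"] = "true"
os.environ["CELERY_FLOWER"] = "true"

# Assumes scripts/in_container/bin is on sys.path so the module is importable.
from generate_mprocs_config import generate_mprocs_config

yaml_text = generate_mprocs_config()
# The generated config should now contain the celery_worker and flower entries.
assert "celery_worker" in yaml_text and "flower" in yaml_text
print(yaml_text)
```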


