zhongjiajie commented on a change in pull request #6881: [AIRFLOW-6326] Sort cli commands and arg
URL: https://github.com/apache/airflow/pull/6881#discussion_r360776054
##########
File path: airflow/bin/cli.py
##########
@@ -1022,10 +1022,10 @@ def _add_subcommand(cls, subparsers, sub):
         if subcommands:
             sub_subparsers = sub_proc.add_subparsers(dest='subcommand')
             sub_subparsers.required = True
-            for command in subcommands:
+            for command in sorted(subcommands, key=lambda x: x['name']):
                 cls._add_subcommand(sub_subparsers, command)
         else:
-            for arg in sub['args']:
+            for arg in sorted(sub['args'], key=lambda x: cls.args[x].flags[0]):
Review comment:
But I'm not sure whether this change is acceptable. Should we keep the args in manual order, or sort them alphabetically? Some args are placed first deliberately so that users can spot the important ones, like `DAG_ID` or `RUN_ID`, more easily (see the sketch after the outputs below for a possible middle ground).
```sh
# airflow scheduler -h
# before
optional arguments:
-h, --help show this help message and exit
-d DAG_ID, --dag_id DAG_ID
The id of the dag to run
-sd SUBDIR, --subdir SUBDIR
File location or directory from which to look for
the dag. Defaults to '[AIRFLOW_HOME]/dags' where [AIRFLOW_HOME] is the value
you set for 'AIRFLOW_HOME' config you set in 'airflow.cfg'
-n NUM_RUNS, --num_runs NUM_RUNS
Set the number of runs to execute before exiting
-p, --do_pickle Attempt to pickle the DAG object to send over to the
workers, instead of letting workers run their version of the code
--pid [PID] PID file location
-D, --daemon Daemonize instead of running in the foreground
--stdout STDOUT Redirect stdout to this file
--stderr STDERR Redirect stderr to this file
-l LOG_FILE, --log-file LOG_FILE
Location of the log file
# after
optional arguments:
-h, --help show this help message and exit
--pid [PID] PID file location
--stderr STDERR Redirect stderr to this file
--stdout STDOUT Redirect stdout to this file
-D, --daemon Daemonize instead of running in the foreground
-d DAG_ID, --dag_id DAG_ID
The id of the dag to run
-l LOG_FILE, --log-file LOG_FILE
Location of the log file
-n NUM_RUNS, --num_runs NUM_RUNS
Set the number of runs to execute before exiting
-p, --do_pickle Attempt to pickle the DAG object to send over to the
workers, instead of letting workers run their version of the code
-sd SUBDIR, --subdir SUBDIR
File location or directory from which to look for
the dag. Defaults to '[AIRFLOW_HOME]/dags' where [AIRFLOW_HOME] is the value
you set for 'AIRFLOW_HOME' config you set in 'airflow.cfg'
```
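A side note on the sort key: because it compares the raw flag strings, the "after" order is plain ASCII order, which is why long-only flags like `--pid` land before `-D` (the second character `-` sorts before any letter) and why `-D` lands before `-d` (uppercase sorts before lowercase). A quick standalone check, sorting the flag strings directly instead of going through `cls.args`, just to illustrate:
```python
# Plain ASCII sort of the scheduler flags, mirroring
# sorted(sub['args'], key=lambda x: cls.args[x].flags[0]) from the diff:
flags = ['-d', '-sd', '-n', '-p', '--pid', '-D', '--stdout', '--stderr', '-l']
print(sorted(flags))
# ['--pid', '--stderr', '--stdout', '-D', '-d', '-l', '-n', '-p', '-sd']
```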
```sh
# airflow dags trigger -h
# before
usage: airflow dags trigger [-h] [-sd SUBDIR] [-r RUN_ID] [-c CONF]
[-e EXEC_DATE]
dag_id
positional arguments:
dag_id The id of the dag
optional arguments:
-h, --help show this help message and exit
-sd SUBDIR, --subdir SUBDIR
File location or directory from which to look for
the dag. Defaults to '[AIRFLOW_HOME]/dags' where [AIRFLOW_HOME] is the value
you set for 'AIRFLOW_HOME' config you set in 'airflow.cfg'
-r RUN_ID, --run_id RUN_ID
Helps to identify this run
-c CONF, --conf CONF JSON string that gets pickled into the DagRun's
conf attribute
-e EXEC_DATE, --exec_date EXEC_DATE
The execution date of the DAG
# after
usage: airflow dags trigger [-h] [-c CONF] [-e EXEC_DATE] [-r RUN_ID]
[-sd SUBDIR]
dag_id
positional arguments:
dag_id The id of the dag
optional arguments:
-h, --help show this help message and exit
-c CONF, --conf CONF JSON string that gets pickled into the DagRun's
conf attribute
-e EXEC_DATE, --exec_date EXEC_DATE
The execution date of the DAG
-r RUN_ID, --run_id RUN_ID
Helps to identify this run
-sd SUBDIR, --subdir SUBDIR
File location or directory from which to look for
the dag. Defaults to '[AIRFLOW_HOME]/dags' where [AIRFLOW_HOME] is the value
you set for 'AIRFLOW_HOME' config you set in 'airflow.cfg'
```
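If plain alphabetical order turns out to be too aggressive, a middle ground could be to sort alphabetically but pin a few high-value flags to the top. This is only a hypothetical sketch, not something in `cli.py`; the `priority` list and `sort_key` helper are made up for illustration:
```python
# Hypothetical sketch (not part of this PR): alphabetical order,
# but with a couple of important flags pinned to the top.
priority = ['-d', '-r']  # flags users look for first, in this order

def sort_key(flag):
    # Pinned flags sort by their position in `priority`;
    # everything else follows in plain alphabetical order.
    return (priority.index(flag) if flag in priority else len(priority), flag)

flags = ['--pid', '-sd', '-d', '-n', '-r', '-D']
print(sorted(flags, key=sort_key))
# ['-d', '-r', '--pid', '-D', '-n', '-sd']
```
The same key could drop into the diff as `sorted(sub['args'], key=lambda x: sort_key(cls.args[x].flags[0]))`, keeping the alphabetical behavior for everything that isn't pinned.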