yangyulely opened a new issue, #40216:
URL: https://github.com/apache/airflow/issues/40216
### Apache Airflow version
main (development)
### If "Other Airflow 2 version" selected, which one?
_No response_
### What happened?
The `get_file_stats` task fails when the `example_bash_decorator` DAG is manually triggered from the webserver:
```
a3be7488f829
*** Found local files:
***   * /root/airflow/logs/dag_id=example_bash_decorator/run_id=manual__2024-06-13T10:37:13.291437+00:00/task_id=get_file_stats/attempt=1.log
[2024-06-13, 10:37:21 UTC] {local_task_job_runner.py:120} ▶ Pre task execution logs
[2024-06-13, 10:37:22 UTC] {subprocess.py:63} INFO - Tmp dir root location: /tmp
[2024-06-13, 10:37:22 UTC] {subprocess.py:75} INFO - Running command: ['/usr/bin/bash', '-c', 'stat ']
[2024-06-13, 10:37:22 UTC] {subprocess.py:86} INFO - Output:
[2024-06-13, 10:37:22 UTC] {subprocess.py:93} INFO - stat: missing operand
[2024-06-13, 10:37:22 UTC] {subprocess.py:93} INFO - Try 'stat --help' for more information.
[2024-06-13, 10:37:22 UTC] {subprocess.py:97} INFO - Command exited with return code 1
[2024-06-13, 10:37:22 UTC] {taskinstance.py:3150} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/opt/airflow/airflow/models/taskinstance.py", line 759, in _execute_task
    result = _execute_callable(context=context, **execute_callable_kwargs)
  File "/opt/airflow/airflow/models/taskinstance.py", line 725, in _execute_callable
    return ExecutionCallableRunner(
  File "/opt/airflow/airflow/utils/operator_helpers.py", line 250, in run
    return self.func(*args, **kwargs)
  File "/opt/airflow/airflow/models/baseoperator.py", line 406, in wrapper
    return func(self, *args, **kwargs)
  File "/opt/airflow/airflow/decorators/bash.py", line 81, in execute
    return super().execute(context)
  File "/opt/airflow/airflow/models/baseoperator.py", line 406, in wrapper
    return func(self, *args, **kwargs)
  File "/opt/airflow/airflow/decorators/base.py", line 265, in execute
    return_value = super().execute(context)
  File "/opt/airflow/airflow/models/baseoperator.py", line 406, in wrapper
    return func(self, *args, **kwargs)
  File "/opt/airflow/airflow/operators/bash.py", line 243, in execute
    raise AirflowException(
airflow.exceptions.AirflowException: Bash command failed. The command returned a non-zero exit code 1.
[2024-06-13, 10:37:22 UTC] {taskinstance.py:1197} INFO - Marking task as FAILED. dag_id=example_bash_decorator, task_id=get_file_stats, run_id=manual__2024-06-13T10:37:13.291437+00:00, execution_date=20240613T103713, start_date=20240613T103721, end_date=20240613T103722
[2024-06-13, 10:37:22 UTC] {taskinstance.py:338} ▶ Post task execution logs
```
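The `Running command: ['/usr/bin/bash', '-c', 'stat ']` line shows that the task rendered `stat ` with no operand, which is what happens when the glob in the DAG matches nothing. A minimal sketch of that mechanism (an illustration only, not the exact code in `example_bash_decorator.py`):

```python
import os
from glob import glob

# Illustration only: resolve the relative pattern from "/" as the worker does.
os.chdir("/")
matches = glob("airflow/example_dags/*.py")  # [] -- nothing matches from "/"
command = f"stat {' '.join(matches)}"
print(repr(command))  # 'stat ' -> "stat: missing operand", exit code 1
```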
### What you think should happen instead?
`example_bash_decorator` can be triggered successfully from the webserver.
### How to reproduce
Start Airflow from `/opt/airflow` (the output of `pwd`) with the following commands; a working-directory check sketch follows the steps:
1. `breeze --backend mysql --python 3.8`
2. `airflow webserver -D -l logs/airflow-webserver.log`
3. `airflow scheduler -D -l logs/airflow-scheduler.log`
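
When the webserver and scheduler are started with `-D`, the daemonized processes detach and change their working directory (python-daemon defaults to `/`). A hypothetical diagnostic for confirming the scheduler's actual working directory (assumes a Linux host with `/proc` and `pgrep` available):

```python
import os
import subprocess

# Hypothetical check (Linux only): print the working directory of the
# daemonized scheduler. With `airflow scheduler -D` this often shows "/".
pid = subprocess.check_output(["pgrep", "-of", "airflow scheduler"], text=True).strip()
print(os.readlink(f"/proc/{pid}/cwd"))
```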
### Operating System
PRETTY_NAME="Debian GNU/Linux 12 (bookworm)" NAME="Debian GNU/Linux"
VERSION_ID="12" VERSION="12 (bookworm)" VERSION_CODENAME=bookworm ID=debian
HOME_URL="https://www.debian.org/" SUPPORT_URL="https://www.debian.org/support"
BUG_REPORT_URL="https://bugs.debian.org/"
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
_No response_
### Anything else?
The working directory of the DAG run is `/`, not `/opt/airflow`, so the relative pattern `airflow/example_dags/*.py` matches nothing when resolved from the root directory `/`. After changing it to `opt/airflow/airflow/example_dags/*.py`, the task works fine. Should the default value be changed to `opt/airflow/airflow/example_dags/*.py`?
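
One way to avoid depending on the worker's working directory at all would be to anchor the pattern to the installed `airflow` package rather than to any fixed absolute path. A rough sketch of that idea (assuming the example builds its `stat` command from a glob; the real `example_bash_decorator.py` may handle this differently):

```python
from glob import glob
from pathlib import Path

import airflow

# Sketch: resolve example_dags relative to the installed airflow package,
# so the pattern works no matter what the process's cwd is.
pattern = str(Path(airflow.__file__).parent / "example_dags" / "*.py")
matches = glob(pattern)
command = f"stat {matches[0]}" if matches else "echo 'no example DAG files found'"
print(command)
```

This would keep the example working both in a Breeze source checkout under `/opt/airflow` and in installed-package deployments.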
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)