Aakcht opened a new issue, #29282:
URL: https://github.com/apache/airflow/issues/29282
### Apache Airflow Provider(s)
ssh
### Versions of Apache Airflow Providers
apache-airflow-providers-ssh>=3.3.0
### Apache Airflow version
2.5.0
### Operating System
debian "11 (bullseye)"
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
_No response_
### What happened
I have an SSH operator task where the command can take a long time. In
recent SSH provider versions (>=3.3.0) it stopped working; I suspect this is
because of #27184. After that change the timeout appears to be 10 seconds, and
once there is no output over SSH for 10 seconds I get the following error:
```
[2023-01-26, 11:49:57 UTC] {taskinstance.py:1772} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/ssh/operators/ssh.py", line 171, in execute
    result = self.run_ssh_client_command(ssh_client, self.command, context=context)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/ssh/operators/ssh.py", line 156, in run_ssh_client_command
    exit_status, agg_stdout, agg_stderr = self.ssh_hook.exec_ssh_client_command(
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/ssh/hooks/ssh.py", line 521, in exec_ssh_client_command
    raise AirflowException("SSH command timed out")
airflow.exceptions.AirflowException: SSH command timed out
```
At first I thought this was fine, since I could just set the `conn_timeout`
extra parameter on my SSH connection. But then I noticed that this parameter
from the connection is not used anywhere, so that doesn't work: you have to
modify your task code to set the needed value of this parameter on the SSH
operator. What's more, even after modifying the task code it is not possible
to restore the previous behavior (when this parameter was not set), because
the timeout is now set to 10 when you pass None as the value.
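The only stopgap I found is to pin the timeout on the operator itself. A minimal sketch, assuming `cmd_timeout` is the relevant operator parameter (a large value stands in for "no timeout", since passing None falls back to the 10-second default):
```python
# Workaround sketch (assumption: `cmd_timeout` is the operator parameter
# governing this timeout). The value from the connection extras is ignored,
# so it has to be set in task code; None does NOT disable the timeout.
from airflow.providers.ssh.operators.ssh import SSHOperator

long_task = SSHOperator(
    task_id="long_running_command",
    ssh_conn_id="ssh_localhost",
    command="sleep 15s",
    cmd_timeout=3600,  # seconds; a large stand-in for "no timeout"
)
```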
### What you think should happen instead
I think it should be possible to pass the timeout parameter through the
connection's extra field for the SSH operator (including a None value,
meaning no timeout). A sketch of the behavior I'd expect follows.
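This is a sketch of the expected lookup, not current provider code; the key name `cmd_timeout` is my assumption:
```python
# Expected behavior sketch: the hook would read the timeout from the
# connection's extra field instead of ignoring it.
#   extra = '{"cmd_timeout": null}'  -> no timeout at all
#   extra = '{"cmd_timeout": 300}'   -> 5-minute timeout
import json

extra = json.loads('{"cmd_timeout": null}')
cmd_timeout = extra.get("cmd_timeout", 10)  # 10 = current hard default
assert cmd_timeout is None  # None would mean "wait forever"
```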
### How to reproduce
Add a simple DAG that sleeps for more than 10 seconds, for example:
```python
# this DAG only works for SSH provider versions <=3.2.0
from airflow.models import DAG
from airflow.providers.ssh.operators.ssh import SSHOperator
from airflow.utils.dates import days_ago

args = {
    "owner": "airflow",
    "start_date": days_ago(2),
}

dag = DAG(
    default_args=args,
    dag_id="test_ssh",
    max_active_runs=1,
    catchup=False,
    schedule_interval="@hourly",
)

# sleeps longer than the 10-second default command timeout
task0 = SSHOperator(
    ssh_conn_id="ssh_localhost",
    task_id="test_sleep",
    command="sleep 15s",
    dag=dag,
)
```
Then try configuring the `ssh_localhost` connection so that the DAG works,
using an extra `conn_timeout` or extra `timeout` (or other) parameter; a
sketch of the setup follows.
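For reference, one way to create the test connection programmatically; the keys in `extra` are the ones I would expect to be honored (currently they are ignored either way):
```python
# Create the `ssh_localhost` connection for the reproduction above.
# Assumption: "conn_timeout"/"cmd_timeout" are the extra keys a fix
# would read; the provider ignores them today.
import json

from airflow import settings
from airflow.models import Connection

conn = Connection(
    conn_id="ssh_localhost",
    conn_type="ssh",
    host="localhost",
    extra=json.dumps({"conn_timeout": 30, "cmd_timeout": None}),
)

session = settings.Session()
session.add(conn)
session.commit()
```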
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)