freget opened a new issue #15963:
URL: https://github.com/apache/airflow/issues/15963


   **Apache Airflow version**: 2.0.2 (but the problem can still be found in master)
   
   **What happened**:
   
   When using the SSHHook to connect to an SSH server on a non-default port, the host_key setting is not added under the correct hostname to the list of known hosts. In more detail:
   
   ```python
   from airflow.providers.ssh.hooks.ssh import SSHHook
   import paramiko
   from base64 import decodebytes
   
   hook = SSHHook(remote_host="1.2.3.4", port=1234, username="user")
   # Usually, host_key would come from the connection_extras; for the sake of
   # this example we set the value manually:
   host_key = "abc" # Some public key
   hook.host_key = paramiko.RSAKey(data=decodebytes(host_key.encode("utf-8")))
   hook.no_host_key_check = False
   
   conn = hook.get_conn()
   ```
   
   This yields the exception
   
       paramiko.ssh_exception.SSHException: Server '[1.2.3.4]:1234' not found in known_hosts
   
   **Reason**:
   In the SSHHook, the host_key is added using only the bare name of the remote host:
   
https://github.com/apache/airflow/blob/5bd6ea784340e0daf1554e207600eae92318ab09/airflow/providers/ssh/hooks/ssh.py#L221
   
   According to the known_hosts format, entries for servers on non-default ports are keyed as `[host]:port` rather than the bare hostname, so we would need
   
   ```python
   hostname = f"[{self.remote_host}]:{self.port}" if self.port != SSH_PORT else self.remote_host
   ```
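   
   To illustrate why the bracketed form matters, here is a minimal standalone sketch of paramiko's host key lookup (the variable names and the generated key are assumptions for illustration only, not the provider code):
   
   ```python
   # Minimal sketch: paramiko keys host entries for non-default ports as "[host]:port".
   import paramiko
   from paramiko.config import SSH_PORT  # 22
   
   remote_host, port = "1.2.3.4", 1234
   host_key = paramiko.RSAKey.generate(2048)  # stand-in for the key from the connection extras
   
   client = paramiko.SSHClient()
   hostname = f"[{remote_host}]:{port}" if port != SSH_PORT else remote_host
   client.get_host_keys().add(hostname, host_key.get_name(), host_key)
   
   # connect() on port 1234 looks the server up under the bracketed name, so
   # registering the key under that name is what satisfies the host key check:
   assert client.get_host_keys().lookup("[1.2.3.4]:1234") is not None
   assert client.get_host_keys().lookup("1.2.3.4") is None  # the bare name alone would not match
   ```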
   
   **Anything else we need to know**:
   
   I will prepare a PR that solves the problem.
   

