kundan1301 opened a new issue #10113:
URL: https://github.com/apache/airflow/issues/10113


   
**Apache Airflow version**: 1.10.9

**Environment**: Cloud Composer

- **Cloud provider or hardware configuration**: Google Cloud
   
**What happened**:
`TypeError: '>' not supported between instances of 'NoneType' and 'int'`

At https://github.com/apache/airflow/blob/1.10.9/airflow/contrib/operators/gcp_transfer_operator.py#L643 the operator passes `None` as the default timeout, but at https://github.com/apache/airflow/blob/1.10.9/airflow/contrib/hooks/gcp_transfer_hook.py#L338 the hook compares that value against an int.
   
**Stacktrace**

```
Traceback (most recent call last):
  File "/usr/local/lib/airflow/airflow/models/taskinstance.py", line 972, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/usr/local/lib/airflow/airflow/contrib/operators/gcp_transfer_operator.py", line 643, in execute
    hook.wait_for_transfer_job(job, timeout=self.timeout)
  File "/usr/local/lib/airflow/airflow/contrib/hooks/gcp_api_base_hook.py", line 284, in wrapper_decorator
    return func(self, *args, **kwargs)
  File "/usr/local/lib/airflow/airflow/contrib/hooks/gcp_transfer_hook.py", line 338, in wait_for_transfer_job
    while timeout > 0:
TypeError: '>' not supported between instances of 'NoneType' and 'int'
```
   
Looking at the source code, the same exception can occur at https://github.com/apache/airflow/blob/1.10.9/airflow/contrib/operators/gcp_transfer_operator.py#L781 as well.
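The failure mode can be reproduced without Airflow at all; in Python 3 an ordering comparison between `None` and an int always raises `TypeError`, which is exactly what happens when the hook's `while timeout > 0` loop receives the operator's default `timeout=None`:

```python
# Minimal reproduction: the hook's loop condition with the operator's
# default timeout value. Python 3 refuses to order None against an int.
timeout = None  # default passed down by the operator

try:
    while timeout > 0:  # raises immediately, before the loop body runs
        break
except TypeError as exc:
    print(exc)  # '>' not supported between instances of 'NoneType' and 'int'
```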
   
   
   
**What you expected to happen**:
Either pass a sufficiently large value as the default timeout (for example `10**18`), or handle `None` as a special case (wait indefinitely) in the `while` loop's status check.
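The second option could look roughly like the sketch below. This is illustrative only, not the actual hook code: the function name, the `get_status` callable, and the `poll_interval` parameter are hypothetical stand-ins for the hook's real polling logic, and the point is simply that `timeout is None` is checked before any numeric comparison.

```python
import time

def wait_for_transfer_job_sketch(get_status, timeout=None, poll_interval=10):
    """Poll a job until it succeeds; timeout=None means wait indefinitely.

    Sketch of the proposed fix: guard the numeric comparison so that a
    None timeout never reaches the `timeout > 0` check.
    """
    while timeout is None or timeout > 0:
        if get_status() == "SUCCESS":
            return True
        time.sleep(poll_interval)
        if timeout is not None:
            timeout -= poll_interval
    raise TimeoutError("Transfer job did not complete before the timeout")
```

With this guard, `wait_for_transfer_job_sketch(status_fn)` loops forever until success, while `wait_for_transfer_job_sketch(status_fn, timeout=60)` keeps the current bounded-wait behaviour.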
   
   
   
   
   
   **Anything else we need to know**: 
   I can fix this and raise a PR. Let me know.
   
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
