VasuBajaj opened a new issue #22705:
URL: https://github.com/apache/airflow/issues/22705


   ### Apache Airflow Provider(s)
   
   google
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-google==6.4.0
   
   ### Apache Airflow version
   
   2.1.4
   
   ### Operating System
   
   Debian GNU/Linux 10 (buster)
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   When you run LocalFilesystemToGCSOperator with the src and dest params, 
the operator reports a false positive when there are no files present under the 
specified src directory: the task is marked SUCCESS. I expected it to fail, 
stating that the specified directory doesn't contain any files.
   
   [2022-03-15 14:26:15,475] {taskinstance.py:1107} INFO - Executing 
<Task(LocalFilesystemToGCSOperator): upload_files_to_GCS> on 
2022-03-15T14:25:59.554459+00:00
   [2022-03-15 14:26:15,484] {standard_task_runner.py:52} INFO - Started 
process 709 to run task
   [2022-03-15 14:26:15,492] {standard_task_runner.py:76} INFO - Running: 
['***', 'tasks', 'run', 'dag', 'upload_files_to_GCS', 
'2022-03-15T14:25:59.554459+00:00', '--job-id', '1562', '--pool', 
'default_pool', '--raw', '--subdir', 'DAGS_FOLDER/dag.py', '--cfg-path', 
'/tmp/tmp_e9t7pl9', '--error-file', '/tmp/tmpyij6m4er']
   [2022-03-15 14:26:15,493] {standard_task_runner.py:77} INFO - Job 1562: 
Subtask upload_files_to_GCS
   [2022-03-15 14:26:15,590] {logging_mixin.py:104} INFO - Running 
<TaskInstance: dag.upload_files_to_GCS 2022-03-15T14:25:59.554459+00:00 
[running]> on host 653e566fd372
   [2022-03-15 14:26:15,752] {taskinstance.py:1300} INFO - Exporting the 
following env vars:
   AIRFLOW_CTX_DAG_OWNER=jet2
   AIRFLOW_CTX_DAG_ID=dag
   AIRFLOW_CTX_TASK_ID=upload_files_to_GCS
   AIRFLOW_CTX_EXECUTION_DATE=2022-03-15T14:25:59.554459+00:00
   AIRFLOW_CTX_DAG_RUN_ID=manual__2022-03-15T14:25:59.554459+00:00
   [2022-03-15 14:26:19,357] {taskinstance.py:1204} INFO - Marking task as 
SUCCESS. dag_id=dag, task_id=upload_files_to_GCS, execution_date=20220315T142559, 
start_date=20220315T142615, end_date=20220315T142619
   [2022-03-15 14:26:19,422] {taskinstance.py:1265} INFO - 1 downstream tasks 
scheduled from follow-on schedule check
   [2022-03-15 14:26:19,458] {local_task_job.py:149} INFO - Task exited with 
return code 0
   
   ### What you think should happen instead
   
   The operator should at least log that no files were copied, rather than 
simply marking the task successful.
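   
   A minimal sketch of the silent-success path, without requiring Airflow: 
if the operator expands src with a glob-style pattern (an assumption about its 
internals), an empty directory yields an empty match list, the upload loop runs 
zero times, and nothing signals a problem. The `upload_files` helper and the 
commented-out guard are hypothetical, just to illustrate the proposed behavior:
   
   ```python
   import glob
   import os
   import tempfile

   def upload_files(src_pattern: str) -> int:
       """Hypothetical stand-in for the operator's execute() loop."""
       filepaths = glob.glob(src_pattern)
       # A guard like this would surface the problem instead of silent success:
       # if not filepaths:
       #     raise AirflowException(f"No files matched {src_pattern!r}")
       for path in filepaths:
           pass  # each matched file would be uploaded to GCS here
       return len(filepaths)

   with tempfile.TemporaryDirectory() as empty_dir:
       # An empty src directory: zero files uploaded, yet no error is raised.
       uploaded = upload_files(os.path.join(empty_dir, "*"))
       print(f"{uploaded} files uploaded")
   ```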
   
   ### How to reproduce
   
   - create a DAG with LocalFilesystemToGCSOperator 
   - specify an empty directory as src and a GCS bucket as bucket_name; the 
dest param can be blank 
   - run the DAG
   
   ### Anything else
   
   No
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
