DrTeja opened a new issue #16243:
URL: https://github.com/apache/airflow/issues/16243


   
   **Apache Airflow version**: 2.0
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl 
version`):  NA
   
   **Environment**: macOS
   
   - **Cloud provider or hardware configuration**: NA
   - **OS** (e.g. from /etc/os-release): macOS Big Sur
   - **Kernel** (e.g. `uname -a`): Darwin 20.4.0 (Darwin Kernel Version 20.4.0)
   - **Install tools**: NA
   - **Others**: NA
   
   **What happened**:
   
   We are using Airflow jobs to upload data to BigQuery: we created PythonOperators and trigger them from DAGs (a minimal sketch of the setup appears after the log below). When we run a task manually with `airflow tasks test <dag_id> <task_id> <date>`, everything works fine, but when the same task is triggered via the UI it fails. The manual invocation that succeeds is shown next, followed by the task log from the failing UI-triggered run.
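   The dag and task ids below are the real ones from the log; the execution date is the one from the failing run:
   
   ```bash
   # Running the task directly from the CLI works:
   airflow tasks test ygrene_etl_process run_main_etl_project 2021-06-03T14:55:02.676999+00:00
   ```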
   
   ```
   *** Reading local file: /Users/rdoppalapudi/airflow_project//logs/ygrene_etl_process/run_main_etl_project/2021-06-03T14:55:02.676999+00:00/1.log
   [2021-06-03 10:55:07,575] {taskinstance.py:876} INFO - Dependencies all met 
for <TaskInstance: ygrene_etl_process.run_main_etl_project 
2021-06-03T14:55:02.676999+00:00 [queued]>
   [2021-06-03 10:55:07,580] {taskinstance.py:876} INFO - Dependencies all met 
for <TaskInstance: ygrene_etl_process.run_main_etl_project 
2021-06-03T14:55:02.676999+00:00 [queued]>
   [2021-06-03 10:55:07,580] {taskinstance.py:1067} INFO - 
   
--------------------------------------------------------------------------------
   [2021-06-03 10:55:07,580] {taskinstance.py:1068} INFO - Starting attempt 1 
of 1
   [2021-06-03 10:55:07,580] {taskinstance.py:1069} INFO - 
   
--------------------------------------------------------------------------------
   [2021-06-03 10:55:07,586] {taskinstance.py:1087} INFO - Executing 
<Task(PythonOperator): run_main_etl_project> on 2021-06-03T14:55:02.676999+00:00
   [2021-06-03 10:55:07,589] {standard_task_runner.py:52} INFO - Started 
process 9133 to run task
   [2021-06-03 10:55:07,595] {standard_task_runner.py:76} INFO - Running: 
['airflow', 'tasks', 'run', 'ygrene_etl_process', 'run_main_etl_project', 
'2021-06-03T14:55:02.676999+00:00', '--job-id', '16', '--pool', 'default_pool', 
'--raw', '--subdir', '/Users/rdoppalapudi/airflow_project/dags/etl_airflow.py', 
'--cfg-path', '/var/folders/5n/72l0n8zd261dnlkm3n902my0pmw_c2/T/tmp4fd_41fd', 
'--error-file', '/var/folders/5n/72l0n8zd261dnlkm3n902my0pmw_c2/T/tmpn67lg3s9']
   [2021-06-03 10:55:07,597] {standard_task_runner.py:77} INFO - Job 16: 
Subtask run_main_etl_project
   [2021-06-03 10:55:07,625] {logging_mixin.py:104} INFO - Running 
<TaskInstance: ygrene_etl_process.run_main_etl_project 
2021-06-03T14:55:02.676999+00:00 [running]> on host 1.0.0.127.in-addr.arpa
   [2021-06-03 10:55:07,649] {taskinstance.py:1280} INFO - Exporting the 
following env vars:
   AIRFLOW_CTX_DAG_OWNER=ygrene
   AIRFLOW_CTX_DAG_ID=ygrene_etl_process
   AIRFLOW_CTX_TASK_ID=run_main_etl_project
   AIRFLOW_CTX_EXECUTION_DATE=2021-06-03T14:55:02.676999+00:00
   AIRFLOW_CTX_DAG_RUN_ID=manual__2021-06-03T14:55:02.676999+00:00
   [2021-06-03 10:55:07,925] {etl_airflow.py:32} INFO - 
   run_id = manual__2021-06-03T14:55:02.676999+00:00 
    dag_id = DAG: ygrene_etl_process 
    task_id = Task(PythonOperator): run_main_etl_project
   [2021-06-03 10:55:08,246] {transport.py:1819} INFO - Connected (version 2.0, 
client OpenSSH_7.4)
   [2021-06-03 10:55:08,954] {transport.py:1819} INFO - Authentication 
(publickey) successful!
   [2021-06-03 10:55:14,328] {data_integration.py:29} INFO - Uploading data for 
projects 
   [2021-06-03 10:55:14,329] {data_integration.py:31} INFO - Creating bigq obj
   [2021-06-03 10:55:26,035] {bigquery_wrapper_apis.py:117} INFO - Got the 
original json to be uploaded
   [2021-06-03 10:55:27,451] {bigquery_wrapper_apis.py:102} INFO - Creating big 
client obj
   [2021-06-03 10:55:27,687] {local_task_job.py:151} INFO - Task exited with return code Negsignal.SIGSEGV
   ```
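   For context, here is a minimal sketch of the kind of DAG we run (Airflow 2.0). The dag and task ids match ours, but the callable body is a simplified stand-in, not our actual ETL code:
   
   ```python
   from datetime import datetime
   
   from airflow import DAG
   from airflow.operators.python import PythonOperator
   
   
   def run_main_etl_project(**context):
       # Stand-in for our real callable, which opens an SSH connection,
       # creates a BigQuery client, and uploads the project data.
       print(f"run_id = {context['run_id']}")
   
   
   with DAG(
       dag_id="ygrene_etl_process",
       start_date=datetime(2021, 6, 1),
       schedule_interval=None,
       catchup=False,
   ) as dag:
       PythonOperator(
           task_id="run_main_etl_project",
           python_callable=run_main_etl_project,
       )
   ```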
   
   **What you expected to happen**:
   
   We expected the UI-triggered run to succeed the same way the manual test does. We are not sure what exactly is going wrong; the scheduler console shows the following logs with the error:
   
   ```
   Running <TaskInstance: ygrene_etl_process.run_main_etl_project 2021-06-03T14:55:02.676999+00:00 [queued]> on host 1.0.0.127.in-addr.arpa
   The process has forked and you cannot use this CoreFoundation functionality 
safely. You MUST exec().
   Break on 
__THE_PROCESS_HAS_FORKED_AND_YOU_CANNOT_USE_THIS_COREFOUNDATION_FUNCTIONALITY___YOU_MUST_EXEC__()
 to debug.
   [2021-06-03 10:55:28,046] {scheduler_job.py:1205} INFO - Executor reports 
execution of ygrene_etl_process.run_main_etl_project execution_date=2021-06-03 
14:55:02.676999+00:00 exited with status success for try_number 1
   [2021-06-03 10:55:29,427] {dagrun.py:429} ERROR - Marking run <DagRun 
ygrene_etl_process @ 2021-06-03 14:55:02.676999+00:00: 
manual__2021-06-03T14:55:02.676999+00:00, externally triggered: True> failed
   [2021-06-03 10:56:10,676] {scheduler_job.py:1822} INFO - Resetting orphaned 
tasks for active dag runs
   [2021-06-03 11:01:10,846] {scheduler_job.py:1822} INFO - Resetting orphaned tasks for active dag runs
   ```
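   From what we can tell, the `CoreFoundation` message above means the forked task process trips macOS's fork-safety check. One workaround suggested on the web (we have not confirmed it fixes our case) is to have Airflow start each task in a fresh interpreter instead of a forked process, via `airflow.cfg`:
   
   ```ini
   [core]
   # Run each task in a new Python interpreter instead of forking the parent
   # process: slower task startup, but avoids fork-unsafe libraries.
   execute_tasks_new_python_interpreter = True
   ```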
   
   
   **How to reproduce it**:
   
   Not sure how to reproduce this minimally; everything was working fine until last week.
   
   **Anything else we need to know**:
   
   
   The DAG task works fine when run manually, and we are not sure why it fails only when the run is triggered from the UI; there is no clear information about what is happening internally. The issue also looks generic and related to multiprocessing/forking (this much we understood after looking at related information on the web).
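   The workarounds most often suggested on the web for this exact error on macOS (also unconfirmed for our setup) are environment variables set before starting the scheduler:
   
   ```bash
   # Skip the macOS system proxy lookup: urllib calls _scproxy, which uses
   # CoreFoundation and is not safe to use after fork().
   export NO_PROXY="*"
   
   # Tell the Objective-C runtime not to abort fork()ed children that touch
   # CoreFoundation.
   export OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES
   
   airflow scheduler
   ```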
   

