GitHub user shaealh added a comment to the discussion: Airflow task failed but 
spark kube app is running

@rcrchawla Thanks for sharing the logs. This looks more like an Airflow control 
plane issue than a Spark app issue.

Here's why I say that:
•  Repeated "Client.request" retries
•  "Failed to send heartbeat"
•  An API-server error during "set_xcom -> session.flush()" against MySQL

So the Spark app can keep running while Airflow still marks the task failed: the failure is on the control-plane side (heartbeats and metadata-DB writes), not inside the Spark job itself.
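To make the heartbeat point concrete, here is a toy sketch (not Airflow source; the names HEARTBEAT_TIMEOUT and is_zombie are illustrative assumptions) of how a heartbeat timeout can fail a task whose external job is still alive:

```python
# Simplified model of scheduler-side zombie detection: if a task stops
# sending heartbeats past a threshold, it is declared failed, even though
# the Spark driver pod it launched may still be running.
from datetime import datetime, timedelta

HEARTBEAT_TIMEOUT = timedelta(seconds=300)  # illustrative threshold

def is_zombie(last_heartbeat: datetime, now: datetime) -> bool:
    """True once heartbeats have been missing longer than the threshold."""
    return now - last_heartbeat > HEARTBEAT_TIMEOUT

now = datetime(2025, 1, 1, 12, 0, 0)
print(is_zombie(now - timedelta(seconds=600), now))  # True -> marked failed
print(is_zombie(now - timedelta(seconds=60), now))   # False -> still healthy
```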

Could you share 3 things to confirm?
1) Full MySQL error text (after _mysql.connection.query(...))
2) Exact Airflow task failure reason or state transition
3) The Spark operator and XCom-related task config
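For context on why item 3 matters, here is a hypothetical sketch of the failure mode: the external job launches fine, but the XCom write (an INSERT plus flush to the metadata DB) fails, so the task itself ends in error. It uses sqlite3 purely as a stand-in for the MySQL metadata database; run_task and the schema are invented for illustration.

```python
# Stand-in for the Airflow metadata DB: an XCom table with a constraint
# that the write below will violate, mimicking a MySQL-side error.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE xcom (task_id TEXT NOT NULL, value TEXT NOT NULL)")

def run_task(db):
    spark_submitted = True  # stand-in for the Spark app launching fine
    try:
        # NULL task_id violates NOT NULL, standing in for a metadata-DB
        # error during set_xcom -> session.flush()
        db.execute("INSERT INTO xcom (task_id, value) VALUES (NULL, 'ok')")
    except sqlite3.IntegrityError:
        return spark_submitted, "failed"  # Airflow state; Spark untouched
    return spark_submitted, "success"

print(run_task(db))  # (True, 'failed'): Spark is up, task still fails
```

The point of the sketch is only that the task's fate and the Spark app's fate are decided in two different places, which is why both the MySQL error text and the task config are needed to confirm.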

If this reproduces with a healthy DB/network, then it likely needs an Airflow 
fix. Otherwise it's probably infra/DB pressure.

GitHub link: 
https://github.com/apache/airflow/discussions/63298#discussioncomment-16080166
