Hi airflow-users,

I'm looking to institute some periodic cleanup of the Airflow metadata
database and am heeding the recommendation in the docs to back it up
first. We use Postgres as our back end, and so using the pg_dump utility
seems appropriate. However, pg_dump hits an error:

$ pg_dump airflow_db > airflow_db0.sql
pg_dump: error: query failed: ERROR:  permission denied for table dag_code
pg_dump: detail: Query was: LOCK TABLE public.dag_code IN ACCESS SHARE MODE

Per StackOverflow[1], it sounds like this is probably due to a conflicting
lock (ACCESS EXCLUSIVE) already held on the table, and that there's no way
around it other than running `pg_dump` at a time when that lock isn't in
place. However, I imagine that lock is taken every time the scheduler
parses the DAG code -- i.e. very frequently.

Have others on the list run into this issue? Any suggested workarounds?
One idea might be to back up only the tables *other* than dag_code.
However, I'd prefer not to grab some tables and skip others (and I'm not
sure that would solve the issue, anyhow).
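For reference, if I did end up going the partial route, I believe pg_dump
can exclude tables by pattern, so it would look something like this
(untested):

$ pg_dump --exclude-table=dag_code airflow_db > airflow_db_partial.sql

(I gather there's also --exclude-table-data, which would keep the table
definition but skip its rows.) Still, I'd rather capture everything in one
consistent snapshot if possible.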

Many thanks!

[1]:
https://stackoverflow.com/questions/63485415/schema-pg-dump-failed-due-to-a-lock-on-a-table

-- 
Ben Hancock (he/him)
Senior Product Data Engineer
ALM Media

