echang0929 opened a new issue #16336: URL: https://github.com/apache/airflow/issues/16336
**Apache Airflow version**: 2.0.2

**Environment**:

- **OS** (e.g. from /etc/os-release): macOS 11.4
- **Kernel** (e.g. `uname -a`): Darwin Timothys 20.5.0 Darwin Kernel Version 20.5.0: Sat May 8 05:10:33 PDT 2021; root:xnu-7195.121.3~9/RELEASE_X86_64 x86_64
- **Install tools**: docker-compose version 1.29.1, build c34c88b2
- **Others**:

**What happened**:

When I follow the guide to initialize the Airflow environment (https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html#initializing-environment), I get ERROR entries in the PostgreSQL logs:

```
➜ docker logs aflow-std_postgres_1 -f
...
2021-06-08 17:47:30.212 UTC [99] ERROR: relation "log" does not exist at character 13
2021-06-08 17:47:30.212 UTC [99] STATEMENT: INSERT INTO log (dttm, dag_id, task_id, event, execution_date, owner, extra) VALUES ('2021-06-08T17:47:30.198438+00:00'::timestamptz, NULL, NULL, 'cli_upgradedb', NULL, 'default', '{"host_name": "c11ebd81f1d9", "full_command": "[''/home/airflow/.local/bin/airflow'', ''db'', ''upgrade'']"}') RETURNING log.id
2021-06-08 17:47:30.449 UTC [99] ERROR: relation "connection" does not exist at character 55
2021-06-08 17:47:30.449 UTC [99] STATEMENT: SELECT connection.conn_id AS connection_conn_id FROM connection GROUP BY connection.conn_id HAVING count(*) > 1
2021-06-08 17:47:30.451 UTC [99] ERROR: current transaction is aborted, commands ignored until end of transaction block
2021-06-08 17:47:30.451 UTC [99] STATEMENT: SELECT connection.password AS connection_password, connection.extra AS connection_extra, connection.id AS connection_id, connection.conn_id AS connection_conn_id, connection.conn_type AS connection_conn_type, connection.description AS connection_description, connection.host AS connection_host, connection.schema AS connection_schema, connection.login AS connection_login, connection.port AS connection_port, connection.is_encrypted AS connection_is_encrypted, connection.is_extra_encrypted AS connection_is_extra_encrypted FROM connection WHERE connection.conn_type IS NULL
```

**What you expected to happen**:

**How to reproduce it**:
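For reference, the initialization from the linked quick-start guide amounts to roughly the following. This is a sketch rather than an exact shell history; the docker-compose.yaml URL and the `.env` contents are the ones the 2.0.x quick-start describes and may differ for other versions or setups.

```bash
# Fetch the reference compose file for Airflow 2.0.2 (URL per the quick-start guide)
curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.0.2/docker-compose.yaml'

# Create the mounted directories and record the host user id so they stay writable
mkdir -p ./dags ./logs ./plugins
echo -e "AIRFLOW_UID=$(id -u)\nAIRFLOW_GID=0" > .env

# Run the one-off init service, which runs `airflow db upgrade`
# and creates the default admin user
docker-compose up airflow-init
```

The PostgreSQL errors above show up during that last step, while `airflow db upgrade` is running (the failing INSERT logs `"full_command": "['airflow', 'db', 'upgrade']"`).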
**Anything else we need to know**:

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
