abhinav-bureau opened a new issue, #25335:
URL: https://github.com/apache/superset/issues/25335

   I'm deploying Superset using Helm chart version superset-0.10.4 with Superset version 2.1.0, and the init db pod is failing while running the DB migration.
   
   I have tried the above with both a standalone Postgres instance in the cloud and one on k8s; in both cases it fails with the same error.
   
   ```
   INFO  [alembic.runtime.migration] Running upgrade 3317e9248280 -> 030c840e3a1c, Add query context to slices
   Loaded your LOCAL configuration at [/app/pythonpath/superset_config.py]
   
   Cleaning up slice uuid from dashboard position json.. Done.
   
   Updated 0 pie chart labels.
   0 slices altered
   
   Cleaning up slice uuid from dashboard position json.. Done.
   
   Updated 0 native filter configurations.
   Updated 0 filter sets with 0 filters.
   Upgraded 0 filters and 0 filter sets.
   Traceback (most recent call last):
     File "/usr/local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1770, in _execute_context
       self.dialect.do_execute(
     File "/usr/local/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 717, in do_execute
       cursor.execute(statement, parameters)
   psycopg2.errors.ObjectInUse: cannot ALTER TABLE "slices" because it is being used by active queries in this session
   
   
   The above exception was the direct cause of the following exception:
   ```
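   
   For debugging, this is a quick check I can run against the metadata database to see whether any other session is holding a lock on the "slices" table while the migration runs (the error mentions the current session, but I wanted to rule out other connections; the connection details below are placeholders for my environment):
   
   ```python
   import psycopg2
   
   # Placeholder connection details for the Superset metadata database.
   conn = psycopg2.connect(
       host="superset-postgresql",
       dbname="superset",
       user="superset",
       password="superset",
   )
   with conn, conn.cursor() as cur:
       # List other sessions currently holding locks on the "slices" table,
       # which is the table the ObjectInUse error complains about.
       cur.execute(
           """
           SELECT a.pid, a.state, a.query
           FROM pg_locks l
           JOIN pg_class c ON c.oid = l.relation
           JOIN pg_stat_activity a ON a.pid = l.pid
           WHERE c.relname = 'slices' AND a.pid <> pg_backend_pid();
           """
       )
       for pid, state, query in cur.fetchall():
           print(pid, state, query)
   conn.close()
   ```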
   
   These are the packages we are installing using pip:
   sqlalchemy-redshift
   clickhouse-connect
   PyAthena==2.25.0
   dbapi
   pyathenajdbc==3.0.1
   SQLAlchemy==1.4.20
   apache-superset[cors]
   timedelta
   authlib
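   
   To confirm what actually ends up installed in the init container after the bootstrap step, this is a minimal version check I can run inside the pod (standard library only; the package names are just the ones listed above):
   
   ```python
   from importlib.metadata import PackageNotFoundError, version
   
   # Packages added in the bootstrap step, plus Superset itself.
   packages = [
       "sqlalchemy-redshift",
       "clickhouse-connect",
       "PyAthena",
       "dbapi",
       "pyathenajdbc",
       "SQLAlchemy",
       "apache-superset",
       "timedelta",
       "authlib",
   ]
   
   for name in packages:
       try:
           print(f"{name}=={version(name)}")
       except PackageNotFoundError:
           print(f"{name}: not installed")
   ```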
   
   
   Any idea how to fix the above issue?
   
   

