Hello,

I'm using spark-3.0.0-bin-hadoop3.2 with a custom Hive metastore DB
(Postgres). I'm setting the "datanucleus.schema.autoCreateAll" flag to true,
so Hive creates its relational schema on first use. The problem is that the
schema creation deadlocks and the query hangs forever:
Tx1 (holds lock on TBLS relation, wait_event: ClientRead):

    INSERT INTO "PARTITION_KEYS" ("TBL_ID","PKEY_COMMENT","PKEY_NAME","PKEY_TYPE","INTEGER_IDX") VALUES ($1,$2,$3,$4,$5)

Tx2 (waiting for lock on TBLS relation, wait_event: relation):

    ALTER TABLE "TBL_PRIVS" ADD CONSTRAINT "TBL_PRIVS_FK1" FOREIGN KEY ("TBL_ID") REFERENCES "TBLS" ("TBL_ID") INITIALLY DEFERRED

So Tx1 is idle waiting on a client read, while Tx2 waits for Tx1's lock on
TBLS. I'm attaching the stack trace of the active thread reading from Postgres.
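
The blocked/blocker pairing above can be confirmed from a second session while the CREATE TABLE hangs. This is just a sketch of how one can inspect it, assuming the dockerized Postgres from the repro steps below and Postgres 9.6+ (for pg_blocking_pids); the column aliases are mine:

```shell
# Show who blocks whom while the metastore query hangs.
# Assumes the postgres:13 container from the repro steps (user/password "postgres").
PGPASSWORD=postgres psql -h localhost -U postgres -d postgres -c "
SELECT blocked.pid              AS blocked_pid,
       blocked.wait_event       AS blocked_wait_event,
       left(blocked.query, 60)  AS blocked_query,
       blocker.pid              AS blocker_pid,
       blocker.wait_event       AS blocker_wait_event,
       left(blocker.query, 60)  AS blocker_query
FROM pg_stat_activity AS blocked
JOIN pg_stat_activity AS blocker
  ON blocker.pid = ANY (pg_blocking_pids(blocked.pid));"
```

In this case it should show the ALTER TABLE session blocked by the pid running the INSERT, whose wait_event is ClientRead.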

Steps to reproduce:
#start postgres
docker run --name postgres-hive -d -p 5432:5432 -e POSTGRES_PASSWORD=postgres postgres:13

#run shell with hive metastore DB pointing to postgres
./bin/spark-shell \
  --packages org.postgresql:postgresql:42.2.14 \
  --conf spark.hadoop.javax.jdo.option.ConnectionURL="jdbc:postgresql://localhost/postgres?createDatabaseIfNotExist=true" \
  --conf spark.hadoop.javax.jdo.option.ConnectionDriverName=org.postgresql.Driver \
  --conf spark.hadoop.javax.jdo.option.ConnectionUserName=postgres \
  --conf spark.hadoop.javax.jdo.option.ConnectionPassword=postgres \
  --conf spark.hadoop.hive.metastore.schema.verification=false \
  --conf spark.hadoop.datanucleus.schema.autoCreateAll=true \
  --conf spark.hadoop.datanucleus.fixedDatastore=false

#Create any hive table (REPL continuation prompts stripped)
spark.sql("""
  CREATE TABLE parquet_test (
    id int,
    str string,
    mp MAP<STRING,STRING>,
    lst ARRAY<STRING>,
    strct STRUCT<A:STRING,B:STRING>)
  PARTITIONED BY (part string)
  STORED AS PARQUET
""")
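
A possible workaround, which I have not verified: create the metastore schema up front with Hive's schematool so that autoCreateAll never has to issue DDL concurrently with the metastore inserts. A sketch, assuming a local Hive 2.3.x distribution (Spark 3.0 bundles the Hive 2.3 client); HIVE_HOME here is a placeholder:

```shell
# Hypothetical workaround: initialize the metastore schema before starting spark-shell.
# HIVE_HOME is a placeholder; point it at a Hive 2.3.x install that ships schematool.
export HIVE_HOME=/opt/apache-hive-2.3.7-bin
"$HIVE_HOME"/bin/schematool -dbType postgres -initSchema \
  -url "jdbc:postgresql://localhost/postgres" \
  -driver org.postgresql.Driver \
  -userName postgres -passWord postgres
```

With the schema pre-created, the ALTER TABLE ... ADD CONSTRAINT from Tx2 never races the INSERT from Tx1, and datanucleus.schema.autoCreateAll can be left at its default (false).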

BR,
Tomas
"main@1" prio=5 tid=0x1 nid=NA runnable
  java.lang.Thread.State: RUNNABLE
          at java.net.SocketInputStream.socketRead0(SocketInputStream.java:-1)
          at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
          at java.net.SocketInputStream.read(SocketInputStream.java:171)
          at java.net.SocketInputStream.read(SocketInputStream.java:141)
          at org.postgresql.core.VisibleBufferedInputStream.readMore(VisibleBufferedInputStream.java:161)
          at org.postgresql.core.VisibleBufferedInputStream.ensureBytes(VisibleBufferedInputStream.java:128)
          at org.postgresql.core.VisibleBufferedInputStream.ensureBytes(VisibleBufferedInputStream.java:113)
          at org.postgresql.core.VisibleBufferedInputStream.read(VisibleBufferedInputStream.java:73)
          at org.postgresql.core.PGStream.receiveChar(PGStream.java:370)
          at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2043)
          at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:312)
          - locked <0x3687> (a org.postgresql.core.v3.QueryExecutorImpl)
          at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:448)
          at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:369)
          at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:310)
          at org.postgresql.jdbc.PgStatement.executeCachedSql(PgStatement.java:296)
          at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:273)
          at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:268)
          at com.jolbox.bonecp.StatementHandle.execute(StatementHandle.java:254)
          at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:879)
          at org.datanucleus.store.rdbms.table.TableImpl.createForeignKeys(TableImpl.java:522)
          at org.datanucleus.store.rdbms.table.TableImpl.createConstraints(TableImpl.java:426)
          at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3466)
          at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2896)
          at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:119)
          at org.datanucleus.store.rdbms.RDBMSStoreManager.manageClasses(RDBMSStoreManager.java:1627)
          at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:672)
          at org.datanucleus.store.rdbms.RDBMSStoreManager.getPropertiesForGenerator(RDBMSStoreManager.java:2088)
          at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1271)
          - locked <0x3688> (a org.datanucleus.store.rdbms.RDBMSStoreManager)
          at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3760)
          at org.datanucleus.state.StateManagerImpl.setIdentity(StateManagerImpl.java:2267)
          at org.datanucleus.state.StateManagerImpl.initialiseForPersistentNew(StateManagerImpl.java:484)
          at org.datanucleus.state.StateManagerImpl.initialiseForPersistentNew(StateManagerImpl.java:120)
          at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:218)
          at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2079)
          at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:1923)
          at org.datanucleus.ExecutionContextImpl.persistObjects(ExecutionContextImpl.java:1868)
          at org.datanucleus.ExecutionContextThreadedImpl.persistObjects(ExecutionContextThreadedImpl.java:231)
          at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistentAll(JDOPersistenceManager.java:773)
          at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:1011)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(NativeMethodAccessorImpl.java:-1)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:498)
          at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101)
          at com.sun.proxy.$Proxy20.createTable(Unknown Source:-1)
          at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1457)
          at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1503)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(NativeMethodAccessorImpl.java:-1)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:498)
          at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
          at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
          at com.sun.proxy.$Proxy21.create_table_with_environment_context(Unknown Source:-1)
          at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2405)
          at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.create_table_with_environment_context(SessionHiveMetaStoreClient.java:93)
          at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:752)
          at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:740)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(NativeMethodAccessorImpl.java:-1)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:498)
          at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173)
          at com.sun.proxy.$Proxy22.createTable(Unknown Source:-1)
          at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:852)
          at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:867)
          at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$createTable$1(HiveClientImpl.scala:548)
          at org.apache.spark.sql.hive.client.HiveClientImpl$$Lambda$2280.467312945.apply$mcV$sp(Unknown Source:-1)
          at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
          at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:294)
          at org.apache.spark.sql.hive.client.HiveClientImpl$$Lambda$2231.358052064.apply(Unknown Source:-1)
          at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:227)
          at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:226)
          - locked <0x3689> (a org.apache.spark.sql.hive.client.IsolatedClientLoader)
          at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:276)
          at org.apache.spark.sql.hive.client.HiveClientImpl.createTable(HiveClientImpl.scala:546)
          at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$createTable$1(HiveExternalCatalog.scala:284)
          at org.apache.spark.sql.hive.HiveExternalCatalog$$Lambda$2250.1058380873.apply$mcV$sp(Unknown Source:-1)
          at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
          at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:99)
          - locked <0x368a> (a org.apache.spark.sql.hive.HiveExternalCatalog)
          at org.apache.spark.sql.hive.HiveExternalCatalog.createTable(HiveExternalCatalog.scala:242)
          at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.createTable(ExternalCatalogWithListener.scala:94)
          at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:326)
          at org.apache.spark.sql.execution.command.CreateTableCommand.run(tables.scala:165)
          at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
          - locked <0x368b> (a org.apache.spark.sql.execution.command.ExecutedCommandExec)
          at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
          at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
          at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:229)
          at org.apache.spark.sql.Dataset$$Lambda$2102.790037960.apply(Unknown Source:-1)
          at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3616)
          at org.apache.spark.sql.Dataset$$Lambda$2103.1184851239.apply(Unknown Source:-1)
          at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:100)
          at org.apache.spark.sql.execution.SQLExecution$$$Lambda$2111.1817337570.apply(Unknown Source:-1)
          at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:160)
          at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:87)
          at org.apache.spark.sql.execution.SQLExecution$$$Lambda$2104.1491717274.apply(Unknown Source:-1)
          at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:763)
          at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
          at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3614)
          at org.apache.spark.sql.Dataset.<init>(Dataset.scala:229)
          at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:100)
          at org.apache.spark.sql.Dataset$$$Lambda$1963.1279809101.apply(Unknown Source:-1)
          at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:763)
          at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97)
          at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:606)
          at org.apache.spark.sql.SparkSession$$Lambda$1848.2086898471.apply(Unknown Source:-1)
          at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:763)
          at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:601)
          at $line14.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:24)
          at $line14.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:37)
          at $line14.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:39)
          at $line14.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:41)
          at $line14.$read$$iw$$iw$$iw$$iw.<init>(<console>:43)
          at $line14.$read$$iw$$iw$$iw.<init>(<console>:45)
          at $line14.$read$$iw$$iw.<init>(<console>:47)
          at $line14.$read$$iw.<init>(<console>:49)
          at $line14.$read.<init>(<console>:51)
          at $line14.$read$.<init>(<console>:55)
          at $line14.$read$.<clinit>(<console>:-1)
          at $line14.$eval$.$print$lzycompute(<console>:7)
          - locked <0x368c> (a $line14.$eval$)
          at $line14.$eval$.$print(<console>:6)
          at $line14.$eval.$print(<console>:-1)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(NativeMethodAccessorImpl.java:-1)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:498)
          at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:745)
          at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1021)
          at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:574)
          at scala.tools.nsc.interpreter.IMain$$Lambda$1230.1632468927.apply(Unknown Source:-1)
          at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:41)
          at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:37)
          at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
          at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
          at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:600)
          at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:570)
          at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:894)
          at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:912)
          at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:912)
          at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:912)
          at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:912)
          at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:912)
          at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:912)
          at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:912)
          at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:912)
          at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:912)
          at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:762)
          at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:464)
          at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:485)
          at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:239)
          at org.apache.spark.repl.Main$.doMain(Main.scala:78)
          at org.apache.spark.repl.Main$.main(Main.scala:58)
          at org.apache.spark.repl.Main.main(Main.scala:-1)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(NativeMethodAccessorImpl.java:-1)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:498)
          at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
          at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
          at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
          at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
          at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
          at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
          at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
          at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala:-1)