Github user zuotingbing commented on the issue:
https://github.com/apache/spark/pull/20864
I took a look at https://github.com/apache/spark/pull/18666, and I found it cannot clean up all of the `*_resources` directories, because when we start HiveThriftServer2 two local resource directories are created:
```
18/03/21 11:23:33 INFO SessionState: Created local directory: /data1/zdh/spark/hive/tmp/616f66c9-fa4e-4a0c-a63a-10ff97e5019c_resources
18/03/21 11:23:33 INFO SessionState: Created HDFS directory: /spark-tmp/scratchdir/root/616f66c9-fa4e-4a0c-a63a-10ff97e5019c
18/03/21 11:23:33 INFO SessionState: Created local directory: /data1/zdh/spark/hive/tmp/616f66c9-fa4e-4a0c-a63a-10ff97e5019c
18/03/21 11:23:33 INFO SessionState: Created HDFS directory: /spark-tmp/scratchdir/root/616f66c9-fa4e-4a0c-a63a-10ff97e5019c/_tmp_space.db
18/03/21 11:23:33 INFO HiveClientImpl: Warehouse location for Hive client (version 1.2.2) is file:/media/A/gitspace/spark/dist/sbin/spark-warehouse
18/03/21 11:23:33 INFO HiveMetaStore: 0: get_database: default
18/03/21 11:23:33 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database: default
18/03/21 11:23:33 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
18/03/21 11:23:33 INFO HiveUtils: Initializing execution hive, version 1.2.1
18/03/21 11:23:34 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
18/03/21 11:23:34 INFO ObjectStore: ObjectStore, initialize called
18/03/21 11:23:34 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
18/03/21 11:23:34 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
18/03/21 11:23:36 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
18/03/21 11:23:36 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
18/03/21 11:23:36 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
18/03/21 11:23:37 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
18/03/21 11:23:37 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
18/03/21 11:23:37 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
18/03/21 11:23:37 INFO ObjectStore: Initialized ObjectStore
18/03/21 11:23:37 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
18/03/21 11:23:38 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
18/03/21 11:23:38 INFO HiveMetaStore: Added admin role in metastore
18/03/21 11:23:38 INFO HiveMetaStore: Added public role in metastore
18/03/21 11:23:38 INFO HiveMetaStore: No user is added in admin role, since config is empty
18/03/21 11:23:38 INFO HiveMetaStore: 0: get_all_databases
18/03/21 11:23:38 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_all_databases
18/03/21 11:23:38 INFO HiveMetaStore: 0: get_functions: db=default pat=*
18/03/21 11:23:38 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_functions: db=default pat=*
18/03/21 11:23:38 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
18/03/21 11:23:38 INFO SessionState: Created local directory: /data1/zdh/spark/hive/tmp/16aa5bb9-33e4-43e6-8bdb-8e0318ab175e_resources
18/03/21 11:23:38 INFO SessionState: Created HDFS directory: /spark-tmp/scratchdir/root/16aa5bb9-33e4-43e6-8bdb-8e0318ab175e
18/03/21 11:23:38 INFO SessionState: Created local directory: /data1/zdh/spark/hive/tmp/16aa5bb9-33e4-43e6-8bdb-8e0318ab175e
18/03/21 11:23:38 INFO SessionState: Created HDFS directory: /spark-tmp/scratchdir/root/16aa5bb9-33e4-43e6-8bdb-8e0318ab175e/_tmp_space.db
18/03/21 11:23:38 INFO HiveClientImpl: Warehouse location for Hive client (version 1.2.2) is file:/media/A/gitspace/spark/dist/sbin/spark-warehouse
```
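A quick way to confirm the leftovers (just a sketch; `/data1/zdh/spark/hive/tmp` is the local directory taken from the log above and would differ in other setups) is to list the `*_resources` directories right after startup; both of them show up:

```java
import java.io.File;

public class ListResourceDirs {
  public static void main(String[] args) {
    // Local Hive tmp dir taken from the log above; adjust for your environment.
    File hiveTmp = new File("/data1/zdh/spark/hive/tmp");
    // Lists every <uuid>_resources directory left behind after startup.
    File[] leftovers = hiveTmp.listFiles((dir, name) -> name.endsWith("_resources"));
    if (leftovers != null) {
      for (File f : leftovers) {
        System.out.println(f.getAbsolutePath());
      }
    }
  }
}
```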
But when the server stops, only the current session's resource directory is removed, in `SessionState.close()`:
```java
public void close() throws IOException {
  registry.clear();
  if (txnMgr != null) txnMgr.closeTxnManager();
  JavaUtils.closeClassLoadersTo(conf.getClassLoader(), parentLoader);
  File resourceDir =
      new File(getConf().getVar(HiveConf.ConfVars.DOWNLOADED_RESOURCES_DIR));
  LOG.debug("Removing resource dir " + resourceDir);
  // ...
```