[
https://issues.apache.org/jira/browse/HIVE-23221?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Miklos Gergely updated HIVE-23221:
----------------------------------
Description: These tests are failing in a nondeterministic fashion with
error messages such as the one attached. (was: These tests are failing in a
nondeterministic fashion with error messages such as the following:
{code:java}
Error Message
No thread with name metastore_task_thread_test_impl_1 found.

Stacktrace
java.lang.AssertionError: No thread with name metastore_task_thread_test_impl_1 found.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.hive.metastore.TestMetastoreHousekeepingLeaderEmptyConfig.testHouseKeepingThreadExistence(TestMetastoreHousekeepingLeaderEmptyConfig.java:51)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:379)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:340)
	at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:125)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:413)
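// For context: the assertion above fails when no live thread named
// "metastore_task_thread_test_impl_1" is found. A minimal, hypothetical sketch of
// that kind of check (assumed helper, not the actual Hive test code) is:
//
//   import java.util.Map;
//   import static org.junit.Assert.assertTrue;
//
//   class ThreadExistenceCheck {
//     // Scan all live threads and assert that one with the given name prefix exists.
//     static void assertThreadExists(String namePrefix) {
//       Map<Thread, StackTraceElement[]> all = Thread.getAllStackTraces();
//       boolean found = all.keySet().stream()
//           .anyMatch(t -> t.getName().startsWith(namePrefix));
//       assertTrue("No thread with name " + namePrefix + " found.", found);
//     }
//   }
//
// If the housekeeping threads are started asynchronously after the metastore comes
// up, a check like this can race with thread startup and fail intermittently,
// which would match the flaky behavior described above.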
Standard Output
Formatting using clusterid: testClusterID
Starting MetaStore Server on port 57893
Standard Error
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/home/hiveptest/35.224.55.83-hiveptest-2/maven/org/apache/logging/log4j/log4j-slf4j-impl/2.12.1/log4j-slf4j-impl-2.12.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/home/hiveptest/35.224.55.83-hiveptest-2/maven/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
DEBUG StatusLogger Using ShutdownCallbackRegistry class
org.apache.logging.log4j.core.util.DefaultShutdownCallbackRegistry
DEBUG StatusLogger Took 0.073789 seconds to load 222 plugins from
sun.misc.Launcher$AppClassLoader@135fbaa4
DEBUG StatusLogger PluginManager 'Converter' found 44 plugins
DEBUG StatusLogger Starting OutputStreamManager SYSTEM_OUT.false.false-1
DEBUG StatusLogger Starting LoggerContext[name=135fbaa4,
org.apache.logging.log4j.core.LoggerContext@223191a6]...
DEBUG StatusLogger Reconfiguration started for context[name=135fbaa4] at URI
null (org.apache.logging.log4j.core.LoggerContext@223191a6) with optional
ClassLoader: null
DEBUG StatusLogger PluginManager 'ConfigurationFactory' found 4 plugins
DEBUG StatusLogger Missing dependencies for Yaml support, ConfigurationFactory
org.apache.logging.log4j.core.config.yaml.YamlConfigurationFactory is inactive
DEBUG StatusLogger Using configurationFactory
org.apache.logging.log4j.core.config.ConfigurationFactory$Factory@4135c3b
DEBUG StatusLogger Apache Log4j Core 2.12.1 initializing configuration
org.apache.logging.log4j.core.config.properties.PropertiesConfiguration@557caf28
DEBUG StatusLogger Installed 3 script engines
DEBUG StatusLogger Scala Interpreter version: 1.0, language: Scala, threading:
Not Thread Safe, compile: true, names: [scala], factory class:
scala.tools.nsc.interpreter.IMain$Factory
DEBUG StatusLogger Groovy Scripting Engine version: 2.0, language: Groovy,
threading: MULTITHREADED, compile: true, names: [groovy, Groovy], factory
class: org.codehaus.groovy.jsr223.GroovyScriptEngineFactory
DEBUG StatusLogger Oracle Nashorn version: 1.8.0_102, language: ECMAScript,
threading: Not Thread Safe, compile: true, names: [nashorn, Nashorn, js, JS,
JavaScript, javascript, ECMAScript, ecmascript], factory class:
jdk.nashorn.api.scripting.NashornScriptEngineFactory
INFO StatusLogger Scanning for classes in
'/home/hiveptest/35.224.55.83-hiveptest-2/maven/org/apache/hive/hive-common/4.0.0-SNAPSHOT/hive-common-4.0.0-SNAPSHOT.jar'
matching criteria annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.PerfLogger matches criteria annotated with @Plugin
INFO StatusLogger Scanning for classes in
'/home/hiveptest/35.224.55.83-hiveptest-2/maven/org/apache/hive/hive-exec/4.0.0-SNAPSHOT/hive-exec-4.0.0-SNAPSHOT.jar'
matching criteria annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.HiveEventCounter$1 matches criteria annotated
with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.LogDivertAppenderForTest matches criteria
annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.syslog.SyslogSerDe matches criteria annotated
with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.syslog.SyslogInputFormat$Location matches
criteria annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.syslog.SyslogInputFormat matches criteria
annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.syslog.SyslogParser matches criteria annotated
with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.syslog.SyslogStorageHandler matches criteria
annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.LogDivertAppenderForTest$TestFilter matches
criteria annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.LogDivertAppender matches criteria annotated with
@Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.PidFilePatternConverter matches criteria
annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.HiveEventCounter matches criteria annotated with
@Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.HiveEventCounter$EventCounts matches criteria
annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.NullAppender matches criteria annotated with
@Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.HushableRandomAccessFileAppender$1 matches
criteria annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.LogDivertAppender$NameFilter matches criteria
annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.SlidingFilenameRolloverStrategy$1 matches
criteria annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.SlidingFilenameRolloverStrategy matches criteria
annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.HushableRandomAccessFileAppender matches criteria
annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.PerfLogger matches criteria annotated with @Plugin
INFO StatusLogger Scanning for classes in
'/home/hiveptest/35.224.55.83-hiveptest-2/maven/org/apache/hive/hive-exec/4.0.0-SNAPSHOT/hive-exec-4.0.0-SNAPSHOT-tests.jar'
matching criteria annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.TestLog4j2Appenders matches criteria annotated
with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.TestSlidingFilenameRolloverStrategy matches
criteria annotated with @Plugin
DEBUG StatusLogger Checking to see if class
org.apache.hadoop.hive.ql.log.TestSyslogInputFormat matches criteria annotated
with @Plugin
DEBUG StatusLogger Took 0.061118 seconds to load 7 plugins from package
org.apache.hadoop.hive.ql.log
DEBUG StatusLogger PluginManager 'Core' found 129 plugins
DEBUG StatusLogger PluginManager 'Level' found 0 plugins
DEBUG StatusLogger Building Plugin[name=property,
class=org.apache.logging.log4j.core.config.Property].
TRACE StatusLogger TypeConverterRegistry initializing.
DEBUG StatusLogger PluginManager 'TypeConverter' found 26 plugins
DEBUG StatusLogger createProperty(name="hive.log.file", value="hive.log")
DEBUG StatusLogger Building Plugin[name=property,
class=org.apache.logging.log4j.core.config.Property].
DEBUG StatusLogger createProperty(name="hive.log.dir",
value="/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/log")
DEBUG StatusLogger Building Plugin[name=property,
class=org.apache.logging.log4j.core.config.Property].
DEBUG StatusLogger createProperty(name="hive.root.logger", value="DRFA")
DEBUG StatusLogger Building Plugin[name=property,
class=org.apache.logging.log4j.core.config.Property].
DEBUG StatusLogger createProperty(name="hive.log.level", value="DEBUG")
DEBUG StatusLogger Building Plugin[name=property,
class=org.apache.logging.log4j.core.config.Property].
DEBUG StatusLogger createProperty(name="hive.test.console.log.level",
value="INFO")
DEBUG StatusLogger Building Plugin[name=properties,
class=org.apache.logging.log4j.core.config.PropertiesPlugin].
DEBUG StatusLogger configureSubstitutor(={hive.log.file=hive.log,
hive.log.dir=/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/log,
hive.root.logger=DRFA, hive.log.level=DEBUG,
hive.test.console.log.level=INFO}, Configuration(HiveLog4j2Test))
DEBUG StatusLogger PluginManager 'Lookup' found 14 plugins
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="WARN",
name="org.apache.hadoop.ipc", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="INFO",
name="org.apache.hadoop.security", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="INFO",
name="org.apache.hadoop.hdfs", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="WARN",
name="org.apache.hadoop.hdfs.server", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="INFO",
name="org.apache.hadoop.metrics2", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="INFO",
name="org.mortbay", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="INFO",
name="org.apache.hadoop.yarn", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="WARN",
name="org.apache.hadoop.yarn.server", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="INFO",
name="org.apache.tez", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="ERROR",
name="org.apache.hadoop.conf.Configuration", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="INFO",
name="org.apache.zookeeper", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="WARN",
name="org.apache.zookeeper.server.ServerCnxn", includeLocation="null", ={},
={}, Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="WARN",
name="org.apache.zookeeper.server.NIOServerCnxn", includeLocation="null", ={},
={}, Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="WARN",
name="org.apache.zookeeper.ClientCnxn", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="WARN",
name="org.apache.zookeeper.ClientCnxnSocket", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="WARN",
name="org.apache.zookeeper.ClientCnxnSocketNIO", includeLocation="null", ={},
={}, Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="ERROR",
name="DataNucleus", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="ERROR",
name="Datastore", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="ERROR", name="JPOX",
includeLocation="null", ={}, ={}, Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="INFO",
name="org.apache.hadoop.hive.ql.exec.Operator", includeLocation="null", ={},
={}, Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="INFO",
name="org.apache.hadoop.hive.serde2.lazy", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="INFO",
name="org.apache.hadoop.hive.metastore.ObjectStore", includeLocation="null",
={}, ={}, Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="INFO",
name="org.apache.calcite.plan.RelOptPlanner", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="INFO",
name="com.amazonaws", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="INFO",
name="org.apache.http", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="WARN",
name="org.apache.thrift", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="WARN",
name="org.eclipse.jetty", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=logger,
class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="true", level="WARN",
name="BlockStateChange", includeLocation="null", ={}, ={},
Configuration(HiveLog4j2Test), Filter=null)
DEBUG StatusLogger Building Plugin[name=AppenderRef,
class=org.apache.logging.log4j.core.config.AppenderRef].
DEBUG StatusLogger createAppenderRef(ref="console", level="INFO", Filter=null)
DEBUG StatusLogger Building Plugin[name=AppenderRef,
class=org.apache.logging.log4j.core.config.AppenderRef].
DEBUG StatusLogger createAppenderRef(ref="DRFA", level="null", Filter=null)
DEBUG StatusLogger Building Plugin[name=root,
class=org.apache.logging.log4j.core.config.LoggerConfig$RootLogger].
DEBUG StatusLogger createLogger(additivity="null", level="DEBUG",
includeLocation="null", ={console, DRFA}, ={}, Configuration(HiveLog4j2Test),
Filter=null)
DEBUG StatusLogger Building Plugin[name=loggers,
class=org.apache.logging.log4j.core.config.LoggersPlugin].
DEBUG StatusLogger createLoggers(={org.apache.hadoop.ipc,
org.apache.hadoop.security, org.apache.hadoop.hdfs,
org.apache.hadoop.hdfs.server, org.apache.hadoop.metrics2, org.mortbay,
org.apache.hadoop.yarn, org.apache.hadoop.yarn.server, org.apache.tez,
org.apache.hadoop.conf.Configuration, org.apache.zookeeper,
org.apache.zookeeper.server.ServerCnxn,
org.apache.zookeeper.server.NIOServerCnxn, org.apache.zookeeper.ClientCnxn,
org.apache.zookeeper.ClientCnxnSocket,
org.apache.zookeeper.ClientCnxnSocketNIO, DataNucleus, Datastore, JPOX,
org.apache.hadoop.hive.ql.exec.Operator, org.apache.hadoop.hive.serde2.lazy,
org.apache.hadoop.hive.metastore.ObjectStore,
org.apache.calcite.plan.RelOptPlanner, com.amazonaws, org.apache.http,
org.apache.thrift, org.eclipse.jetty, BlockStateChange, root})
DEBUG StatusLogger Building Plugin[name=layout,
class=org.apache.logging.log4j.core.layout.PatternLayout].
DEBUG StatusLogger PatternLayout$Builder(pattern="%d{ISO8601} %5p [%t] %c{2}:
%m%n", PatternSelector=null, Configuration(HiveLog4j2Test), Replace=null,
charset="null", alwaysWriteExceptions="null", disableAnsi="null",
noConsoleNoAnsi="null", header="null", footer="null")
DEBUG StatusLogger PluginManager 'Converter' found 44 plugins
DEBUG StatusLogger Building Plugin[name=appender,
class=org.apache.logging.log4j.core.appender.ConsoleAppender].
DEBUG StatusLogger ConsoleAppender$Builder(target="SYSTEM_ERR", follow="null",
direct="null", bufferedIo="null", bufferSize="null", immediateFlush="null",
ignoreExceptions="null", PatternLayout(%d{ISO8601} %5p [%t] %c{2}: %m%n),
name="console", Configuration(HiveLog4j2Test), Filter=null, ={})
DEBUG StatusLogger Starting OutputStreamManager SYSTEM_ERR.false.false
DEBUG StatusLogger Building Plugin[name=layout,
class=org.apache.logging.log4j.core.layout.PatternLayout].
DEBUG StatusLogger PatternLayout$Builder(pattern="%d{ISO8601} %5p [%t] %c{2}:
%m%n", PatternSelector=null, Configuration(HiveLog4j2Test), Replace=null,
charset="null", alwaysWriteExceptions="null", disableAnsi="null",
noConsoleNoAnsi="null", header="null", footer="null")
DEBUG StatusLogger Building Plugin[name=TimeBasedTriggeringPolicy,
class=org.apache.logging.log4j.core.appender.rolling.TimeBasedTriggeringPolicy].
DEBUG StatusLogger TimeBasedTriggeringPolicy$Builder(interval="1",
modulate="true", maxRandomDelay="null")
DEBUG StatusLogger Building Plugin[name=Policies,
class=org.apache.logging.log4j.core.appender.rolling.CompositeTriggeringPolicy].
DEBUG StatusLogger
createPolicy(={TimeBasedTriggeringPolicy(nextRolloverMillis=0, interval=1,
modulate=true)})
DEBUG StatusLogger Building Plugin[name=DefaultRolloverStrategy,
class=org.apache.logging.log4j.core.appender.rolling.DefaultRolloverStrategy].
DEBUG StatusLogger DefaultRolloverStrategy$Builder(max="30", min="null",
fileIndex="null", compressionLevel="null", ={},
stopCustomActionsOnError="null", tempCompressedFilePattern="null",
Configuration(HiveLog4j2Test))
DEBUG StatusLogger Building Plugin[name=appender,
class=org.apache.logging.log4j.core.appender.RollingRandomAccessFileAppender].
DEBUG StatusLogger
RollingRandomAccessFileAppender$Builder(fileName="/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/log/hive.log",
filePattern="/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/log/hive.log.%d{yyyy-MM-dd}",
append="null",
Policies(CompositeTriggeringPolicy(policies=[TimeBasedTriggeringPolicy(nextRolloverMillis=0,
interval=1, modulate=true)])),
DefaultRolloverStrategy(DefaultRolloverStrategy(min=1, max=30, useMax=true)),
advertise="null", advertiseURI="null", filePermissions="null",
fileOwner="null", fileGroup="null", bufferedIo="null", bufferSize="null",
immediateFlush="null", ignoreExceptions="null", PatternLayout(%d{ISO8601} %5p
[%t] %c{2}: %m%n), name="DRFA", Configuration(HiveLog4j2Test), Filter=null, ={})
TRACE StatusLogger RandomAccessFile
/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/log/hive.log
seek to 0
DEBUG StatusLogger Starting RollingRandomAccessFileManager
/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/log/hive.log
DEBUG StatusLogger PluginManager 'FileConverter' found 3 plugins
DEBUG StatusLogger Setting prev file time to 2020-04-15T20:53:05.716-0700
DEBUG StatusLogger Initializing triggering policy
CompositeTriggeringPolicy(policies=[TimeBasedTriggeringPolicy(nextRolloverMillis=0,
interval=1, modulate=true)])
DEBUG StatusLogger Initializing triggering policy
TimeBasedTriggeringPolicy(nextRolloverMillis=0, interval=1, modulate=true)
TRACE StatusLogger PatternProcessor.getNextTime returning
2020/04/16-00:00:00.000, nextFileTime=2020/04/15-00:00:00.000,
prevFileTime=1969/12/31-16:00:00.000, current=2020/04/15-20:53:05.728,
freq=DAILY
TRACE StatusLogger PatternProcessor.getNextTime returning
2020/04/16-00:00:00.000, nextFileTime=2020/04/15-00:00:00.000,
prevFileTime=2020/04/15-00:00:00.000, current=2020/04/15-20:53:05.729,
freq=DAILY
DEBUG StatusLogger Building Plugin[name=appenders,
class=org.apache.logging.log4j.core.config.AppendersPlugin].
DEBUG StatusLogger createAppenders(={console, DRFA})
DEBUG StatusLogger Configuration
org.apache.logging.log4j.core.config.properties.PropertiesConfiguration@557caf28
initialized
DEBUG StatusLogger Starting configuration
org.apache.logging.log4j.core.config.properties.PropertiesConfiguration@557caf28
DEBUG StatusLogger Started configuration
org.apache.logging.log4j.core.config.properties.PropertiesConfiguration@557caf28
OK.
TRACE StatusLogger Stopping
org.apache.logging.log4j.core.config.DefaultConfiguration@5a4aa2f2...
TRACE StatusLogger DefaultConfiguration notified 1 ReliabilityStrategies that
config will be stopped.
TRACE StatusLogger DefaultConfiguration stopping root LoggerConfig.
TRACE StatusLogger DefaultConfiguration notifying ReliabilityStrategies that
appenders will be stopped.
TRACE StatusLogger DefaultConfiguration stopping remaining Appenders.
DEBUG StatusLogger Shutting down OutputStreamManager SYSTEM_OUT.false.false-1
DEBUG StatusLogger Shut down OutputStreamManager SYSTEM_OUT.false.false-1, all
resources released: true
DEBUG StatusLogger Appender DefaultConsole-1 stopped with status true
TRACE StatusLogger DefaultConfiguration stopped 1 remaining Appenders.
TRACE StatusLogger DefaultConfiguration cleaning Appenders from 1 LoggerConfigs.
DEBUG StatusLogger Stopped
org.apache.logging.log4j.core.config.DefaultConfiguration@5a4aa2f2 OK
TRACE StatusLogger Reregistering MBeans after reconfigure.
Selector=org.apache.logging.log4j.core.selector.ClassLoaderContextSelector@76b0ae1b
TRACE StatusLogger Reregistering context (1/1): '135fbaa4'
org.apache.logging.log4j.core.LoggerContext@223191a6
TRACE StatusLogger Unregistering but no MBeans found matching
'org.apache.logging.log4j2:type=135fbaa4'
TRACE StatusLogger Unregistering but no MBeans found matching
'org.apache.logging.log4j2:type=135fbaa4,component=StatusLogger'
TRACE StatusLogger Unregistering but no MBeans found matching
'org.apache.logging.log4j2:type=135fbaa4,component=ContextSelector'
TRACE StatusLogger Unregistering but no MBeans found matching
'org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=*'
TRACE StatusLogger Unregistering but no MBeans found matching
'org.apache.logging.log4j2:type=135fbaa4,component=Appenders,name=*'
TRACE StatusLogger Unregistering but no MBeans found matching
'org.apache.logging.log4j2:type=135fbaa4,component=AsyncAppenders,name=*'
TRACE StatusLogger Unregistering but no MBeans found matching
'org.apache.logging.log4j2:type=135fbaa4,component=AsyncLoggerRingBuffer'
TRACE StatusLogger Unregistering but no MBeans found matching
'org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=*,subtype=RingBuffer'
DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=135fbaa4
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=StatusLogger
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=ContextSelector
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.tez
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=com.amazonaws
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.thrift
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.eclipse.jetty
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=BlockStateChange
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.hadoop.metrics2
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.zookeeper.ClientCnxn
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.zookeeper
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.hadoop.security
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=JPOX
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.hadoop.yarn.server
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.hadoop.conf.Configuration
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.hadoop.yarn
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.zookeeper.ClientCnxnSocketNIO
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.zookeeper.server.ServerCnxn
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.hadoop.hdfs
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.hadoop.hdfs.server
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.hadoop.hive.serde2.lazy
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.zookeeper.ClientCnxnSocket
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.mortbay
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.http
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.hadoop.hive.ql.exec.Operator
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=DataNucleus
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=Datastore
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.hadoop.hive.metastore.ObjectStore
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.hadoop.ipc
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.zookeeper.server.NIOServerCnxn
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Loggers,name=org.apache.calcite.plan.RelOptPlanner
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Appenders,name=console
DEBUG StatusLogger Registering MBean
org.apache.logging.log4j2:type=135fbaa4,component=Appenders,name=DRFA
TRACE StatusLogger Using default SystemClock for timestamps.
DEBUG StatusLogger org.apache.logging.log4j.core.util.SystemClock does not
support precise timestamps.
TRACE StatusLogger Using DummyNanoClock for nanosecond timestamps.
DEBUG StatusLogger Reconfiguration complete for context[name=135fbaa4] at URI
/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/testconf/hive-log4j2.properties
(org.apache.logging.log4j.core.LoggerContext@223191a6) with optional
ClassLoader: null
DEBUG StatusLogger Shutdown hook enabled. Registering a new one.
DEBUG StatusLogger LoggerContext[name=135fbaa4,
org.apache.logging.log4j.core.LoggerContext@223191a6] started OK.
2020-04-15T20:53:05,844 INFO [main] conf.MetastoreConf: Found configuration
file:
file:/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/testconf/hive-site.xml
DEBUG StatusLogger AsyncLogger.ThreadNameStrategy=UNCACHED (user specified
null, default is UNCACHED)
TRACE StatusLogger Using default SystemClock for timestamps.
DEBUG StatusLogger org.apache.logging.log4j.core.util.SystemClock does not
support precise timestamps.
2020-04-15T20:53:06,020 WARN [main] util.NativeCodeLoader: Unable to load
native-hadoop library for your platform... using builtin-java classes where
applicable
2020-04-15T20:53:06,104 INFO [main] conf.MetastoreConf: Found configuration
file:
file:/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/testconf/hivemetastore-site.xml
2020-04-15T20:53:06,105 INFO [main] conf.MetastoreConf: Unable to find config
file: metastore-site.xml
2020-04-15T20:53:06,202 INFO [main] hdfs.MiniDFSCluster: starting cluster:
numNameNodes=1, numDataNodes=1
2020-04-15T20:53:06,819 INFO [main] util.GSet: Computing capacity for map
BlocksMap
2020-04-15T20:53:06,820 INFO [main] util.GSet: VM type = 64-bit
2020-04-15T20:53:06,822 INFO [main] util.GSet: 2.0% max memory 1.8 GB = 36.4 MB
2020-04-15T20:53:06,822 INFO [main] util.GSet: capacity = 2^22 = 4194304
entries
2020-04-15T20:53:06,918 INFO [main] util.GSet: Computing capacity for map
INodeMap
2020-04-15T20:53:06,918 INFO [main] util.GSet: VM type = 64-bit
2020-04-15T20:53:06,918 INFO [main] util.GSet: 1.0% max memory 1.8 GB = 18.2 MB
2020-04-15T20:53:06,918 INFO [main] util.GSet: capacity = 2^21 = 2097152
entries
2020-04-15T20:53:06,940 INFO [main] util.GSet: Computing capacity for map
cachedBlocks
2020-04-15T20:53:06,940 INFO [main] util.GSet: VM type = 64-bit
2020-04-15T20:53:06,941 INFO [main] util.GSet: 0.25% max memory 1.8 GB = 4.6 MB
2020-04-15T20:53:06,941 INFO [main] util.GSet: capacity = 2^19 = 524288
entries
2020-04-15T20:53:06,966 INFO [main] util.GSet: Computing capacity for map
NameNodeRetryCache
2020-04-15T20:53:06,966 INFO [main] util.GSet: VM type = 64-bit
2020-04-15T20:53:06,967 INFO [main] util.GSet: 0.029999999329447746% max
memory 1.8 GB = 559.3 KB
2020-04-15T20:53:06,967 INFO [main] util.GSet: capacity = 2^16 = 65536
entries
2020-04-15T20:53:07,523 INFO [main] beanutils.FluentPropertyBeanIntrospector:
Error when creating PropertyDescriptor for public final void
org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)!
Ignoring this property.
2020-04-15T20:53:07,559 WARN [main] impl.MetricsConfig: Cannot locate
configuration: tried
hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
2020-04-15T20:53:07,586 INFO [main] impl.MetricsSystemImpl: Scheduled Metric
snapshot period at 10 second(s).
2020-04-15T20:53:07,587 INFO [main] impl.MetricsSystemImpl: NameNode metrics
system started
2020-04-15T20:53:07,673 INFO
[org.apache.hadoop.util.JvmPauseMonitor$Monitor@5aaaa446] util.JvmPauseMonitor:
Starting JVM pause monitor
2020-04-15T20:53:07,688 INFO [main] hdfs.DFSUtil: Starting Web-server for hdfs
at: http://localhost:0
2020-04-15T20:53:07,859 INFO [main] server.AuthenticationFilter: Unable to
initialize FileSignerSecretProvider, falling back to use random secrets.
2020-04-15T20:53:07,869 WARN [main] http.HttpRequestLog: Jetty request log can
only be enabled using Log4j
2020-04-15T20:53:07,888 INFO [main] http.HttpServer2: Added global filter
'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2020-04-15T20:53:07,893 INFO [main] http.HttpServer2: Added filter
static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
context hdfs
2020-04-15T20:53:07,894 INFO [main] http.HttpServer2: Added filter
static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
context static
2020-04-15T20:53:07,894 INFO [main] http.HttpServer2: Added filter
static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
context logs
2020-04-15T20:53:07,932 INFO [main] http.HttpServer2: Added filter
'org.apache.hadoop.hdfs.web.AuthFilter'
(class=org.apache.hadoop.hdfs.web.AuthFilter)
2020-04-15T20:53:07,932 INFO [main] http.HttpServer2:
addJerseyResourcePackage:
packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
pathSpec=/webhdfs/v1/*
2020-04-15T20:53:07,944 INFO [main] http.HttpServer2: Jetty bound to port 60465
2020-04-15T20:53:08,213 INFO [main] util.GSet: Computing capacity for map
BlocksMap
2020-04-15T20:53:08,213 INFO [main] util.GSet: VM type = 64-bit
2020-04-15T20:53:08,214 INFO [main] util.GSet: 2.0% max memory 1.8 GB = 36.4 MB
2020-04-15T20:53:08,214 INFO [main] util.GSet: capacity = 2^22 = 4194304
entries
2020-04-15T20:53:08,217 INFO [main] util.GSet: Computing capacity for map
INodeMap
2020-04-15T20:53:08,217 INFO [main] util.GSet: VM type = 64-bit
2020-04-15T20:53:08,217 INFO [main] util.GSet: 1.0% max memory 1.8 GB = 18.2 MB
2020-04-15T20:53:08,217 INFO [main] util.GSet: capacity = 2^21 = 2097152
entries
2020-04-15T20:53:08,219 INFO [main] util.GSet: Computing capacity for map
cachedBlocks
2020-04-15T20:53:08,219 INFO [main] util.GSet: VM type = 64-bit
2020-04-15T20:53:08,220 INFO [main] util.GSet: 0.25% max memory 1.8 GB = 4.6 MB
2020-04-15T20:53:08,220 INFO [main] util.GSet: capacity = 2^19 = 524288
entries
2020-04-15T20:53:08,221 INFO [main] util.GSet: Computing capacity for map
NameNodeRetryCache
2020-04-15T20:53:08,221 INFO [main] util.GSet: VM type = 64-bit
2020-04-15T20:53:08,221 INFO [main] util.GSet: 0.029999999329447746% max
memory 1.8 GB = 559.3 KB
2020-04-15T20:53:08,221 INFO [main] util.GSet: capacity = 2^16 = 65536
entries
2020-04-15T20:53:08,828 INFO [main] hdfs.StateChange: STATE* Leaving safe mode
after 0 secs
2020-04-15T20:53:08,828 INFO [main] hdfs.StateChange: STATE* Network topology
has 0 racks and 0 datanodes
2020-04-15T20:53:08,828 INFO [main] hdfs.StateChange: STATE*
UnderReplicatedBlocks has 0 blocks
2020-04-15T20:53:08,835 INFO [Reconstruction Queue Initializer]
hdfs.StateChange: STATE* Replication Queue initialization scan for invalid,
over- and under-replicated blocks completed in 7 msec
2020-04-15T20:53:08,878 WARN [main] common.MetricsLoggerTask: Metrics logging
will not be async since the logger is not log4j
2020-04-15T20:53:08,896 INFO [main] hdfs.MiniDFSCluster: Starting DataNode 0
with dfs.datanode.data.dir:
[DISK]file:/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/dfs/data/data1,[DISK]file:/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/dfs/data/data2
2020-04-15T20:53:09,075 INFO [main] impl.MetricsSystemImpl: DataNode metrics
system started (again)
2020-04-15T20:53:09,118 INFO [main] server.AuthenticationFilter: Unable to
initialize FileSignerSecretProvider, falling back to use random secrets.
2020-04-15T20:53:09,120 WARN [main] http.HttpRequestLog: Jetty request log can
only be enabled using Log4j
2020-04-15T20:53:09,121 INFO [main] http.HttpServer2: Added global filter
'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2020-04-15T20:53:09,122 INFO [main] http.HttpServer2: Added filter
static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
context datanode
2020-04-15T20:53:09,123 INFO [main] http.HttpServer2: Added filter
static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
context logs
2020-04-15T20:53:09,123 INFO [main] http.HttpServer2: Added filter
static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
context static
2020-04-15T20:53:09,131 INFO [main] http.HttpServer2: Jetty bound to port 60676
2020-04-15T20:53:09,477 INFO
[org.apache.hadoop.util.JvmPauseMonitor$Monitor@e9474f] util.JvmPauseMonitor:
Starting JVM pause monitor
2020-04-15T20:53:09,542 WARN [main] common.MetricsLoggerTask: Metrics logging
will not be async since the logger is not log4j
2020-04-15T20:53:10,405 INFO [IPC Server handler 1 on 52650] hdfs.StateChange:
BLOCK* registerDatanode: from DatanodeRegistration(127.0.0.1:49891,
datanodeUuid=b6489697-d507-4259-af95-984cde3077a6, infoPort=35025,
infoSecurePort=0, ipcPort=49729,
storageInfo=lv=-57;cid=testClusterID;nsid=1271801433;c=1587009186993) storage
b6489697-d507-4259-af95-984cde3077a6
2020-04-15T20:53:10,407 INFO [IPC Server handler 1 on 52650]
net.NetworkTopology: Adding a new node: /default-rack/127.0.0.1:49891
2020-04-15T20:53:10,447 INFO [main] hdfs.MiniDFSCluster: No heartbeat from
DataNode: 127.0.0.1:49891
2020-04-15T20:53:10,447 INFO [main] hdfs.MiniDFSCluster: Waiting for cluster
to become active
2020-04-15T20:53:10,553 INFO [main] hdfs.MiniDFSCluster: Cluster is active
2020-04-15T20:53:10,827 INFO [MetaStoreThread-57893] metastore.AuthFactory:
Using authentication NOSASL with kerberos authentication disabled
2020-04-15T20:53:11,007 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
0: Opening raw store with implementation
class:org.apache.hadoop.hive.metastore.ObjectStore
2020-04-15T20:53:11,062 INFO [MetaStoreThread-57893]
metastore.PersistenceManagerProvider: Configuration
datanucleus.autoStartMechanismMode is not set. Defaulting to 'ignored'
2020-04-15T20:53:11,071 INFO [MetaStoreThread-57893]
metastore.PersistenceManagerProvider: Updating the pmf due to property change
2020-04-15T20:53:11,071 INFO [MetaStoreThread-57893]
metastore.PersistenceManagerProvider: Current pmf properties are uninitialized
2020-04-15T20:53:11,115 INFO [MetaStoreThread-57893] hikari.HikariDataSource:
HikariPool-1 - Starting...
2020-04-15T20:53:12,461 INFO [MetaStoreThread-57893] pool.PoolBase:
HikariPool-1 - Driver does not support get/set network timeout for connections.
(Feature not implemented: No details.)
2020-04-15T20:53:12,467 INFO [MetaStoreThread-57893] hikari.HikariDataSource:
HikariPool-1 - Start completed.
2020-04-15T20:53:13,319 INFO [MetaStoreThread-57893]
metastore.PersistenceManagerProvider: Setting MetaStore object pin classes with
hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2020-04-15T20:53:13,320 INFO [MetaStoreThread-57893] metastore.ObjectStore:
RawStore: org.apache.hadoop.hive.metastore.ObjectStore@5c54cd77, with
PersistenceManager: null will be shutdown
2020-04-15T20:53:13,358 INFO [MetaStoreThread-57893] metastore.ObjectStore:
RawStore: org.apache.hadoop.hive.metastore.ObjectStore@5c54cd77, with
PersistenceManager: org.datanucleus.api.jdo.JDOPersistenceManager@f408f44
created in the thread with id: 133
2020-04-15T20:53:18,269 WARN [MetaStoreThread-57893] metastore.ObjectStore:
Version information not found in metastore. metastore.schema.verification is
not enabled so recording the schema version 4.0.0
2020-04-15T20:53:18,269 WARN [MetaStoreThread-57893] metastore.ObjectStore:
setMetaStoreSchemaVersion called but recording version is disabled: version =
4.0.0, comment = Set by MetaStore [email protected]
2020-04-15T20:53:18,269 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
Created RawStore: org.apache.hadoop.hive.metastore.ObjectStore@5c54cd77 from
thread id: 133
2020-04-15T20:53:18,623 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
Added admin role in metastore
2020-04-15T20:53:18,626 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
Added public role in metastore
2020-04-15T20:53:18,836 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
Added hive_admin_user to admin role
2020-04-15T20:53:18,848 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
HMS server filtering is disabled by configuration
2020-04-15T20:53:19,103 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
Starting DB backed MetaStore Server with SetUGI enabled
2020-04-15T20:53:19,103 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
Binding host localhost for metastore server
2020-04-15T20:53:19,110 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
Started the new metaserver on port [57893]...
2020-04-15T20:53:19,110 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
Options.minWorkerThreads = 200
2020-04-15T20:53:19,110 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
Options.maxWorkerThreads = 1000
2020-04-15T20:53:19,110 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
TCP keepalive = true
2020-04-15T20:53:19,110 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
Enable SSL = false
2020-04-15T20:53:19,110 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
Compaction HMS parameters:
2020-04-15T20:53:19,110 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
metastore.compactor.initiator.on = true
2020-04-15T20:53:19,111 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
metastore.compactor.worker.threads = 1
2020-04-15T20:53:19,111 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
hive.metastore.runworker.in = metastore
2020-04-15T20:53:19,111 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
metastore.compactor.history.reaper.interval = 2
2020-04-15T20:53:19,111 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
metastore.compactor.history.retention.attempted = 2
2020-04-15T20:53:19,111 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
metastore.compactor.history.retention.failed = 3
2020-04-15T20:53:19,111 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
metastore.compactor.history.retention.succeeded = 3
2020-04-15T20:53:19,111 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
metastore.compactor.initiator.failed.compacts.threshold = 2
2020-04-15T20:53:19,111 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
metastore.compactor.enable.stats.compression
2020-04-15T20:53:19,111 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
Direct SQL optimization = true
2020-04-15T20:53:19,111 INFO [MetaStoreThread-57893] metastore.HiveMetaStore:
metastore.housekeeping.leader.hostname is empty. Start all the housekeeping
threads.
2020-04-15T20:53:19,774 INFO [pool-23-thread-1] HiveMetaStore.audit:
ugi=hiveptest ip=127.0.0.1 cmd=Done cleaning up thread local RawStore
2020-04-15T20:53:19,782 INFO [main] metastore.MetaStoreTestUtils: MetaStore
warehouse root dir
(pfile:/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/warehouse/57893)
is created
2020-04-15T20:53:19,782 INFO [main] metastore.MetaStoreTestUtils: MetaStore
Thrift Server started on port: 57893 with warehouse dir:
pfile:/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/warehouse/57893
with jdbcUrl:
jdbc:derby:;databaseName=/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/junit_metastore_db_57893;create=true
2020-04-15T20:53:19,793 INFO [main] metastore.HiveMetaStoreClient: HMS client
filtering is enabled.
2020-04-15T20:53:19,795 INFO [main] metastore.HiveMetaStoreClient: Trying to
connect to metastore with URI (thrift://localhost:57893)
2020-04-15T20:53:19,813 INFO [main] metastore.HiveMetaStoreClient: Opened a
connection to metastore, URI (thrift://localhost:57893) current connections: 1
2020-04-15T20:53:19,814 INFO [main] metastore.HiveMetaStoreClient: Connected
to metastore.
2020-04-15T20:53:20,122 INFO [Metastore threads starter thread]
metastore.HiveMetaStore: Starting metastore thread of type
org.apache.hadoop.hive.ql.txn.compactor.Initiator
2020-04-15T20:53:20,388 INFO [Metastore threads starter thread] conf.HiveConf:
Found configuration file
file:/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/testconf/hive-site.xml
2020-04-15T20:53:20,807 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.size does not exist
2020-04-15T20:53:20,807 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.dummyparam.test.server.specific.config.override does not
exist
2020-04-15T20:53:20,808 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.metastore.metadb.dir does not exist
2020-04-15T20:53:20,808 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.alloc.min does not exist
2020-04-15T20:53:20,808 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.dummyparam.test.server.specific.config.hivesite does not
exist
2020-04-15T20:53:20,808 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.alloc.max does not exist
2020-04-15T20:53:20,808 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.metastore.runworker.in does not exist
2020-04-15T20:53:20,808 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.version does not exist
2020-04-15T20:53:20,809 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.root does not exist
2020-04-15T20:53:20,809 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.dummyparam.test.server.specific.config.metastoresite does
not exist
2020-04-15T20:53:20,810 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.arena.size does not exist
2020-04-15T20:53:20,810 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.stats.key.prefix.reserve.length does not exist
2020-04-15T20:53:20,810 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.test.console.log.level does not exist
2020-04-15T20:53:20,840 INFO [Metastore threads starter thread]
txn.TxnHandler: Hacking in canned values for transaction manager
2020-04-15T20:53:23,565 INFO [Metastore threads starter thread] txn.TxnDbUtil:
TBLS table already exist, ignoring
2020-04-15T20:53:23,605 INFO [Metastore threads starter thread] txn.TxnDbUtil:
TBLS table already exist, ignoring
2020-04-15T20:53:23,621 INFO [Metastore threads starter thread] txn.TxnDbUtil:
PARTITIONS table already exist, ignoring
2020-04-15T20:53:23,637 INFO [Metastore threads starter thread] txn.TxnDbUtil:
TABLE_PARAMS table already exist, ignoring
2020-04-15T20:53:23,653 INFO [Metastore threads starter thread] txn.TxnDbUtil:
PARTITION_PARAMS table already exist, ignoring
2020-04-15T20:53:23,678 INFO [Metastore threads starter thread] txn.TxnDbUtil:
SEQUENCE_TABLE table already exist, ignoring
2020-04-15T20:53:23,695 INFO [Metastore threads starter thread] txn.TxnDbUtil:
NOTIFICATION_SEQUENCE table already exist, ignoring
2020-04-15T20:53:23,710 INFO [Metastore threads starter thread] txn.TxnDbUtil:
NOTIFICATION_LOG table already exist, ignoring
2020-04-15T20:53:23,800 INFO [Metastore threads starter thread]
hikari.HikariDataSource: HikariPool-2 - Starting...
2020-04-15T20:53:23,805 INFO [Metastore threads starter thread] pool.PoolBase:
HikariPool-2 - Driver does not support get/set network timeout for connections.
(Feature not implemented: No details.)
2020-04-15T20:53:23,809 INFO [Metastore threads starter thread]
hikari.HikariDataSource: HikariPool-2 - Start completed.
2020-04-15T20:53:23,814 INFO [Metastore threads starter thread]
hikari.HikariDataSource: HikariPool-3 - Starting...
2020-04-15T20:53:23,821 INFO [Metastore threads starter thread] pool.PoolBase:
HikariPool-3 - Driver does not support get/set network timeout for connections.
(Feature not implemented: No details.)
2020-04-15T20:53:23,822 INFO [Metastore threads starter thread]
hikari.HikariDataSource: HikariPool-3 - Start completed.
2020-04-15T20:53:23,823 INFO [Metastore threads starter thread]
metastore.HiveMetaStore: This HMS instance will act as a Compactor Initiator.
2020-04-15T20:53:23,826 INFO [Metastore threads starter thread]
metastore.HiveMetaStore: Starting metastore thread of type
org.apache.hadoop.hive.ql.txn.compactor.Cleaner
2020-04-15T20:53:23,924 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.size does not exist
2020-04-15T20:53:23,925 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.dummyparam.test.server.specific.config.override does not
exist
2020-04-15T20:53:23,928 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.metastore.metadb.dir does not exist
2020-04-15T20:53:23,928 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.alloc.min does not exist
2020-04-15T20:53:23,932 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.dummyparam.test.server.specific.config.hivesite does not
exist
2020-04-15T20:53:23,932 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.alloc.max does not exist
2020-04-15T20:53:23,932 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.metastore.runworker.in does not exist
2020-04-15T20:53:23,933 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.version does not exist
2020-04-15T20:53:23,933 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.root does not exist
2020-04-15T20:53:23,934 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.dummyparam.test.server.specific.config.metastoresite does
not exist
2020-04-15T20:53:23,934 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.arena.size does not exist
2020-04-15T20:53:23,934 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.stats.key.prefix.reserve.length does not exist
2020-04-15T20:53:23,941 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.test.console.log.level does not exist
2020-04-15T20:53:23,949 INFO [Metastore threads starter thread]
txn.TxnHandler: Hacking in canned values for transaction manager
2020-04-15T20:53:24,029 INFO [Metastore threads starter thread] txn.TxnDbUtil:
Txn tables already exist, returning
2020-04-15T20:53:24,041 INFO [Metastore threads starter thread]
metastore.HiveMetaStore: This HMS instance will act as a Compactor Cleaner.
2020-04-15T20:53:24,050 INFO [Metastore threads starter thread]
metastore.HiveMetaStore: Scheduling for
org.apache.hadoop.hive.metastore.RemoteMetastoreTaskThreadTestImpl2 service.
2020-04-15T20:53:24,071 INFO [Metastore threads starter thread]
metastore.HiveMetaStore: Scheduling for
org.apache.hadoop.hive.metastore.RemoteMetastoreTaskThreadTestImpl1 service.
2020-04-15T20:53:24,091 INFO [Metastore threads starter thread]
metastore.HiveMetaStore: Starting metastore thread of type
org.apache.hadoop.hive.ql.stats.StatsUpdaterThread
2020-04-15T20:53:24,092 INFO [Metastore threads starter thread]
txn.TxnHandler: Hacking in canned values for transaction manager
2020-04-15T20:53:24,127 INFO [Thread-101] txn.CompactionTxnHandler: Removed 0
rows from TXN_TO_WRITE_ID with Txn Low-Water-Mark: 1
2020-04-15T20:53:24,144 INFO [Metastore threads starter thread] txn.TxnDbUtil:
Txn tables already exist, returning
2020-04-15T20:53:24,144 INFO [Metastore threads starter thread]
metastore.PersistenceManagerProvider: Configuration
datanucleus.autoStartMechanismMode is not set. Defaulting to 'ignored'
2020-04-15T20:53:24,149 INFO [Metastore threads starter thread]
metastore.ObjectStore: RawStore:
org.apache.hadoop.hive.metastore.ObjectStore@3c2c9210, with PersistenceManager:
null will be shutdown
2020-04-15T20:53:24,150 INFO [Metastore threads starter thread]
metastore.ObjectStore: RawStore:
org.apache.hadoop.hive.metastore.ObjectStore@3c2c9210, with PersistenceManager:
org.datanucleus.api.jdo.JDOPersistenceManager@6f2db3de created in the thread
with id: 142
2020-04-15T20:53:24,272 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.size does not exist
2020-04-15T20:53:24,273 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.dummyparam.test.server.specific.config.override does not
exist
2020-04-15T20:53:24,273 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.metastore.metadb.dir does not exist
2020-04-15T20:53:24,273 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.alloc.min does not exist
2020-04-15T20:53:24,273 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.dummyparam.test.server.specific.config.hivesite does not
exist
2020-04-15T20:53:24,273 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.alloc.max does not exist
2020-04-15T20:53:24,274 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.metastore.runworker.in does not exist
2020-04-15T20:53:24,274 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.version does not exist
2020-04-15T20:53:24,274 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.root does not exist
2020-04-15T20:53:24,274 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.dummyparam.test.server.specific.config.metastoresite does
not exist
2020-04-15T20:53:24,275 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.arena.size does not exist
2020-04-15T20:53:24,275 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.stats.key.prefix.reserve.length does not exist
2020-04-15T20:53:24,275 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.test.console.log.level does not exist
2020-04-15T20:53:24,276 INFO [Thread-113] stats.StatsUpdaterThread: Stats
updater thread started
2020-04-15T20:53:24,276 INFO [Thread-113] stats.StatsUpdaterThread: Stats
updater worker thread Stats updater worker 0 started
2020-04-15T20:53:24,279 INFO [Metastore threads starter thread]
metastore.HiveMetaStore: Scheduling for
org.apache.hadoop.hive.metastore.MetastoreTaskThreadAlwaysTestImpl service with
frequency 7000ms.
2020-04-15T20:53:24,281 INFO [Metastore threads starter thread]
metastore.HiveMetaStore: Starting metastore thread of type
org.apache.hadoop.hive.ql.txn.compactor.Worker
Hive Session ID = 3dab4a9e-00e7-46ed-bb93-b27579ba1d93
2020-04-15T20:53:24,323 INFO [Stats updater worker 0] SessionState: Hive
Session ID = 3dab4a9e-00e7-46ed-bb93-b27579ba1d93
2020-04-15T20:53:24,332 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.size does not exist
2020-04-15T20:53:24,332 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.dummyparam.test.server.specific.config.override does not
exist
2020-04-15T20:53:24,332 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.metastore.metadb.dir does not exist
2020-04-15T20:53:24,332 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.alloc.min does not exist
2020-04-15T20:53:24,332 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.dummyparam.test.server.specific.config.hivesite does not
exist
2020-04-15T20:53:24,332 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.alloc.max does not exist
2020-04-15T20:53:24,332 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.metastore.runworker.in does not exist
2020-04-15T20:53:24,333 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.version does not exist
2020-04-15T20:53:24,333 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.root does not exist
2020-04-15T20:53:24,333 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.dummyparam.test.server.specific.config.metastoresite does
not exist
2020-04-15T20:53:24,334 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.llap.io.cache.orc.arena.size does not exist
2020-04-15T20:53:24,334 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.stats.key.prefix.reserve.length does not exist
2020-04-15T20:53:24,334 WARN [Metastore threads starter thread] conf.HiveConf:
HiveConf of name hive.test.console.log.level does not exist
2020-04-15T20:53:24,352 INFO [Metastore threads starter thread]
metastore.HiveMetaStoreClient: HMS client filtering is enabled.
2020-04-15T20:53:24,352 INFO [Metastore threads starter thread]
metastore.HiveMetaStoreClient: Trying to connect to metastore with URI
(thrift://localhost:57893)
2020-04-15T20:53:24,352 INFO [Metastore threads starter thread]
metastore.HiveMetaStoreClient: Opened a connection to metastore, URI
(thrift://localhost:57893) current connections: 2
2020-04-15T20:53:24,353 INFO [Metastore threads starter thread]
metastore.HiveMetaStoreClient: Connected to metastore.
2020-04-15T20:53:24,353 INFO [Metastore threads starter thread]
metastore.RetryingMetaStoreClient: RetryingMetaStoreClient proxy=class
org.apache.hadoop.hive.metastore.HiveMetaStoreClient ugi=hiveptest
(auth:SIMPLE) retries=1 delay=1 lifetime=0
ivysettings.xml file not found in HIVE_HOME or
HIVE_CONF_DIR,/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/conf/ivysettings.xml
will be used
2020-04-15T20:53:24,366 INFO [Stats updater worker 0] DependencyResolver:
ivysettings.xml file not found in HIVE_HOME or
HIVE_CONF_DIR,/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/conf/ivysettings.xml
will be used
2020-04-15T20:53:24,393 INFO [Metastore threads starter thread]
metastore.HiveMetaStore: This HMS instance will act as a Compactor Worker with
1 threads
2020-04-15T20:53:24,393 INFO
[hive-ptest-slaves-b43.c.gcp-hive-upstream.internal-171] compactor.Worker:
Starting Worker thread
2020-04-15T20:53:24,397 INFO [cmclearer-1] metastore.ReplChangeManager:
CMClearer started
2020-04-15T20:53:24,402 INFO [pool-23-thread-3] txn.TxnHandler: Hacking in
canned values for transaction manager
2020-04-15T20:53:24,446 INFO [pool-23-thread-3] txn.TxnDbUtil: Txn tables
already exist, returning
2020-04-15T20:53:29,820 INFO [main]
metastore.MetastoreHousekeepingLeaderTestBase:
Name: IPC Server idle connection scanner for port 49729 State: TIMED_WAITING
Class name: java.util.TimerThread
Name: qtp1353756631-85 State: TIMED_WAITING Class name: java.lang.Thread
Name: pool-5-thread-1 State: TIMED_WAITING Class name: java.lang.Thread
Name: pool-23-thread-3 State: RUNNABLE Class name: java.lang.Thread
Name: qtp1353756631-82 State: RUNNABLE Class name: java.lang.Thread
Name: IPC Server handler 2 on 52650 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: org.apache.hadoop.hdfs.server.datanode.DataXceiverServer@f0a66bd State:
RUNNABLE Class name: org.apache.hadoop.util.Daemon
Name: StorageLocationChecker thread 0 State: TIMED_WAITING Class name:
java.lang.Thread
Name:
qtp527939020-29-acceptor-0@5d168fc5-ServerConnector@5484117b{HTTP/1.1,[http/1.1]}{localhost:60465}
State: RUNNABLE Class name: java.lang.Thread
Name: IPC Server handler 7 on 49729 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: main State: RUNNABLE Class name: java.lang.Thread
Name: qtp1353756631-83 State: RUNNABLE Class name: java.lang.Thread
Name: pool-25-thread-3 State: WAITING Class name: java.lang.Thread
Name:
VolumeScannerThread(/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/dfs/data/data2)
State: TIMED_WAITING Class name:
org.apache.hadoop.hdfs.server.datanode.VolumeScanner
Name: IPC Parameter Sending Thread #0 State: TIMED_WAITING Class name:
java.lang.Thread
Name: org.apache.hadoop.util.JvmPauseMonitor$Monitor@e9474f State:
TIMED_WAITING Class name: org.apache.hadoop.util.Daemon
Name: qtp1353756631-86 State: TIMED_WAITING Class name: java.lang.Thread
Name: qtp527939020-30 State: TIMED_WAITING Class name: java.lang.Thread
Name: qtp1353756631-87 State: TIMED_WAITING Class name: java.lang.Thread
Name: org.eclipse.jetty.server.session.HashSessionManager@6cbe68e9Timer State:
TIMED_WAITING Class name: java.lang.Thread
Name: qtp1353756631-80 State: RUNNABLE Class name: java.lang.Thread
Name: org.apache.hadoop.util.JvmPauseMonitor$Monitor@5aaaa446 State:
TIMED_WAITING Class name: org.apache.hadoop.util.Daemon
Name: IPC Server handler 5 on 49729 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: process reaper State: TIMED_WAITING Class name: java.lang.Thread
Name: pool-12-thread-1 State: TIMED_WAITING Class name: java.lang.Thread
Name: org.eclipse.jetty.server.session.HashSessionManager@49e2b3c5Timer State:
TIMED_WAITING Class name: java.lang.Thread
Name: org.apache.hadoop.hdfs.server.namenode.LeaseManager$Monitor@e95595b
State: TIMED_WAITING Class name: org.apache.hadoop.util.Daemon
Name:
org.apache.hadoop.hdfs.server.blockmanagement.PendingReconstructionBlocks$PendingReconstructionMonitor@39ee94de
State: TIMED_WAITING Class name: org.apache.hadoop.util.Daemon
Name: pool-11-thread-1 State: TIMED_WAITING Class name: java.lang.Thread
Name: org.eclipse.jetty.server.session.HashSessionManager@3f0d6038Timer State:
TIMED_WAITING Class name: java.lang.Thread
Name: derby.rawStoreDaemon State: TIMED_WAITING Class name: java.lang.Thread
Name: hive-ptest-slaves-b43.c.gcp-hive-upstream.internal-171 State:
TIMED_WAITING Class name: org.apache.hadoop.hive.ql.txn.compactor.Worker
Name: IPC Server handler 4 on 52650 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: MetaStoreThread-57893 State: RUNNABLE Class name: java.lang.Thread
Name: RedundancyMonitor State: TIMED_WAITING Class name:
org.apache.hadoop.util.Daemon
Name: Thread-113 State: TIMED_WAITING Class name:
org.apache.hadoop.hive.ql.stats.StatsUpdaterThread
Name: org.apache.hadoop.hive.metastore.metrics.JvmPauseMonitor$Monitor@28708229
State: TIMED_WAITING Class name: org.apache.hadoop.util.Daemon
Name: qtp527939020-26 State: RUNNABLE Class name: java.lang.Thread
Name: Timer for 'NameNode' metrics system State: TIMED_WAITING Class name:
java.util.TimerThread
Name:
qtp1353756631-84-acceptor-0@373cb4b5-ServerConnector@1200458e{HTTP/1.1,[http/1.1]}{localhost:60676}
State: RUNNABLE Class name: java.lang.Thread
Name: Socket Reader #1 for port 49729 State: RUNNABLE Class name:
org.apache.hadoop.ipc.Server.Listener.Reader
Name: qtp527939020-32 State: TIMED_WAITING Class name: java.lang.Thread
Name: pool-3-thread-1 State: TIMED_WAITING Class name: java.lang.Thread
Name: pool-23-thread-1 State: WAITING Class name: java.lang.Thread
Name: IPC Server handler 4 on 49729 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: IPC Server handler 3 on 49729 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: Socket Reader #1 for port 52650 State: RUNNABLE Class name:
org.apache.hadoop.ipc.Server.Listener.Reader
Name: surefire-forkedjvm-command-thread State: RUNNABLE Class name:
java.lang.Thread
Name: Reference Handler State: WAITING Class name:
java.lang.ref.Reference.ReferenceHandler
Name: IPC Server handler 5 on 52650 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: DatanodeAdminMonitor-0 State: TIMED_WAITING Class name: java.lang.Thread
Name: BP-638667272-10.128.0.72-1587009186993 heartbeating to
localhost/127.0.0.1:52650 State: TIMED_WAITING Class name: java.lang.Thread
Name:
org.apache.hadoop.hdfs.server.namenode.FSNamesystem$NameNodeResourceMonitor@796f632b
State: TIMED_WAITING Class name: org.apache.hadoop.util.Daemon
Name: StorageLocationChecker thread 1 State: TIMED_WAITING Class name:
java.lang.Thread
Name: IPC Server handler 0 on 52650 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name:
org.apache.hadoop.hdfs.server.blockmanagement.HeartbeatManager$Monitor@2d3ef181
State: TIMED_WAITING Class name: org.apache.hadoop.util.Daemon
Name: HikariPool-2 housekeeper State: TIMED_WAITING Class name: java.lang.Thread
Name: IPC Server Responder State: RUNNABLE Class name:
org.apache.hadoop.ipc.Server.Responder
Name: IPC Server Responder State: RUNNABLE Class name:
org.apache.hadoop.ipc.Server.Responder
Name: IPC Client (999864329) connection to localhost/127.0.0.1:52650 from
hiveptest State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Client.Connection
Name:
refreshUsed-/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/dfs/data/data1/current/BP-638667272-10.128.0.72-1587009186993
State: TIMED_WAITING Class name: java.lang.Thread
Name: qtp527939020-25 State: RUNNABLE Class name: java.lang.Thread
Name: Stats updater worker 0 State: WAITING Class name: java.lang.Thread
Name: Timer-0 State: WAITING Class name: java.util.TimerThread
Name:
refreshUsed-/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/dfs/data/data2/current/BP-638667272-10.128.0.72-1587009186993
State: TIMED_WAITING Class name: java.lang.Thread
Name: org.eclipse.jetty.server.session.HashSessionManager@30db5536Timer State:
TIMED_WAITING Class name: java.lang.Thread
Name: org.eclipse.jetty.server.session.HashSessionManager@17143b3bTimer State:
TIMED_WAITING Class name: java.lang.Thread
Name: qtp1353756631-81 State: RUNNABLE Class name: java.lang.Thread
Name: Finalizer State: WAITING Class name:
java.lang.ref.Finalizer.FinalizerThread
Name: qtp527939020-27 State: RUNNABLE Class name: java.lang.Thread
Name: IPC Server handler 0 on 49729 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: org.apache.hadoop.fs.FileSystem$Statistics$StatisticsDataReferenceCleaner
State: WAITING Class name: java.lang.Thread
Name: IPC Server listener on 52650 State: RUNNABLE Class name:
org.apache.hadoop.ipc.Server.Listener
Name:
org.apache.hadoop.hdfs.server.namenode.FSNamesystem$LazyPersistFileScrubber@40c2ce52
State: TIMED_WAITING Class name: org.apache.hadoop.util.Daemon
Name: IPC Server handler 2 on 49729 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: IPC Server handler 6 on 49729 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: process reaper State: TIMED_WAITING Class name: java.lang.Thread
Name: IPC Server handler 1 on 52650 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: IPC Server handler 8 on 49729 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: Thread-105 State: TIMED_WAITING Class name:
org.apache.hadoop.hive.ql.txn.compactor.Cleaner
Name: qtp527939020-28 State: RUNNABLE Class name: java.lang.Thread
Name: pool-25-thread-2 State: WAITING Class name: java.lang.Thread
Name: Signal Dispatcher State: RUNNABLE Class name: java.lang.Thread
Name: HikariPool-1 housekeeper State: TIMED_WAITING Class name: java.lang.Thread
Name: surefire-forkedjvm-ping-30s State: TIMED_WAITING Class name:
java.lang.Thread
Name: IPC Server handler 1 on 49729 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: Block report processor State: WAITING Class name:
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.BlockReportProcessingThread
Name: pool-9-thread-1 State: TIMED_WAITING Class name: java.lang.Thread
Name: FSEditLogAsync State: WAITING Class name: java.lang.Thread
Name: CacheReplicationMonitor(256006028) State: TIMED_WAITING Class name:
org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor
Name: cmclearer-1 State: TIMED_WAITING Class name: java.lang.Thread
Name:
org.apache.hadoop.hdfs.server.namenode.FSNamesystem$NameNodeEditLogRoller@59b32539
State: TIMED_WAITING Class name: org.apache.hadoop.util.Daemon
Name: IPC Server listener on 49729 State: RUNNABLE Class name:
org.apache.hadoop.ipc.Server.Listener
Name: nioEventLoopGroup-2-1 State: RUNNABLE Class name:
io.netty.util.concurrent.FastThreadLocalThread
Name: IPC Server handler 9 on 52650 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name:
VolumeScannerThread(/home/hiveptest/35.224.55.83-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/dfs/data/data1)
State: TIMED_WAITING Class name:
org.apache.hadoop.hdfs.server.datanode.VolumeScanner
Name: IPC Server handler 7 on 52650 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: IPC Server handler 9 on 49729 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: IPC Server idle connection scanner for port 52650 State: TIMED_WAITING
Class name: java.util.TimerThread
Name: pool-23-thread-2 State: RUNNABLE Class name: java.lang.Thread
Name: java.util.concurrent.ThreadPoolExecutor$Worker@73c234d1[State = -1, empty
queue] State: TIMED_WAITING Class name: org.apache.hadoop.util.Daemon
Name: DataNode DiskChecker thread 0 State: TIMED_WAITING Class name:
java.lang.Thread
Name: IPC Server handler 3 on 52650 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: HikariPool-3 housekeeper State: TIMED_WAITING Class name: java.lang.Thread
Name: IPC Client (999864329) connection to localhost/127.0.0.1:52650 from
hiveptest State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Client.Connection
Name: StorageInfoMonitor State: TIMED_WAITING Class name:
org.apache.hadoop.util.Daemon
Name: qtp527939020-31 State: TIMED_WAITING Class name: java.lang.Thread
Name: pool-25-thread-1 State: TIMED_WAITING Class name: java.lang.Thread
Name: IPC Server handler 8 on 52650 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: Thread-101 State: TIMED_WAITING Class name:
org.apache.hadoop.hive.ql.txn.compactor.Initiator
Name: IPC Server handler 6 on 52650 State: TIMED_WAITING Class name:
org.apache.hadoop.ipc.Server.Handler
Name: org.eclipse.jetty.server.session.HashSessionManager@3204e238Timer State:
TIMED_WAITING Class name: java.lang.Thread
2020-04-15T20:53:29,823 INFO [main]
metastore.TestMetastoreHousekeepingLeaderEmptyConfig: Found thread with name
cmclearer-
{code}
)
> Ignore flaky test testHouseKeepingThreadExistence in
> TestMetastoreHousekeepingLeaderEmptyConfig and TestMetastoreHousekeepingLeader
> -----------------------------------------------------------------------------------------------------------------------------------
>
> Key: HIVE-23221
> URL: https://issues.apache.org/jira/browse/HIVE-23221
> Project: Hive
> Issue Type: Bug
> Reporter: Miklos Gergely
> Assignee: Miklos Gergely
> Priority: Major
> Attachments: HIVE-23221.01.patch
>
>
> These tests fail nondeterministically, producing error messages such as the
> one attached.
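> As the issue title suggests, the immediate remedy is to ignore the flaky
> check. A minimal sketch of what that could look like with JUnit 4's @Ignore
> annotation (illustrative only, using a stand-in class name; the actual
> change is in the attached HIVE-23221.01.patch):
> {code:java}
> import org.junit.Ignore;
> import org.junit.Test;
>
> // Hypothetical sketch only -- not the attached patch.
> public class FlakyHousekeepingTestSketch {
>
>   // JUnit 4's @Ignore makes Surefire skip the method, so the
>   // nondeterministic thread-name assertion can no longer fail a run.
>   @Ignore("Flaky thread-existence check, see HIVE-23221")
>   @Test
>   public void testHouseKeepingThreadExistence() throws Exception {
>     // the existing assertions on housekeeping thread names would remain here
>   }
> }
> {code}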
--
This message was sent by Atlassian Jira
(v8.3.4#803005)