[ https://issues.apache.org/jira/browse/STORM-2841?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16360448#comment-16360448 ]

Jungtaek Lim commented on STORM-2841:
-------------------------------------

This is actually super odd: with a partial mock (spy), the real method shouldn't 
be invoked for the stubbed call, yet it is being called.

We could avoid the NPE by calling prepare() earlier, but if the error means the 
mocking isn't taking effect in this case, the test will fail anyway.

Maybe we can just try replacing the partial mock with a subclass that overrides 
the method; once we don't need to mock that method, there's no reason to spy on 
the class at all.
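A minimal sketch of the suggested direction (not Storm's actual classes; the names here are hypothetical stand-ins): instead of a Mockito spy with doThrow(), which under some stubbing styles can still hit the real method and its unprepared state, a test subclass overrides the failing method directly, so no real state is ever touched.

```java
// Hypothetical stand-in for a bolt whose flush depends on state set up in prepare().
class FlushingBolt {
    void flushAllWriters() {
        // In the real bolt this would dereference state created by prepare(),
        // which is what produces the NPE when invoked unprepared.
        throw new NullPointerException("writers not initialized");
    }
}

// Test subclass: simulate a flush failure without spying or touching real state.
class FailingFlushBolt extends FlushingBolt {
    @Override
    void flushAllWriters() {
        throw new IllegalStateException("flush failed");
    }
}

public class Main {
    public static void main(String[] args) {
        FlushingBolt bolt = new FailingFlushBolt();
        try {
            bolt.flushAllWriters();
        } catch (IllegalStateException e) {
            // The override ran instead of the real method, so we get the
            // simulated failure rather than the NPE.
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

The subclass guarantees the real method body is never executed, which a partially-stubbed spy does not always guarantee.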

> testNoAcksIfFlushFails UT fails with NullPointerException
> ---------------------------------------------------------
>
>                 Key: STORM-2841
>                 URL: https://issues.apache.org/jira/browse/STORM-2841
>             Project: Apache Storm
>          Issue Type: Test
>            Reporter: Yesha Vora
>            Priority: Major
>
> testNoAcksIfFlushFails UT fails with NPE
> {code}
> java.lang.NullPointerException: null
>       at 
> org.apache.storm.hive.bolt.HiveBolt.flushAllWriters(HiveBolt.java:207)
>       at 
> org.apache.storm.hive.bolt.TestHiveBolt.testNoAcksIfFlushFails(TestHiveBolt.java:322){code}
> {code}
> Standard Output
> 60488 [main] INFO  h.metastore - Mestastore configuration 
> hive.metastore.filter.hook changed from 
> org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
>  to org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl
> 60489 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: Shutting down the object 
> store...
> 60489 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=Shutting down the object store...   
> 60489 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: Metastore shutdown complete.
> 60489 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=Metastore shutdown complete.        
> 60497 [main] INFO  o.a.h.h.q.s.SessionState - Created local directory: 
> /tmp/75acc3af-f90e-4ab2-888c-a8ad29383128_resources
> 60499 [main] INFO  o.a.h.h.q.s.SessionState - Created HDFS directory: 
> /tmp/hive/nobody/75acc3af-f90e-4ab2-888c-a8ad29383128
> 60501 [main] INFO  o.a.h.h.q.s.SessionState - Created local directory: 
> /tmp/nobody/75acc3af-f90e-4ab2-888c-a8ad29383128
> 60503 [main] INFO  o.a.h.h.q.s.SessionState - Created HDFS directory: 
> /tmp/hive/nobody/75acc3af-f90e-4ab2-888c-a8ad29383128/_tmp_space.db
> 60793 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: get_table_names_by_filter: db 
> = testdb, filter = 
> 60793 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=get_table_names_by_filter: db = testdb, filter =    
> 60847 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: Opening raw store with 
> implementation class:org.apache.hadoop.hive.metastore.ObjectStore
> 60848 [main] INFO  o.a.h.h.m.ObjectStore - ObjectStore, initialize called
> 60864 [main] INFO  D.Persistence - Property 
> hive.metastore.integral.jdo.pushdown unknown - will be ignored
> 60864 [main] INFO  D.Persistence - Property datanucleus.cache.level2 unknown 
> - will be ignored
> 61370 [main] INFO  o.a.h.h.m.ObjectStore - Setting MetaStore object pin 
> classes with 
> hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
> 61711 [main] INFO  o.a.h.h.m.MetaStoreDirectSql - Using direct SQL, 
> underlying DB is DERBY
> 61711 [main] INFO  o.a.h.h.m.ObjectStore - Initialized ObjectStore
> 61716 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: get_table : db=testdb 
> tbl=test_table
> 61716 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=get_table : db=testdb tbl=test_table        
> 61738 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: drop_table : db=testdb 
> tbl=test_table
> 61738 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=drop_table : db=testdb tbl=test_table       
> 62053 [main] INFO  h.m.hivemetastoressimpl - deleting  
> raw:/tmp/junit2860976527054408251/testdb.db/test_table
> 62062 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: get_database: testdb
> 62063 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=get_database: testdb        
> 62068 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: drop_database: testdb
> 62068 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=drop_database: testdb       
> 62069 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: get_all_tables: db=testdb
> 62069 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=get_all_tables: db=testdb   
> 62073 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: get_functions: db=testdb pat=*
> 62073 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=get_functions: db=testdb pat=*      
> 62103 [main] INFO  o.a.h.h.m.ObjectStore - Dropping database testdb along 
> with all tables
> 62141 [main] INFO  h.m.hivemetastoressimpl - deleting  
> raw:/tmp/junit2860976527054408251/testdb.db
> 62148 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=Driver.run 
> from=org.apache.hadoop.hive.ql.Driver>
> 62148 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=compile 
> from=org.apache.hadoop.hive.ql.Driver>
> 62149 [main] INFO  o.a.h.h.q.Driver - We are setting the hadoop caller 
> context from  to nobody_20171128150454_9a730ff0-d279-4037-a1c9-4b3c28ff0a90
> 62149 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=parse 
> from=org.apache.hadoop.hive.ql.Driver>
> 62149 [main] INFO  h.q.p.ParseDriver - Parsing command: create database IF 
> NOT EXISTS testdb location 'raw:///tmp/junit8935893841909704896/testdb.db'
> 62149 [main] INFO  h.q.p.ParseDriver - Parse Completed
> 62149 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=parse 
> start=1511881494579 end=1511881494579 duration=0 
> from=org.apache.hadoop.hive.ql.Driver>
> 62149 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=semanticAnalyze 
> from=org.apache.hadoop.hive.ql.Driver>
> 62151 [main] INFO  o.a.h.h.q.Driver - Semantic Analysis Completed
> 62151 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=semanticAnalyze 
> start=1511881494579 end=1511881494581 duration=2 
> from=org.apache.hadoop.hive.ql.Driver>
> 62151 [main] INFO  o.a.h.h.q.Driver - Returning Hive schema: 
> Schema(fieldSchemas:null, properties:null)
> 62151 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=compile 
> start=1511881494578 end=1511881494581 duration=3 
> from=org.apache.hadoop.hive.ql.Driver>
> 62151 [main] INFO  o.a.h.h.q.Driver - We are resetting the hadoop caller 
> context to 
> 62151 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG 
> method=acquireReadWriteLocks from=org.apache.hadoop.hive.ql.Driver>
> 62151 [main] INFO  o.a.h.h.q.l.DbTxnManager - Setting lock request 
> transaction to txnid:0 for 
> queryId=nobody_20171128150454_9a730ff0-d279-4037-a1c9-4b3c28ff0a90
> 62151 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG 
> method=acquireReadWriteLocks start=1511881494581 end=1511881494581 duration=0 
> from=org.apache.hadoop.hive.ql.Driver>
> 62151 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=Driver.execute 
> from=org.apache.hadoop.hive.ql.Driver>
> 62152 [main] INFO  o.a.h.h.q.Driver - Setting caller context to query id 
> nobody_20171128150454_9a730ff0-d279-4037-a1c9-4b3c28ff0a90
> 62152 [main] INFO  o.a.h.h.q.Driver - Starting 
> command(queryId=nobody_20171128150454_9a730ff0-d279-4037-a1c9-4b3c28ff0a90): 
> create database IF NOT EXISTS testdb location 
> 'raw:///tmp/junit8935893841909704896/testdb.db'
> 62152 [main] INFO  o.a.h.h.q.s.a.p.s.SQLStdHiveAccessController - Created 
> SQLStdHiveAccessController for session context : HiveAuthzSessionContext 
> [sessionString=75acc3af-f90e-4ab2-888c-a8ad29383128, clientType=HIVECLI]
> 62152 [main] INFO  h.metastore - Mestastore configuration 
> hive.metastore.filter.hook changed from 
> org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to 
> org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
> 62152 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: Shutting down the object 
> store...
> 62152 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=Shutting down the object store...   
> 62153 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: Metastore shutdown complete.
> 62153 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=Metastore shutdown complete.        
> 62153 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=runTasks 
> from=org.apache.hadoop.hive.ql.Driver>
> 62153 [main] INFO  o.a.h.h.q.Driver - Starting task [Stage-0:DDL] in serial 
> mode
> 62153 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: create_database: 
> Database(name:testdb, description:null, 
> locationUri:raw:/tmp/junit8935893841909704896/testdb.db, parameters:null, 
> ownerName:nobody, ownerType:USER)
> 62154 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=create_database: Database(name:testdb, description:null, 
> locationUri:raw:/tmp/junit8935893841909704896/testdb.db, parameters:null, 
> ownerName:nobody, ownerType:USER)        
> 62199 [main] WARN  o.a.h.h.c.HiveConf - HiveConf of name 
> hive.internal.ss.authz.settings.applied.marker does not exist
> 62200 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: Opening raw store with 
> implementation class:org.apache.hadoop.hive.metastore.ObjectStore
> 62201 [main] INFO  o.a.h.h.m.ObjectStore - ObjectStore, initialize called
> 62244 [main] WARN  o.a.h.h.c.HiveConf - HiveConf of name 
> hive.internal.ss.authz.settings.applied.marker does not exist
> 62246 [main] INFO  o.a.h.h.m.MetaStoreDirectSql - Using direct SQL, 
> underlying DB is DERBY
> 62246 [main] INFO  o.a.h.h.m.ObjectStore - Initialized ObjectStore
> 62247 [main] WARN  o.a.h.h.m.ObjectStore - Failed to get database testdb, 
> returning NoSuchObjectException
> 62247 [main] INFO  o.a.h.h.m.HiveMetaStore - create_database_core testdb
> 62247 [main] INFO  o.a.h.h.m.HiveMetaStore - create_database_core preEvent 
> testdb
> 62247 [main] INFO  o.a.h.h.m.HiveMetaStore - create_database_core filesystem 
> testdb
> 62247 [main] INFO  o.a.h.h.m.HiveMetaStore - create_database_core rdbms testdb
> 62256 [main] INFO  o.a.h.h.m.HiveMetaStore - create_database_core rdbms 
> committed testdb
> 62257 [main] INFO  o.a.h.h.m.HiveMetaStore - create_database_core listeners 
> testdb
> 62257 [main] INFO  o.a.h.h.m.HiveMetaStore - create_database_core listeners 
> done testdb
> 62257 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=runTasks 
> start=1511881494583 end=1511881494687 duration=104 
> from=org.apache.hadoop.hive.ql.Driver>
> 62257 [main] INFO  o.a.h.h.q.Driver - Resetting the caller context to 
> 62257 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=Driver.execute 
> start=1511881494581 end=1511881494687 duration=106 
> from=org.apache.hadoop.hive.ql.Driver>
> 62257 [main] INFO  o.a.h.h.q.Driver - OK
> 62257 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=releaseLocks 
> from=org.apache.hadoop.hive.ql.Driver>
> 62257 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=releaseLocks 
> start=1511881494687 end=1511881494687 duration=0 
> from=org.apache.hadoop.hive.ql.Driver>
> 62257 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=Driver.run 
> start=1511881494578 end=1511881494687 duration=109 
> from=org.apache.hadoop.hive.ql.Driver>
> 62257 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=Driver.run 
> from=org.apache.hadoop.hive.ql.Driver>
> 62257 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=compile 
> from=org.apache.hadoop.hive.ql.Driver>
> 62257 [main] INFO  o.a.h.h.q.Driver - We are setting the hadoop caller 
> context from  to nobody_20171128150454_8b6fa569-5a71-41c5-b207-df78c7b738c2
> 62257 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=parse 
> from=org.apache.hadoop.hive.ql.Driver>
> 62258 [main] INFO  h.q.p.ParseDriver - Parsing command: use testdb
> 62258 [main] INFO  h.q.p.ParseDriver - Parse Completed
> 62258 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=parse 
> start=1511881494687 end=1511881494688 duration=1 
> from=org.apache.hadoop.hive.ql.Driver>
> 62258 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=semanticAnalyze 
> from=org.apache.hadoop.hive.ql.Driver>
> 62258 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: get_database: testdb
> 62258 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=get_database: testdb        
> 62259 [main] INFO  o.a.h.h.q.Driver - Semantic Analysis Completed
> 62259 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=semanticAnalyze 
> start=1511881494688 end=1511881494689 duration=1 
> from=org.apache.hadoop.hive.ql.Driver>
> 62259 [main] INFO  o.a.h.h.q.Driver - Returning Hive schema: 
> Schema(fieldSchemas:null, properties:null)
> 62259 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=compile 
> start=1511881494687 end=1511881494689 duration=2 
> from=org.apache.hadoop.hive.ql.Driver>
> 62259 [main] INFO  o.a.h.h.q.Driver - We are resetting the hadoop caller 
> context to 
> 62259 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG 
> method=acquireReadWriteLocks from=org.apache.hadoop.hive.ql.Driver>
> 62259 [main] INFO  o.a.h.h.q.l.DbTxnManager - Setting lock request 
> transaction to txnid:0 for 
> queryId=nobody_20171128150454_8b6fa569-5a71-41c5-b207-df78c7b738c2
> 62259 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG 
> method=acquireReadWriteLocks start=1511881494689 end=1511881494689 duration=0 
> from=org.apache.hadoop.hive.ql.Driver>
> 62259 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=Driver.execute 
> from=org.apache.hadoop.hive.ql.Driver>
> 62259 [main] INFO  o.a.h.h.q.Driver - Setting caller context to query id 
> nobody_20171128150454_8b6fa569-5a71-41c5-b207-df78c7b738c2
> 62259 [main] INFO  o.a.h.h.q.Driver - Starting 
> command(queryId=nobody_20171128150454_8b6fa569-5a71-41c5-b207-df78c7b738c2): 
> create database IF NOT EXISTS testdb location 
> 'raw:///tmp/junit8935893841909704896/testdb.db'
> 62259 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=runTasks 
> from=org.apache.hadoop.hive.ql.Driver>
> 62259 [main] INFO  o.a.h.h.q.Driver - Starting task [Stage-0:DDL] in serial 
> mode
> 62260 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: get_database: testdb
> 62260 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=get_database: testdb        
> 62260 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: get_database: testdb
> 62260 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=get_database: testdb        
> 62261 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=runTasks 
> start=1511881494689 end=1511881494691 duration=2 
> from=org.apache.hadoop.hive.ql.Driver>
> 62261 [main] INFO  o.a.h.h.q.Driver - Resetting the caller context to 
> 62261 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=Driver.execute 
> start=1511881494689 end=1511881494691 duration=2 
> from=org.apache.hadoop.hive.ql.Driver>
> 62261 [main] INFO  o.a.h.h.q.Driver - OK
> 62261 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=releaseLocks 
> from=org.apache.hadoop.hive.ql.Driver>
> 62261 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=releaseLocks 
> start=1511881494691 end=1511881494691 duration=0 
> from=org.apache.hadoop.hive.ql.Driver>
> 62261 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=Driver.run 
> start=1511881494687 end=1511881494691 duration=4 
> from=org.apache.hadoop.hive.ql.Driver>
> 62261 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=Driver.run 
> from=org.apache.hadoop.hive.ql.Driver>
> 62261 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=compile 
> from=org.apache.hadoop.hive.ql.Driver>
> 62261 [main] INFO  o.a.h.h.q.Driver - We are setting the hadoop caller 
> context from  to nobody_20171128150454_f8e917c9-4cbc-4512-b93a-65db6c890fa3
> 62261 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=parse 
> from=org.apache.hadoop.hive.ql.Driver>
> 62261 [main] INFO  h.q.p.ParseDriver - Parsing command: create table 
> test_table ( id int,msg string ) partitioned by (city string,state string ) 
> clustered by ( id ) into 10 buckets  stored as orc  location 
> 'raw:///tmp/junit8935893841909704896/testdb.db/test_table' TBLPROPERTIES 
> ('transactional'='true')
> 62262 [main] INFO  h.q.p.ParseDriver - Parse Completed
> 62262 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=parse 
> start=1511881494691 end=1511881494692 duration=1 
> from=org.apache.hadoop.hive.ql.Driver>
> 62262 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=semanticAnalyze 
> from=org.apache.hadoop.hive.ql.Driver>
> 62262 [main] INFO  o.a.h.h.q.p.CalcitePlanner - Starting Semantic Analysis
> 62262 [main] INFO  o.a.h.h.q.p.CalcitePlanner - Creating table 
> testdb.test_table position=13
> 62263 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: get_database: testdb
> 62263 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=get_database: testdb        
> 62263 [main] INFO  o.a.h.h.q.Driver - Semantic Analysis Completed
> 62263 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=semanticAnalyze 
> start=1511881494692 end=1511881494693 duration=1 
> from=org.apache.hadoop.hive.ql.Driver>
> 62263 [main] INFO  o.a.h.h.q.Driver - Returning Hive schema: 
> Schema(fieldSchemas:null, properties:null)
> 62263 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=compile 
> start=1511881494691 end=1511881494693 duration=2 
> from=org.apache.hadoop.hive.ql.Driver>
> 62264 [main] INFO  o.a.h.h.q.Driver - We are resetting the hadoop caller 
> context to 
> 62264 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG 
> method=acquireReadWriteLocks from=org.apache.hadoop.hive.ql.Driver>
> 62264 [main] INFO  o.a.h.h.q.l.DbTxnManager - Setting lock request 
> transaction to txnid:0 for 
> queryId=nobody_20171128150454_f8e917c9-4cbc-4512-b93a-65db6c890fa3
> 62264 [main] INFO  o.a.h.h.q.l.DbLockManager - Requesting: 
> queryId=nobody_20171128150454_f8e917c9-4cbc-4512-b93a-65db6c890fa3 
> LockRequest(component:[LockComponent(type:SHARED_READ, level:DB, 
> dbname:testdb, operationType:NO_TXN, isDynamicPartitionWrite:false)], 
> txnid:0, user:nobody, 
> hostname:ctr-e134-1499953498516-345859-01-000003.hwx.site, 
> agentInfo:nobody_20171128150454_f8e917c9-4cbc-4512-b93a-65db6c890fa3)
> 62283 [main] INFO  o.a.h.h.q.l.DbLockManager - Response to 
> queryId=nobody_20171128150454_f8e917c9-4cbc-4512-b93a-65db6c890fa3 
> LockResponse(lockid:1, state:ACQUIRED)
> 62283 [main] INFO  o.a.h.h.q.l.DbTxnManager - Started heartbeat with 
> delay/interval = 150000/150000 MILLISECONDS for query: 
> nobody_20171128150454_f8e917c9-4cbc-4512-b93a-65db6c890fa3
> 62283 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG 
> method=acquireReadWriteLocks start=1511881494694 end=1511881494713 
> duration=19 from=org.apache.hadoop.hive.ql.Driver>
> 62283 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=Driver.execute 
> from=org.apache.hadoop.hive.ql.Driver>
> 62283 [main] INFO  o.a.h.h.q.Driver - Setting caller context to query id 
> nobody_20171128150454_f8e917c9-4cbc-4512-b93a-65db6c890fa3
> 62283 [main] INFO  o.a.h.h.q.Driver - Starting 
> command(queryId=nobody_20171128150454_f8e917c9-4cbc-4512-b93a-65db6c890fa3): 
> create database IF NOT EXISTS testdb location 
> 'raw:///tmp/junit8935893841909704896/testdb.db'
> 62283 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=runTasks 
> from=org.apache.hadoop.hive.ql.Driver>
> 62283 [main] INFO  o.a.h.h.q.Driver - Starting task [Stage-0:DDL] in serial 
> mode
> 62285 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: create_table: 
> Table(tableName:test_table, dbName:testdb, owner:nobody, 
> createTime:1511881494, lastAccessTime:0, retention:0, 
> sd:StorageDescriptor(cols:[FieldSchema(name:id, type:int, comment:null), 
> FieldSchema(name:msg, type:string, comment:null)], 
> location:raw:/tmp/junit8935893841909704896/testdb.db/test_table, 
> inputFormat:org.apache.hadoop.hive.ql.io.orc.OrcInputFormat, 
> outputFormat:org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat, 
> compressed:false, numBuckets:10, serdeInfo:SerDeInfo(name:null, 
> serializationLib:org.apache.hadoop.hive.ql.io.orc.OrcSerde, 
> parameters:{serialization.format=1}), bucketCols:[id], sortCols:[], 
> parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], 
> skewedColValueLocationMaps:{}), storedAsSubDirectories:false), 
> partitionKeys:[FieldSchema(name:city, type:string, comment:null), 
> FieldSchema(name:state, type:string, comment:null)], 
> parameters:{transactional=true}, viewOriginalText:null, 
> viewExpandedText:null, tableType:MANAGED_TABLE, 
> privileges:PrincipalPrivilegeSet(userPrivileges:{nobody=[PrivilegeGrantInfo(privilege:INSERT,
>  createTime:-1, grantor:nobody, grantorType:USER, grantOption:true), 
> PrivilegeGrantInfo(privilege:SELECT, createTime:-1, grantor:nobody, 
> grantorType:USER, grantOption:true), PrivilegeGrantInfo(privilege:UPDATE, 
> createTime:-1, grantor:nobody, grantorType:USER, grantOption:true), 
> PrivilegeGrantInfo(privilege:DELETE, createTime:-1, grantor:nobody, 
> grantorType:USER, grantOption:true)]}, groupPrivileges:null, 
> rolePrivileges:null), temporary:false)
> 62285 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=create_table: Table(tableName:test_table, dbName:testdb, owner:nobody, 
> createTime:1511881494, lastAccessTime:0, retention:0, 
> sd:StorageDescriptor(cols:[FieldSchema(name:id, type:int, comment:null), 
> FieldSchema(name:msg, type:string, comment:null)], 
> location:raw:/tmp/junit8935893841909704896/testdb.db/test_table, 
> inputFormat:org.apache.hadoop.hive.ql.io.orc.OrcInputFormat, 
> outputFormat:org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat, 
> compressed:false, numBuckets:10, serdeInfo:SerDeInfo(name:null, 
> serializationLib:org.apache.hadoop.hive.ql.io.orc.OrcSerde, 
> parameters:{serialization.format=1}), bucketCols:[id], sortCols:[], 
> parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], 
> skewedColValueLocationMaps:{}), storedAsSubDirectories:false), 
> partitionKeys:[FieldSchema(name:city, type:string, comment:null), 
> FieldSchema(name:state, type:string, comment:null)], 
> parameters:{transactional=true}, viewOriginalText:null, 
> viewExpandedText:null, tableType:MANAGED_TABLE, 
> privileges:PrincipalPrivilegeSet(userPrivileges:{nobody=[PrivilegeGrantInfo(privilege:INSERT,
>  createTime:-1, grantor:nobody, grantorType:USER, grantOption:true), 
> PrivilegeGrantInfo(privilege:SELECT, createTime:-1, grantor:nobody, 
> grantorType:USER, grantOption:true), PrivilegeGrantInfo(privilege:UPDATE, 
> createTime:-1, grantor:nobody, grantorType:USER, grantOption:true), 
> PrivilegeGrantInfo(privilege:DELETE, createTime:-1, grantor:nobody, 
> grantorType:USER, grantOption:true)]}, groupPrivileges:null, 
> rolePrivileges:null), temporary:false) 
> 62285 [main] INFO  o.a.h.h.m.HiveMetaStore - create_table_core 
> testdb.test_table
> 62285 [main] INFO  o.a.h.h.m.HiveMetaStore - create_table_core preEvent 
> testdb.test_table
> 62285 [main] INFO  o.a.h.h.m.HiveMetaStore - create_table_core rdbms checks 
> testdb.test_table
> 62286 [main] INFO  o.a.h.h.m.HiveMetaStore - create_table_core filesystem 
> testdb.test_table
> 62286 [main] WARN  o.a.h.h.m.HiveMetaStore - Location: 
> raw:/tmp/junit8935893841909704896/testdb.db/test_table specified for 
> non-external table:test_table
> 62286 [main] INFO  o.a.h.h.c.FileUtils - Creating directory if it doesn't 
> exist: raw:/tmp/junit8935893841909704896/testdb.db/test_table
> 62288 [main] INFO  o.a.h.h.m.HiveMetaStore - create_table_core stats 
> testdb.test_table
> 62288 [main] INFO  o.a.h.h.m.HiveMetaStore - create_table_core rdbms create 
> testdb.test_table
> 62324 [main] INFO  o.a.h.h.m.HiveMetaStore - create_table_core rdbms 
> committed testdb.test_table
> 62324 [main] INFO  o.a.h.h.m.HiveMetaStore - create_table_core rdbms 
> listeners testdb.test_table
> 62324 [main] INFO  o.a.h.h.m.HiveMetaStore - create_table_core rdbms 
> listeners done testdb.test_table
> 62324 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=runTasks 
> start=1511881494713 end=1511881494754 duration=41 
> from=org.apache.hadoop.hive.ql.Driver>
> 62324 [main] INFO  o.a.h.h.q.Driver - Resetting the caller context to 
> 62324 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=Driver.execute 
> start=1511881494713 end=1511881494754 duration=41 
> from=org.apache.hadoop.hive.ql.Driver>
> 62324 [main] INFO  o.a.h.h.q.Driver - OK
> 62324 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=releaseLocks 
> from=org.apache.hadoop.hive.ql.Driver>
> 62324 [main] INFO  o.a.h.h.q.l.DbTxnManager - Stopped heartbeat for query: 
> nobody_20171128150454_f8e917c9-4cbc-4512-b93a-65db6c890fa3
> 62324 [main] INFO  o.a.h.h.q.l.DbLockManager - releaseLocks: [lockid:1 
> queryId=nobody_20171128150454_f8e917c9-4cbc-4512-b93a-65db6c890fa3 txnid:0]
> 62328 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=releaseLocks 
> start=1511881494754 end=1511881494758 duration=4 
> from=org.apache.hadoop.hive.ql.Driver>
> 62328 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=Driver.run 
> start=1511881494691 end=1511881494758 duration=67 
> from=org.apache.hadoop.hive.ql.Driver>
> 62328 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=Driver.run 
> from=org.apache.hadoop.hive.ql.Driver>
> 62329 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=compile 
> from=org.apache.hadoop.hive.ql.Driver>
> 62329 [main] INFO  o.a.h.h.q.Driver - We are setting the hadoop caller 
> context from  to nobody_20171128150454_ae26eba9-11ee-41a6-9632-4b64134ecb73
> 62329 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=parse 
> from=org.apache.hadoop.hive.ql.Driver>
> 62329 [main] INFO  h.q.p.ParseDriver - Parsing command: alter table 
> test_table add partition ( city = 'sunnyvale',state = 'ca' )
> 62329 [main] INFO  h.q.p.ParseDriver - Parse Completed
> 62329 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=parse 
> start=1511881494759 end=1511881494759 duration=0 
> from=org.apache.hadoop.hive.ql.Driver>
> 62329 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=semanticAnalyze 
> from=org.apache.hadoop.hive.ql.Driver>
> 62330 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: get_table : db=testdb 
> tbl=test_table
> 62330 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=get_table : db=testdb tbl=test_table        
> 62374 [main] INFO  o.a.h.h.q.Driver - Semantic Analysis Completed
> 62374 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=semanticAnalyze 
> start=1511881494759 end=1511881494804 duration=45 
> from=org.apache.hadoop.hive.ql.Driver>
> 62374 [main] INFO  o.a.h.h.q.Driver - Returning Hive schema: 
> Schema(fieldSchemas:null, properties:null)
> 62374 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=compile 
> start=1511881494759 end=1511881494804 duration=45 
> from=org.apache.hadoop.hive.ql.Driver>
> 62374 [main] INFO  o.a.h.h.q.Driver - We are resetting the hadoop caller 
> context to 
> 62374 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG 
> method=acquireReadWriteLocks from=org.apache.hadoop.hive.ql.Driver>
> 62374 [main] INFO  o.a.h.h.q.l.DbTxnManager - Setting lock request 
> transaction to txnid:0 for 
> queryId=nobody_20171128150454_ae26eba9-11ee-41a6-9632-4b64134ecb73
> 62374 [main] INFO  o.a.h.h.q.l.DbLockManager - Requesting: 
> queryId=nobody_20171128150454_ae26eba9-11ee-41a6-9632-4b64134ecb73 
> LockRequest(component:[LockComponent(type:SHARED_READ, level:TABLE, 
> dbname:testdb, tablename:test_table, operationType:NO_TXN, isAcid:true, 
> isDynamicPartitionWrite:false)], txnid:0, user:nobody, 
> hostname:ctr-e134-1499953498516-345859-01-000003.hwx.site, 
> agentInfo:nobody_20171128150454_ae26eba9-11ee-41a6-9632-4b64134ecb73)
> 62386 [main] INFO  o.a.h.h.q.l.DbLockManager - Response to 
> queryId=nobody_20171128150454_ae26eba9-11ee-41a6-9632-4b64134ecb73 
> LockResponse(lockid:2, state:ACQUIRED)
> 62386 [main] INFO  o.a.h.h.q.l.DbTxnManager - Started heartbeat with 
> delay/interval = 150000/150000 MILLISECONDS for query: 
> nobody_20171128150454_ae26eba9-11ee-41a6-9632-4b64134ecb73
> 62386 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG 
> method=acquireReadWriteLocks start=1511881494804 end=1511881494816 
> duration=12 from=org.apache.hadoop.hive.ql.Driver>
> 62386 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=Driver.execute 
> from=org.apache.hadoop.hive.ql.Driver>
> 62386 [main] INFO  o.a.h.h.q.Driver - Setting caller context to query id 
> nobody_20171128150454_ae26eba9-11ee-41a6-9632-4b64134ecb73
> 62386 [main] INFO  o.a.h.h.q.Driver - Starting 
> command(queryId=nobody_20171128150454_ae26eba9-11ee-41a6-9632-4b64134ecb73): 
> create database IF NOT EXISTS testdb location 
> 'raw:///tmp/junit8935893841909704896/testdb.db'
> 62386 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=runTasks 
> from=org.apache.hadoop.hive.ql.Driver>
> 62387 [main] INFO  o.a.h.h.q.Driver - Starting task [Stage-0:DDL] in serial 
> mode
> 62387 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: get_table : db=testdb 
> tbl=test_table
> 62387 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=get_table : db=testdb tbl=test_table        
> 62393 [main] INFO  o.a.h.h.m.HiveMetaStore - 0: add_partitions
> 62393 [main] INFO  o.a.h.h.m.H.audit - ugi=nobody     ip=unknown-ip-addr      
> cmd=add_partitions      
> 62404 [HMSHandler #3] INFO  o.a.h.h.c.FileUtils - Creating directory if it 
> doesn't exist: 
> raw:/tmp/junit8935893841909704896/testdb.db/test_table/city=sunnyvale/state=ca
> 62422 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=runTasks 
> start=1511881494816 end=1511881494852 duration=36 
> from=org.apache.hadoop.hive.ql.Driver>
> 62422 [main] INFO  o.a.h.h.q.Driver - Resetting the caller context to 
> 62422 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=Driver.execute 
> start=1511881494816 end=1511881494852 duration=36 
> from=org.apache.hadoop.hive.ql.Driver>
> 62422 [main] INFO  o.a.h.h.q.Driver - OK
> 62422 [main] INFO  o.a.h.h.q.l.PerfLogger - <PERFLOG method=releaseLocks 
> from=org.apache.hadoop.hive.ql.Driver>
> 62422 [main] INFO  o.a.h.h.q.l.DbTxnManager - Stopped heartbeat for query: 
> nobody_20171128150454_ae26eba9-11ee-41a6-9632-4b64134ecb73
> 62422 [main] INFO  o.a.h.h.q.l.DbLockManager - releaseLocks: [lockid:2 
> queryId=nobody_20171128150454_ae26eba9-11ee-41a6-9632-4b64134ecb73 txnid:0]
> 62426 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=releaseLocks 
> start=1511881494852 end=1511881494856 duration=4 
> from=org.apache.hadoop.hive.ql.Driver>
> 62426 [main] INFO  o.a.h.h.q.l.PerfLogger - </PERFLOG method=Driver.run 
> start=1511881494758 end=1511881494856 duration=98 
> from=org.apache.hadoop.hive.ql.Driver>
> Standard Error
> OK
> OK
> OK
> OK{code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)