Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-5.3/17/

1 test failed.
FAILED:  org.apache.solr.cloud.hdfs.StressHdfsTest.test

Error Message:
No registered leader was found after waiting for 30000ms , collection: delete_data_dir slice: shard4

Stack Trace:
org.apache.solr.common.SolrException: No registered leader was found after waiting for 30000ms , collection: delete_data_dir slice: shard4
        at __randomizedtesting.SeedInfo.seed([7141961A233E85D7:F915A9C08DC2E82F]:0)
        at org.apache.solr.common.cloud.ZkStateReader.getLeaderRetry(ZkStateReader.java:637)
        at org.apache.solr.cloud.hdfs.StressHdfsTest.createAndDeleteCollection(StressHdfsTest.java:159)
        at org.apache.solr.cloud.hdfs.StressHdfsTest.test(StressHdfsTest.java:98)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
        at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:963)
        at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:938)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
        at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
        at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
        at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
        at java.lang.Thread.run(Thread.java:745)




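A note on the failure mode above: the trace ends in ZkStateReader.getLeaderRetry, which in essence keeps re-checking the ZooKeeper-backed cluster state for a leader replica of the requested slice and gives up once the timeout (30000ms here) elapses without one being registered. The sketch below is not Solr's implementation; it only illustrates that poll-until-timeout pattern, with a plain in-memory map standing in for the cluster state and all class, member, and exception names made up for the example.

    // LeaderWaitSketch.java -- minimal, hypothetical sketch of the wait-for-leader
    // pattern behind "No registered leader was found after waiting for 30000ms".
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public class LeaderWaitSketch {

        // Stand-in for the cluster state a ZkStateReader would keep updated via ZooKeeper watches.
        static final Map<String, String> LEADERS = new ConcurrentHashMap<>();

        // Returns the leader for collection/slice, re-checking until timeoutMs elapses.
        static String getLeaderRetry(String collection, String slice, long timeoutMs)
                throws InterruptedException {
            long deadline = System.nanoTime() + timeoutMs * 1_000_000L;
            while (System.nanoTime() < deadline) {
                String leader = LEADERS.get(collection + "/" + slice);
                if (leader != null) {
                    return leader;                 // a leader registered in time
                }
                Thread.sleep(50);                  // back off briefly before re-checking
            }
            throw new IllegalStateException("No registered leader was found after waiting for "
                    + timeoutMs + "ms , collection: " + collection + " slice: " + slice);
        }

        public static void main(String[] args) throws InterruptedException {
            // With no leader ever registered (as in the failed test), this times out.
            try {
                getLeaderRetry("delete_data_dir", "shard4", 500);
            } catch (IllegalStateException e) {
                System.out.println(e.getMessage());
            }
        }
    }
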
Build Log:
[...truncated 10969 lines...]
   [junit4] JVM J1: stdout was not empty, see: /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/temp/junit4-J1-20160122_142625_681.sysout
   [junit4] >>> JVM J1: stdout (verbatim) ----
   [junit4] java.lang.OutOfMemoryError: GC overhead limit exceeded
   [junit4] Dumping heap to /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/heapdumps/java_pid2447.hprof
 ...
   [junit4] Heap dump file created [608644680 bytes in 24.688 secs]
   [junit4] <<< JVM J1: EOF ----

   [junit4] JVM J1: stderr was not empty, see: /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/temp/junit4-J1-20160122_142625_681.syserr
   [junit4] >>> JVM J1: stderr (verbatim) ----
   [junit4] WARN: Unhandled exception in event serialization. -> java.lang.OutOfMemoryError: GC overhead limit exceeded
   [junit4]     at java.nio.CharBuffer.wrap(CharBuffer.java:369)
   [junit4]     at sun.nio.cs.StreamEncoder.implWrite(StreamEncoder.java:265)
   [junit4]     at sun.nio.cs.StreamEncoder.write(StreamEncoder.java:125)
   [junit4]     at sun.nio.cs.StreamEncoder.write(StreamEncoder.java:113)
   [junit4]     at java.io.OutputStreamWriter.write(OutputStreamWriter.java:194)
   [junit4]     at com.carrotsearch.ant.tasks.junit4.dependencies.com.google.gson.stream.JsonWriter.string(JsonWriter.java:535)
   [junit4]     at com.carrotsearch.ant.tasks.junit4.dependencies.com.google.gson.stream.JsonWriter.value(JsonWriter.java:364)
   [junit4]     at com.carrotsearch.ant.tasks.junit4.dependencies.com.google.gson.internal.bind.TypeAdapters$22.write(TypeAdapters.java:626)
   [junit4]     at com.carrotsearch.ant.tasks.junit4.dependencies.com.google.gson.internal.bind.TypeAdapters$22.write(TypeAdapters.java:578)
   [junit4]     at com.carrotsearch.ant.tasks.junit4.dependencies.com.google.gson.internal.Streams.write(Streams.java:67)
   [junit4]     at com.carrotsearch.ant.tasks.junit4.dependencies.com.google.gson.GsonToMiniGsonTypeAdapterFactory$3.write(GsonToMiniGsonTypeAdapterFactory.java:98)
   [junit4]     at com.carrotsearch.ant.tasks.junit4.dependencies.com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper.write(TypeAdapterRuntimeTypeWrapper.java:66)
   [junit4]     at com.carrotsearch.ant.tasks.junit4.dependencies.com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.write(ReflectiveTypeAdapterFactory.java:82)
   [junit4]     at com.carrotsearch.ant.tasks.junit4.dependencies.com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.write(ReflectiveTypeAdapterFactory.java:194)
   [junit4]     at com.carrotsearch.ant.tasks.junit4.dependencies.com.google.gson.Gson.toJson(Gson.java:512)
   [junit4]     at com.carrotsearch.ant.tasks.junit4.events.Serializer.serialize(Serializer.java:87)
   [junit4]     at com.carrotsearch.ant.tasks.junit4.slave.SlaveMain$4.write(SlaveMain.java:410)
   [junit4]     at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
   [junit4]     at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
   [junit4]     at java.io.PrintStream.flush(PrintStream.java:338)
   [junit4]     at java.io.FilterOutputStream.flush(FilterOutputStream.java:140)
   [junit4]     at java.io.PrintStream.write(PrintStream.java:482)
   [junit4]     at sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:221)
   [junit4]     at sun.nio.cs.StreamEncoder.implFlushBuffer(StreamEncoder.java:291)
   [junit4]     at sun.nio.cs.StreamEncoder.implFlush(StreamEncoder.java:295)
   [junit4]     at sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:141)
   [junit4]     at java.io.OutputStreamWriter.flush(OutputStreamWriter.java:229)
   [junit4]     at org.apache.log4j.helpers.QuietWriter.flush(QuietWriter.java:59)
   [junit4]     at org.apache.log4j.WriterAppender.subAppend(WriterAppender.java:324)
   [junit4]     at org.apache.log4j.WriterAppender.append(WriterAppender.java:162)
   [junit4]     at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
   [junit4]     at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
   [junit4] <<< JVM J1: EOF ----

[...truncated 641 lines...]
   [junit4] Suite: org.apache.solr.cloud.hdfs.StressHdfsTest
   [junit4]   2> Creating dataDir: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/init-core-data-001
   [junit4]   2> 9642014 INFO  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] 
o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: 
/ab_m/wo
   [junit4]   1> Formatting using clusterid: testClusterID
   [junit4]   2> 9642100 WARN  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] 
o.a.h.m.i.MetricsConfig Cannot locate configuration: tried 
hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
   [junit4]   2> 9642115 WARN  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] 
o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 9642118 INFO  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] o.m.log 
jetty-6.1.26
   [junit4]   2> 9642158 INFO  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] o.m.log Extract 
jar:file:/x1/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-2.6.0-tests.jar!/webapps/hdfs
 to ./temp/Jetty_localhost_34535_hdfs____y9njeo/webapp
   [junit4]   2> 9642523 INFO  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] o.m.log NO JSP 
Support for /, did not find org.apache.jasper.servlet.JspServlet
   [junit4]   2> 9643008 INFO  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] o.m.log Started 
HttpServer2$SelectChannelConnectorWithSafeStartup@localhost:34535
   [junit4]   2> 9643297 WARN  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] 
o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 9643299 INFO  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] o.m.log 
jetty-6.1.26
   [junit4]   2> 9643329 INFO  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] o.m.log Extract 
jar:file:/x1/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-2.6.0-tests.jar!/webapps/datanode
 to ./temp/Jetty_localhost_56341_datanode____ld00ly/webapp
   [junit4]   2> 9643693 INFO  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] o.m.log NO JSP 
Support for /, did not find org.apache.jasper.servlet.JspServlet
   [junit4]   2> 9644097 INFO  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] o.m.log Started 
HttpServer2$SelectChannelConnectorWithSafeStartup@localhost:56341
   [junit4]   2> 9644513 WARN  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] 
o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 9644514 INFO  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] o.m.log 
jetty-6.1.26
   [junit4]   2> 9644567 INFO  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] o.m.log Extract 
jar:file:/x1/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-2.6.0-tests.jar!/webapps/datanode
 to ./temp/Jetty_localhost_34204_datanode____.ck5otg/webapp
   [junit4]   2> 9644653 INFO  (IPC Server handler 4 on 43763) [    ] 
BlockStateChange BLOCK* processReport: from storage 
DS-5943d48a-93bf-4f50-acb0-9efc92284236 node DatanodeRegistration(127.0.0.1, 
datanodeUuid=e880bdf6-3cde-4875-8e90-fa54f39c1df3, infoPort=56341, 
ipcPort=39414, storageInfo=lv=-56;cid=testClusterID;nsid=1875309937;c=0), 
blocks: 0, hasStaleStorages: true, processing time: 0 msecs
   [junit4]   2> 9644654 INFO  (IPC Server handler 4 on 43763) [    ] 
BlockStateChange BLOCK* processReport: from storage 
DS-923ed43a-9fca-4afa-bb50-775d2d9b1f76 node DatanodeRegistration(127.0.0.1, 
datanodeUuid=e880bdf6-3cde-4875-8e90-fa54f39c1df3, infoPort=56341, 
ipcPort=39414, storageInfo=lv=-56;cid=testClusterID;nsid=1875309937;c=0), 
blocks: 0, hasStaleStorages: false, processing time: 0 msecs
   [junit4]   2> 9644995 INFO  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] o.m.log NO JSP 
Support for /, did not find org.apache.jasper.servlet.JspServlet
   [junit4]   2> 9645281 INFO  
(SUITE-StressHdfsTest-seed#[7141961A233E85D7]-worker) [    ] o.m.log Started 
HttpServer2$SelectChannelConnectorWithSafeStartup@localhost:34204
   [junit4]   2> 9645602 INFO  (IPC Server handler 6 on 43763) [    ] 
BlockStateChange BLOCK* processReport: from storage 
DS-8c9d9fdc-d28a-4c28-8e3b-9a66704a8388 node DatanodeRegistration(127.0.0.1, 
datanodeUuid=6cc931c2-449b-4cb1-8e87-8c39e4e40057, infoPort=34204, 
ipcPort=33010, storageInfo=lv=-56;cid=testClusterID;nsid=1875309937;c=0), 
blocks: 0, hasStaleStorages: true, processing time: 0 msecs
   [junit4]   2> 9645603 INFO  (IPC Server handler 6 on 43763) [    ] 
BlockStateChange BLOCK* processReport: from storage 
DS-e5a97b2e-64d8-470d-a759-43bbbfc49217 node DatanodeRegistration(127.0.0.1, 
datanodeUuid=6cc931c2-449b-4cb1-8e87-8c39e4e40057, infoPort=34204, 
ipcPort=33010, storageInfo=lv=-56;cid=testClusterID;nsid=1875309937;c=0), 
blocks: 0, hasStaleStorages: false, processing time: 0 msecs
   [junit4]   2> 9645720 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.ZkTestServer 
STARTING ZK TEST SERVER
   [junit4]   2> 9645727 INFO  (Thread-103172) [    ] o.a.s.c.ZkTestServer 
client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 9645727 INFO  (Thread-103172) [    ] o.a.s.c.ZkTestServer 
Starting server
   [junit4]   2> 9645823 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.ZkTestServer 
start zk server on port:60619
   [junit4]   2> 9645824 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient Using default ZkCredentialsProvider
   [junit4]   2> 9645855 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 9645899 INFO  (zkCallback-2848-thread-1) [    ] 
o.a.s.c.c.ConnectionManager Watcher 
org.apache.solr.common.cloud.ConnectionManager@5bbff98b 
name:ZooKeeperConnection Watcher:127.0.0.1:60619 got event WatchedEvent 
state:SyncConnected type:None path:null path:null type:None
   [junit4]   2> 9645900 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 9645900 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient Using default ZkACLProvider
   [junit4]   2> 9645900 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: /solr
   [junit4]   2> 9645903 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient Using default ZkCredentialsProvider
   [junit4]   2> 9645932 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 9645971 INFO  (zkCallback-2849-thread-1) [    ] 
o.a.s.c.c.ConnectionManager Watcher 
org.apache.solr.common.cloud.ConnectionManager@16c3a841 
name:ZooKeeperConnection Watcher:127.0.0.1:60619/solr got event WatchedEvent 
state:SyncConnected type:None path:null path:null type:None
   [junit4]   2> 9645979 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 9645980 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient Using default ZkACLProvider
   [junit4]   2> 9645980 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: /collections/collection1
   [junit4]   2> 9645981 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: /collections/collection1/shards
   [junit4]   2> 9645982 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: /collections/control_collection
   [junit4]   2> 9645983 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: /collections/control_collection/shards
   [junit4]   2> 9645985 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml
 to /configs/conf1/solrconfig.xml
   [junit4]   2> 9645985 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/solrconfig.xml
   [junit4]   2> 9645987 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/src/test-files/solr/collection1/conf/schema.xml
 to /configs/conf1/schema.xml
   [junit4]   2> 9645987 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/schema.xml
   [junit4]   2> 9645988 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml
 to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 9645989 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: 
/configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 9645990 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/src/test-files/solr/collection1/conf/stopwords.txt
 to /configs/conf1/stopwords.txt
   [junit4]   2> 9645990 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/stopwords.txt
   [junit4]   2> 9645991 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/src/test-files/solr/collection1/conf/protwords.txt
 to /configs/conf1/protwords.txt
   [junit4]   2> 9645991 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/protwords.txt
   [junit4]   2> 9645993 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/src/test-files/solr/collection1/conf/currency.xml
 to /configs/conf1/currency.xml
   [junit4]   2> 9645993 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/currency.xml
   [junit4]   2> 9645994 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml
 to /configs/conf1/enumsConfig.xml
   [junit4]   2> 9645994 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/enumsConfig.xml
   [junit4]   2> 9645996 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json
 to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 9645996 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/open-exchange-rates.json
   [junit4]   2> 9645997 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt
 to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 9645997 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 9645998 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt
 to /configs/conf1/old_synonyms.txt
   [junit4]   2> 9645999 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/old_synonyms.txt
   [junit4]   2> 9646000 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/src/test-files/solr/collection1/conf/synonyms.txt
 to /configs/conf1/synonyms.txt
   [junit4]   2> 9646000 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/synonyms.txt
   [junit4]   2> 9646452 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.SolrTestCaseJ4 
Writing core.properties file to 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/control-001/cores/collection1
   [junit4]   2> 9646459 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.e.j.s.Server 
jetty-9.2.11.v20150529
   [junit4]   2> 9646555 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.e.j.s.h.ContextHandler Started 
o.e.j.s.ServletContextHandler@3a3aaed8{/ab_m/wo,null,AVAILABLE}
   [junit4]   2> 9646556 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.e.j.s.ServerConnector Started 
ServerConnector@62ad04c5{HTTP/1.1}{127.0.0.1:37899}
   [junit4]   2> 9646556 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.e.j.s.Server 
Started @9649713ms
   [junit4]   2> 9646556 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostPort=37899, 
solr.data.dir=hdfs://localhost:43763/hdfs__localhost_43763__x1_jenkins_jenkins-slave_workspace_Lucene-Solr-NightlyTests-5.3_solr_build_solr-core_test_J0_temp_solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001_tempDir-002_control_data,
 hostContext=/ab_m/wo, 
coreRootDirectory=/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/control-001/cores}
   [junit4]   2> 9646557 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.s.SolrDispatchFilter SolrDispatchFilter.init(): 
sun.misc.Launcher$AppClassLoader@7b3cb2c6
   [junit4]   2> 9646557 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.SolrResourceLoader new SolrResourceLoader for directory: 
'/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/control-001/'
   [junit4]   2> 9646580 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient Using default ZkCredentialsProvider
   [junit4]   2> 9646611 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 9646643 INFO  (zkCallback-2850-thread-1) [    ] 
o.a.s.c.c.ConnectionManager Watcher 
org.apache.solr.common.cloud.ConnectionManager@67fbefe2 
name:ZooKeeperConnection Watcher:127.0.0.1:60619/solr got event WatchedEvent 
state:SyncConnected type:None path:null path:null type:None
   [junit4]   2> 9646644 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 9646644 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient Using default ZkACLProvider
   [junit4]   2> 9646645 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in 
ZooKeeper)
   [junit4]   2> 9646645 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.SolrXmlConfig 
Loading container configuration from 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/control-001/solr.xml
   [junit4]   2> 9646662 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.CoresLocator 
Config-defined core root directory: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/control-001/cores
   [junit4]   2> 9646663 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.CoreContainer 
New CoreContainer 1046115022
   [junit4]   2> 9646663 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.CoreContainer 
Loading cores into CoreContainer 
[instanceDir=/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/control-001/]
   [junit4]   2> 9646663 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.CoreContainer 
loading shared library: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/control-001/lib
   [junit4]   2> 9646667 WARN  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.SolrResourceLoader Can't find (or read) directory to add to 
classloader: lib (resolved as: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/control-001/lib).
   [junit4]   2> 9646683 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.h.c.HttpShardHandlerFactory created with socketTimeout : 90000,urlScheme 
: ,connTimeout : 15000,maxConnectionsPerHost : 20,maxConnections : 
10000,corePoolSize : 0,maximumPoolSize : 2147483647,maxThreadIdleTime : 
5,sizeOfQueue : -1,fairnessPolicy : false,useRetries : false,
   [junit4]   2> 9646685 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.u.UpdateShardHandler Creating UpdateShardHandler HTTP client with params: 
socketTimeout=340000&connTimeout=45000&retry=true
   [junit4]   2> 9646685 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.l.LogWatcher 
SLF4J impl is org.slf4j.impl.Log4jLoggerFactory
   [junit4]   2> 9646685 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.l.LogWatcher 
Registering Log Listener [Log4j (org.slf4j.impl.Log4jLoggerFactory)]
   [junit4]   2> 9646685 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.ZkContainer 
Zookeeper client=127.0.0.1:60619/solr
   [junit4]   2> 9646686 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.ZkController 
zkHost includes chroot
   [junit4]   2> 9646686 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient Using default ZkCredentialsProvider
   [junit4]   2> 9646716 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 9646745 INFO  (zkCallback-2852-thread-1) [    ] 
o.a.s.c.c.ConnectionManager Watcher 
org.apache.solr.common.cloud.ConnectionManager@5d8afdc3 
name:ZooKeeperConnection Watcher:127.0.0.1:60619 got event WatchedEvent 
state:SyncConnected type:None path:null path:null type:None
   [junit4]   2> 9646747 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 9646747 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient Using default ZkACLProvider
   [junit4]   2> 9646844 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 9646855 INFO  
(zkCallback-2853-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.c.ConnectionManager Watcher 
org.apache.solr.common.cloud.ConnectionManager@1949bd0b 
name:ZooKeeperConnection Watcher:127.0.0.1:60619/solr got event WatchedEvent 
state:SyncConnected type:None path:null path:null type:None
   [junit4]   2> 9646856 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 9646857 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.SolrZkClient makePath: /overseer/queue
   [junit4]   2> 9646858 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.SolrZkClient makePath: /overseer/collection-queue-work
   [junit4]   2> 9646860 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.SolrZkClient makePath: /overseer/collection-map-running
   [junit4]   2> 9646861 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.SolrZkClient makePath: /overseer/collection-map-completed
   [junit4]   2> 9646863 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.SolrZkClient makePath: /overseer/collection-map-failure
   [junit4]   2> 9646864 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.SolrZkClient makePath: /live_nodes
   [junit4]   2> 9646865 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.SolrZkClient makePath: /aliases.json
   [junit4]   2> 9646866 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.SolrZkClient makePath: /clusterstate.json
   [junit4]   2> 9646867 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.SolrZkClient makePath: /security.json
   [junit4]   2> 9646868 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.ZkController Register node as live in 
ZooKeeper:/live_nodes/127.0.0.1:37899_ab_m%2Fwo
   [junit4]   2> 9646868 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.SolrZkClient makePath: /live_nodes/127.0.0.1:37899_ab_m%2Fwo
   [junit4]   2> 9646870 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.SolrZkClient makePath: /overseer_elect
   [junit4]   2> 9646870 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.SolrZkClient makePath: /overseer_elect/election
   [junit4]   2> 9646871 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.Overseer Overseer (id=null) closing
   [junit4]   2> 9646872 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.LeaderElector Joined leadership election with path: 
/overseer_elect/election/95255424840040452-127.0.0.1:37899_ab_m%2Fwo-n_0000000000
   [junit4]   2> 9646873 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.OverseerElectionContext I am going to be the leader 
127.0.0.1:37899_ab_m%2Fwo
   [junit4]   2> 9646873 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.SolrZkClient makePath: /overseer_elect/leader
   [junit4]   2> 9646874 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.Overseer Overseer 
(id=95255424840040452-127.0.0.1:37899_ab_m%2Fwo-n_0000000000) starting
   [junit4]   2> 9646875 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.SolrZkClient makePath: /overseer/queue-work
   [junit4]   2> 9646879 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.OverseerAutoReplicaFailoverThread Starting 
OverseerAutoReplicaFailoverThread autoReplicaFailoverWorkLoopDelay=10000 
autoReplicaFailoverWaitAfterExpiration=10000 
autoReplicaFailoverBadNodeExpiration=60000
   [junit4]   2> 9646894 INFO  
(OverseerStateUpdate-95255424840040452-127.0.0.1:37899_ab_m%2Fwo-n_0000000000) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.Overseer Starting to work on the main 
queue
   [junit4]   2> 9646907 INFO  
(OverseerCollectionProcessor-95255424840040452-127.0.0.1:37899_ab_m%2Fwo-n_0000000000)
 [n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.OverseerCollectionProcessor Process 
current queue of collection creations
   [junit4]   2> 9646922 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.c.ZkStateReader Updating cluster state from ZooKeeper... 
   [junit4]   2> 9646946 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.CoreContainer Security conf doesn't exist. Skipping setup for 
authorization module.
   [junit4]   2> 9646946 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.CoreContainer No authentication plugin used.
   [junit4]   2> 9646947 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.CoresLocator Looking for core definitions underneath 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/control-001/cores
   [junit4]   2> 9646948 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.SolrCore Created CoreDescriptor: 
{instanceDir=/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/control-001/cores/collection1,
 dataDir=data/, configSetProperties=configsetprops.json, 
absoluteInstDir=/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/control-001/cores/collection1/,
 name=collection1, shard=, transient=false, coreNodeName=, 
collection=control_collection, config=solrconfig.xml, loadOnStartup=true, 
schema=schema.xml}
   [junit4]   2> 9646948 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.CoresLocator Found core collection1 in 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/control-001/cores/collection1/
   [junit4]   2> 9646948 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.c.CoresLocator Found 1 core definitions
   [junit4]   2> 9646949 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.c.ZkController publishing state=down
   [junit4]   2> 9646949 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.c.ZkController numShards not found on descriptor - reading it from system 
property
   [junit4]   2> 9646949 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.s.SolrDispatchFilter 
user.dir=/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0
   [junit4]   2> 9646949 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:37899_ab_m%2Fwo 
   ] o.a.s.s.SolrDispatchFilter SolrDispatchFilter.init() done
   [junit4]   2> 9646950 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.c.ZkController look for our core node name
   [junit4]   2> 9646950 INFO  
(zkCallback-2853-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.DistributedQueue NodeChildrenChanged 
fired on path /overseer/queue state SyncConnected
   [junit4]   2> 9646951 INFO  
(OverseerStateUpdate-95255424840040452-127.0.0.1:37899_ab_m%2Fwo-n_0000000000) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.Overseer processMessage: queueSize: 
1, message = {
   [junit4]   2>   "state":"down",
   [junit4]   2>   "shard":null,
   [junit4]   2>   "operation":"state",
   [junit4]   2>   "core":"collection1",
   [junit4]   2>   "numShards":"1",
   [junit4]   2>   "collection":"control_collection",
   [junit4]   2>   "roles":null,
   [junit4]   2>   "node_name":"127.0.0.1:37899_ab_m%2Fwo",
   [junit4]   2>   "base_url":"http://127.0.0.1:37899/ab_m/wo"} current state 
version: 0
   [junit4]   2> 9646951 INFO  
(OverseerStateUpdate-95255424840040452-127.0.0.1:37899_ab_m%2Fwo-n_0000000000) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.o.ReplicaMutator Update state 
numShards=1 message={
   [junit4]   2>   "state":"down",
   [junit4]   2>   "shard":null,
   [junit4]   2>   "operation":"state",
   [junit4]   2>   "core":"collection1",
   [junit4]   2>   "numShards":"1",
   [junit4]   2>   "collection":"control_collection",
   [junit4]   2>   "roles":null,
   [junit4]   2>   "node_name":"127.0.0.1:37899_ab_m%2Fwo",
   [junit4]   2>   "base_url":"http://127.0.0.1:37899/ab_m/wo"}
   [junit4]   2> 9646952 INFO  
(OverseerStateUpdate-95255424840040452-127.0.0.1:37899_ab_m%2Fwo-n_0000000000) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.o.ClusterStateMutator building a new 
cName: control_collection
   [junit4]   2> 9646952 INFO  
(OverseerStateUpdate-95255424840040452-127.0.0.1:37899_ab_m%2Fwo-n_0000000000) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.o.ReplicaMutator Assigning new node 
to shard shard=shard1
   [junit4]   2> 9646953 INFO  
(zkCallback-2853-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.c.ZkStateReader A cluster state 
change: WatchedEvent state:SyncConnected type:NodeDataChanged 
path:/clusterstate.json, has occurred - updating... (live nodes size: 1)
   [junit4]   2> 9646957 INFO  
(zkCallback-2853-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.c.ZkStateReader Updated cluster state 
version to 1
   [junit4]   2> 9647950 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.c.ZkController waiting to find shard id in clusterstate for collection1
   [junit4]   2> 9647950 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.c.ZkController Check for collection zkNode:control_collection
   [junit4]   2> 9647951 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.c.ZkController Collection zkNode exists
   [junit4]   2> 9647951 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.c.c.ZkStateReader Load collection config 
from:/collections/control_collection
   [junit4]   2> 9647952 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.c.c.ZkStateReader path=/collections/control_collection configName=conf1 
specified config exists in ZooKeeper
   [junit4]   2> 9647952 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.c.SolrResourceLoader new SolrResourceLoader for directory: 
'/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/control-001/cores/collection1/'
   [junit4]   2> 9647980 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.c.Config loaded config solrconfig.xml with version 0 
   [junit4]   2> 9647997 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.c.SolrConfig current version of requestparams : -1
   [junit4]   2> 9648019 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.c.SolrConfig Using Lucene MatchVersion: 5.3.2
   [junit4]   2> 9648071 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.c.Config Loaded SolrConfig: solrconfig.xml
   [junit4]   2> 9648080 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.s.IndexSchema Reading Solr Schema from /configs/conf1/schema.xml
   [junit4]   2> 9648087 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.s.IndexSchema [collection1] Schema name=test
   [junit4]   2> 9649051 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.s.OpenExchangeRatesOrgProvider Initialized with 
rates=open-exchange-rates.json, refreshInterval=1440.
   [junit4]   2> 9649131 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.s.IndexSchema default search field in schema is text
   [junit4]   2> 9649228 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.s.IndexSchema unique key field: id
   [junit4]   2> 9649265 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.s.FileExchangeRateProvider Reloading exchange rates from file currency.xml
   [junit4]   2> 9649268 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.s.FileExchangeRateProvider Reloading exchange rates from file currency.xml
   [junit4]   2> 9649285 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.s.OpenExchangeRatesOrgProvider Reloading exchange rates from 
open-exchange-rates.json
   [junit4]   2> 9649286 WARN  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.s.OpenExchangeRatesOrgProvider Unknown key IMPORTANT NOTE
   [junit4]   2> 9649287 WARN  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.s.OpenExchangeRatesOrgProvider Expected key, got STRING
   [junit4]   2> 9649287 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.s.OpenExchangeRatesOrgProvider Reloading exchange rates from 
open-exchange-rates.json
   [junit4]   2> 9649287 WARN  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.s.OpenExchangeRatesOrgProvider Unknown key IMPORTANT NOTE
   [junit4]   2> 9649287 WARN  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.s.OpenExchangeRatesOrgProvider Expected key, got STRING
   [junit4]   2> 9649288 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.c.ConfigSetProperties Did not find ConfigSet properties, assuming default 
properties: Can't find resource 'configsetprops.json' in classpath or 
'/configs/conf1', 
cwd=/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0
   [junit4]   2> 9649289 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection   x:collection1] 
o.a.s.c.CoreContainer Creating SolrCore 'collection1' using configuration from 
collection control_collection
   [junit4]   2> 9649289 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SolrCore org.apache.solr.core.HdfsDirectoryFactory
   [junit4]   2> 9649289 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.HdfsDirectoryFactory 
solr.hdfs.home=hdfs://localhost:43763/solr_hdfs_home
   [junit4]   2> 9649289 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication 
disabled
   [junit4]   2> 9649289 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SolrCore [[collection1] ] Opening new SolrCore at 
[/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/control-001/cores/collection1/],
 dataDir=[null]
   [junit4]   2> 9649290 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.JmxMonitoredMap JMX monitoring is enabled. Adding Solr 
mbeans to JMX Server: com.sun.jmx.mbeanserver.JmxMBeanServer@2c44579
   [junit4]   2> 9649290 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.HdfsDirectoryFactory creating directory factory for path 
hdfs://localhost:43763/solr_hdfs_home/control_collection/core_node1/data
   [junit4]   2> 9649345 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.s.h.HdfsLocalityReporter Registering directory 
hdfs://localhost:43763/solr_hdfs_home/control_collection/core_node1/data for 
locality metrics.
   [junit4]   2> 9649346 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.CachingDirectoryFactory return new directory for 
hdfs://localhost:43763/solr_hdfs_home/control_collection/core_node1/data
   [junit4]   2> 9649347 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SolrCore New index directory detected: old=null 
new=hdfs://localhost:43763/solr_hdfs_home/control_collection/core_node1/data/index/
   [junit4]   2> 9649365 WARN  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SolrCore [collection1] Solr index directory 
'hdfs:/localhost:43763/solr_hdfs_home/control_collection/core_node1/data/index' 
doesn't exist. Creating new index...
   [junit4]   2> 9649365 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.HdfsDirectoryFactory creating directory factory for path 
hdfs://localhost:43763/solr_hdfs_home/control_collection/core_node1/data/index
   [junit4]   2> 9649388 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] 
with direct memory allocation set to [true]
   [junit4]   2> 9649388 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, 
slab size of [8388608] will allocate [1] slabs and use ~[8388608] bytes
   [junit4]   2> 9649388 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS 
BlockCache
   [junit4]   2> 9649476 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 9649476 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.s.h.HdfsLocalityReporter Registering directory 
hdfs://localhost:43763/solr_hdfs_home/control_collection/core_node1/data/index 
for locality metrics.
   [junit4]   2> 9649476 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.CachingDirectoryFactory return new directory for 
hdfs://localhost:43763/solr_hdfs_home/control_collection/core_node1/data/index
   [junit4]   2> 9649477 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class 
org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: 
maxMergeAtOnce=33, maxMergeAtOnceExplicit=43, maxMergedSegmentMB=12.1083984375, 
floorSegmentMB=1.21484375, forceMergeDeletesPctAllowed=6.952317279418578, 
segmentsPerTier=26.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0
   [junit4]   2> 9649566 INFO  (IPC Server handler 4 on 43763) [    ] 
BlockStateChange BLOCK* addStoredBlock: blockMap updated: 127.0.0.1:59143 is 
added to blk_1073741825_1001{blockUCState=UNDER_CONSTRUCTION, 
primaryNodeIndex=-1, 
replicas=[ReplicaUnderConstruction[[DISK]DS-e5a97b2e-64d8-470d-a759-43bbbfc49217:NORMAL:127.0.0.1:49824|RBW],
 
ReplicaUnderConstruction[[DISK]DS-923ed43a-9fca-4afa-bb50-775d2d9b1f76:NORMAL:127.0.0.1:59143|FINALIZED]]}
 size 0
   [junit4]   2> 9649578 INFO  (IPC Server handler 5 on 43763) [    ] 
BlockStateChange BLOCK* addStoredBlock: blockMap updated: 127.0.0.1:49824 is 
added to blk_1073741825_1001{blockUCState=UNDER_CONSTRUCTION, 
primaryNodeIndex=-1, 
replicas=[ReplicaUnderConstruction[[DISK]DS-e5a97b2e-64d8-470d-a759-43bbbfc49217:NORMAL:127.0.0.1:49824|RBW],
 
ReplicaUnderConstruction[[DISK]DS-923ed43a-9fca-4afa-bb50-775d2d9b1f76:NORMAL:127.0.0.1:59143|FINALIZED]]}
 size 0
   [junit4]   2> 9649582 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SolrCore SolrDeletionPolicy.onCommit: commits: num=1
   [junit4]   2>        
commit{dir=NRTCachingDirectory(BlockDirectory(HdfsDirectory@b73ecc7d 
lockFactory=org.apache.lucene.store.SingleInstanceLockFactory@3c1284ec); 
maxCacheMB=192.0 maxMergeSizeMB=16.0),segFN=segments_1,generation=1}
   [junit4]   2> 9649582 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SolrCore newest commit generation = 1
   [junit4]   2> 9649598 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain creating 
updateRequestProcessorChain "nodistrib"
   [junit4]   2> 9649598 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain creating 
updateRequestProcessorChain "dedupe"
   [junit4]   2> 9649598 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain inserting 
DistributedUpdateProcessorFactory into updateRequestProcessorChain "dedupe"
   [junit4]   2> 9649598 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain creating 
updateRequestProcessorChain "stored_sig"
   [junit4]   2> 9649599 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain inserting 
DistributedUpdateProcessorFactory into updateRequestProcessorChain "stored_sig"
   [junit4]   2> 9649599 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain creating 
updateRequestProcessorChain "distrib-dup-test-chain-explicit"
   [junit4]   2> 9649599 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain creating 
updateRequestProcessorChain "distrib-dup-test-chain-implicit"
   [junit4]   2> 9649599 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain inserting 
DistributedUpdateProcessorFactory into updateRequestProcessorChain 
"distrib-dup-test-chain-implicit"
   [junit4]   2> 9649600 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SolrCore no updateRequestProcessorChain defined as 
default, creating implicit default
   [junit4]   2> 9649602 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.h.l.XMLLoader xsltCacheLifetimeSeconds=60
   [junit4]   2> 9649603 INFO  
(OldIndexDirectoryCleanupThreadForCore-collection1) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SolrCore Looking for old index directories to cleanup 
for core collection1 in 
hdfs://localhost:43763/solr_hdfs_home/control_collection/core_node1/data/
   [junit4]   2> 9649605 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.h.l.XMLLoader xsltCacheLifetimeSeconds=60
   [junit4]   2> 9649606 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.h.l.XMLLoader xsltCacheLifetimeSeconds=60
   [junit4]   2> 9649608 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.h.l.XMLLoader xsltCacheLifetimeSeconds=60
   [junit4]   2> 9649651 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.RequestHandlers Registered paths: 
/admin/plugins,/admin/mbeans,/admin/segments,/admin/threads,/admin/luke,/get,/update/json,/update/csv,/replication,/admin/properties,/admin/ping,/admin/file,/schema,standard,/admin/system,/update,/admin/logging,/update/json/docs,/config
   [junit4]   2> 9649653 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SolrCore Using default statsCache cache: 
org.apache.solr.search.stats.LocalStatsCache
   [junit4]   2> 9649659 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.UpdateHandler Using UpdateLog implementation: 
org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 9649660 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.UpdateLog Initializing HdfsUpdateLog: dataDir= 
defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 
tlogDfsReplication=2
   [junit4]   2> 9649716 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SolrCore Hard AutoCommit: disabled
   [junit4]   2> 9649717 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SolrCore Soft AutoCommit: disabled
   [junit4]   2> 9649718 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class 
org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: 
maxMergeAtOnce=38, maxMergeAtOnceExplicit=34, maxMergedSegmentMB=97.5302734375, 
floorSegmentMB=0.2021484375, forceMergeDeletesPctAllowed=29.43237093662322, 
segmentsPerTier=19.0, maxCFSSegmentSizeMB=8.796093022207999E12, 
noCFSRatio=0.7948610523962821
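
(Aside, not part of the log: the TieredMergePolicy values above are randomized by the test's RandomMergePolicy wrapper. A minimal sketch, assuming only Lucene's public TieredMergePolicy setters, of how a policy with exactly the logged values could be built directly; the class name MergePolicySketch is invented for illustration.)

import org.apache.lucene.index.TieredMergePolicy;

// Minimal sketch: configures a TieredMergePolicy with the randomized values
// that RandomMergePolicy reports wrapping in the log entry above.
public class MergePolicySketch {
    static TieredMergePolicy fromLoggedValues() {
        TieredMergePolicy tmp = new TieredMergePolicy();
        tmp.setMaxMergeAtOnce(38);
        tmp.setMaxMergeAtOnceExplicit(34);
        tmp.setMaxMergedSegmentMB(97.5302734375);
        tmp.setFloorSegmentMB(0.2021484375);
        tmp.setForceMergeDeletesPctAllowed(29.43237093662322);
        tmp.setSegmentsPerTier(19.0);
        tmp.setMaxCFSSegmentSizeMB(8.796093022207999E12);
        tmp.setNoCFSRatio(0.7948610523962821);
        return tmp;
    }
}
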
   [junit4]   2> 9649742 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SolrCore SolrDeletionPolicy.onInit: commits: num=1
   [junit4]   2>        
commit{dir=NRTCachingDirectory(BlockDirectory(HdfsDirectory@b73ecc7d 
lockFactory=org.apache.lucene.store.SingleInstanceLockFactory@3c1284ec); 
maxCacheMB=192.0 maxMergeSizeMB=16.0),segFN=segments_1,generation=1}
   [junit4]   2> 9649742 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SolrCore newest commit generation = 1
   [junit4]   2> 9649743 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.s.SolrIndexSearcher Opening Searcher@41a99670[collection1] 
main
   [junit4]   2> 9649743 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.c.ZkStateReader Load collection config 
from:/collections/control_collection
   [junit4]   2> 9649744 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.c.ZkStateReader path=/collections/control_collection 
configName=conf1 specified config exists in ZooKeeper
   [junit4]   2> 9649744 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.r.ManagedResourceStorage Setting up ZooKeeper-based 
storage for the RestManager with znodeBase: /configs/conf1
   [junit4]   2> 9649744 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO 
with znodeBase: /configs/conf1
   [junit4]   2> 9649744 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.r.RestManager Initializing RestManager with initArgs: {}
   [junit4]   2> 9649745 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.r.ManagedResourceStorage Reading _rest_managed.json using 
ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 9649745 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.r.ManagedResourceStorage No data found for znode 
/configs/conf1/_rest_managed.json
   [junit4]   2> 9649745 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.r.ManagedResourceStorage Loaded null at path 
_rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 9649745 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.r.RestManager Initializing 0 registered ManagedResources
   [junit4]   2> 9649745 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 9649747 INFO  
(searcherExecutor-10815-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SolrCore [collection1] Registered new searcher 
Searcher@41a99670[collection1] 
main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 9649748 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.UpdateLog Looking up max value of version field to seed 
version buckets
   [junit4]   2> 9649749 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.VersionInfo Refreshing highest value of _version_ for 
256 version buckets from index
   [junit4]   2> 9649749 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.VersionInfo No terms found for _version_, cannot seed 
version bucket highest value from index
   [junit4]   2> 9649749 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.UpdateLog Could not find max version in index or recent 
updates, using new clock 1524086801641242624
   [junit4]   2> 9649749 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.u.UpdateLog Took 0 ms to seed version buckets with highest 
version 1524086801641242624
   [junit4]   2> 9649749 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.ZkController watch zkdir /configs/conf1
   [junit4]   2> 9649750 INFO  
(coreLoadExecutor-10814-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.CoreContainer registering core: collection1
   [junit4]   2> 9649852 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.ZkController Register replica - core:collection1 
address:http://127.0.0.1:37899/ab_m/wo collection:control_collection 
shard:shard1
   [junit4]   2> 9649859 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.c.SolrZkClient makePath: 
/collections/control_collection/leader_elect/shard1/election
   [junit4]   2> 9649863 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.LeaderElector Joined leadership election with path: 
/collections/control_collection/leader_elect/shard1/election/95255424840040452-core_node1-n_0000000000
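
(Aside, not part of the log: the election path just logged, .../leader_elect/shard1/election/<session>-core_node1-n_0000000000, is ZooKeeper's standard ephemeral-sequential election recipe. A minimal, hypothetical sketch of that recipe follows; it is not Solr's o.a.s.c.LeaderElector, and ElectionSketch/joinAndCheck are invented names.)

import java.util.Comparator;
import java.util.List;
import org.apache.zookeeper.CreateMode;
import org.apache.zookeeper.KeeperException;
import org.apache.zookeeper.ZooDefs;
import org.apache.zookeeper.ZooKeeper;

// Hypothetical sketch of the ephemeral-sequential election pattern visible in
// the znode path above; watch setup and re-election are omitted.
public class ElectionSketch {
    /** Joins the election and reports whether this participant currently leads. */
    static boolean joinAndCheck(ZooKeeper zk, String electionRoot, String myId)
            throws KeeperException, InterruptedException {
        // Each participant registers an ephemeral, sequential znode.
        String myNode = zk.create(electionRoot + "/" + myId + "-n_",
                new byte[0], ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.EPHEMERAL_SEQUENTIAL);
        // The lowest sequence number wins; the others would watch the node
        // immediately ahead of them to take over if it goes away.
        List<String> children = zk.getChildren(electionRoot, false);
        Comparator<String> bySequence =
                Comparator.comparing(s -> s.substring(s.lastIndexOf('_') + 1));
        children.sort(bySequence);
        return myNode.endsWith("/" + children.get(0));
    }
}
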
   [junit4]   2> 9649864 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.ShardLeaderElectionContext Running the leader process 
for shard shard1
   [junit4]   2> 9649865 INFO  
(zkCallback-2853-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.DistributedQueue NodeChildrenChanged 
fired on path /overseer/queue state SyncConnected
   [junit4]   2> 9649866 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.ShardLeaderElectionContext Enough replicas found to 
continue.
   [junit4]   2> 9649866 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try 
and sync
   [junit4]   2> 9649866 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SyncStrategy Sync replicas to 
http://127.0.0.1:37899/ab_m/wo/collection1/
   [junit4]   2> 9649866 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 9649866 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SyncStrategy http://127.0.0.1:37899/ab_m/wo/collection1/ 
has no replicas
   [junit4]   2> 9649867 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.ShardLeaderElectionContext I am the new leader: 
http://127.0.0.1:37899/ab_m/wo/collection1/ shard1
   [junit4]   2> 9649867 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.c.SolrZkClient makePath: 
/collections/control_collection/leaders/shard1
   [junit4]   2> 9649867 INFO  
(OverseerStateUpdate-95255424840040452-127.0.0.1:37899_ab_m%2Fwo-n_0000000000) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.Overseer processMessage: queueSize: 
1, message = {
   [junit4]   2>   "operation":"leader",
   [junit4]   2>   "shard":"shard1",
   [junit4]   2>   "collection":"control_collection"} current state version: 1
   [junit4]   2> 9649868 INFO  
(zkCallback-2853-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.c.ZkStateReader A cluster state 
change: WatchedEvent state:SyncConnected type:NodeDataChanged 
path:/clusterstate.json, has occurred - updating... (live nodes size: 1)
   [junit4]   2> 9649870 INFO  
(zkCallback-2853-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.c.ZkStateReader Updated cluster state 
version to 2
   [junit4]   2> 9649872 INFO  
(OverseerStateUpdate-95255424840040452-127.0.0.1:37899_ab_m%2Fwo-n_0000000000) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.Overseer processMessage: queueSize: 
1, message = {
   [junit4]   2>   "operation":"leader",
   [junit4]   2>   "shard":"shard1",
   [junit4]   2>   "collection":"control_collection",
   [junit4]   2>   "base_url":"http://127.0.0.1:37899/ab_m/wo";,
   [junit4]   2>   "core":"collection1",
   [junit4]   2>   "state":"active"} current state version: 2
   [junit4]   2> 9649873 INFO  
(zkCallback-2853-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.DistributedQueue NodeChildrenChanged 
fired on path /overseer/queue state SyncConnected
   [junit4]   2> 9649874 INFO  
(zkCallback-2853-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.c.ZkStateReader A cluster state 
change: WatchedEvent state:SyncConnected type:NodeDataChanged 
path:/clusterstate.json, has occurred - updating... (live nodes size: 1)
   [junit4]   2> 9649875 INFO  
(zkCallback-2853-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.c.ZkStateReader Updated cluster state 
version to 3
   [junit4]   2> 9649922 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.ZkController We are 
http://127.0.0.1:37899/ab_m/wo/collection1/ and leader is 
http://127.0.0.1:37899/ab_m/wo/collection1/
   [junit4]   2> 9649922 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.ZkController No LogReplay needed for core=collection1 
baseURL=http://127.0.0.1:37899/ab_m/wo
   [junit4]   2> 9649923 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 9649923 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.ZkController publishing state=active
   [junit4]   2> 9649923 INFO  
(coreZkRegister-10808-thread-1-processing-s:shard1 x:collection1 
c:control_collection r:core_node1 n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.ZkController numShards not found on descriptor - reading 
it from system property
   [junit4]   2> 9649924 INFO  
(zkCallback-2853-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.DistributedQueue NodeChildrenChanged 
fired on path /overseer/queue state SyncConnected
   [junit4]   2> 9649925 INFO  
(OverseerStateUpdate-95255424840040452-127.0.0.1:37899_ab_m%2Fwo-n_0000000000) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.Overseer processMessage: queueSize: 
1, message = {
   [junit4]   2>   "state":"active",
   [junit4]   2>   "shard":"shard1",
   [junit4]   2>   "core_node_name":"core_node1",
   [junit4]   2>   "operation":"state",
   [junit4]   2>   "core":"collection1",
   [junit4]   2>   "numShards":"1",
   [junit4]   2>   "collection":"control_collection",
   [junit4]   2>   "roles":null,
   [junit4]   2>   "node_name":"127.0.0.1:37899_ab_m%2Fwo",
   [junit4]   2>   "base_url":"http://127.0.0.1:37899/ab_m/wo"} current state 
version: 3
   [junit4]   2> 9649926 INFO  
(OverseerStateUpdate-95255424840040452-127.0.0.1:37899_ab_m%2Fwo-n_0000000000) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.o.ReplicaMutator Update state 
numShards=1 message={
   [junit4]   2>   "state":"active",
   [junit4]   2>   "shard":"shard1",
   [junit4]   2>   "core_node_name":"core_node1",
   [junit4]   2>   "operation":"state",
   [junit4]   2>   "core":"collection1",
   [junit4]   2>   "numShards":"1",
   [junit4]   2>   "collection":"control_collection",
   [junit4]   2>   "roles":null,
   [junit4]   2>   "node_name":"127.0.0.1:37899_ab_m%2Fwo",
   [junit4]   2>   "base_url":"http://127.0.0.1:37899/ab_m/wo"}
   [junit4]   2> 9649952 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient Using default ZkCredentialsProvider
   [junit4]   2> 9649983 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 9649999 INFO  (zkCallback-2855-thread-1) [    ] 
o.a.s.c.c.ConnectionManager Watcher 
org.apache.solr.common.cloud.ConnectionManager@10468d4d 
name:ZooKeeperConnection Watcher:127.0.0.1:60619/solr got event WatchedEvent 
state:SyncConnected type:None path:null path:null type:None
   [junit4]   2> 9650003 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 9650004 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient Using default ZkACLProvider
   [junit4]   2> 9650004 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.ZkStateReader Updating cluster state from ZooKeeper... 
   [junit4]   2> 9650006 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.ChaosMonkey 
monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 9650028 INFO  (zkCallback-2855-thread-1) [    ] 
o.a.s.c.c.ZkStateReader A cluster state change: WatchedEvent 
state:SyncConnected type:NodeDataChanged path:/clusterstate.json, has occurred 
- updating... (live nodes size: 1)
   [junit4]   2> 9650028 INFO  
(zkCallback-2853-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.c.ZkStateReader A cluster state 
change: WatchedEvent state:SyncConnected type:NodeDataChanged 
path:/clusterstate.json, has occurred - updating... (live nodes size: 1)
   [junit4]   2> 9650030 INFO  
(zkCallback-2853-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.c.ZkStateReader Updated cluster state 
version to 4
   [junit4]   2> 9650037 INFO  (zkCallback-2855-thread-1) [    ] 
o.a.s.c.c.ZkStateReader Updated cluster state version to 4
   [junit4]   2> 9650365 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.SolrTestCaseJ4 
Writing core.properties file to 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/shard-1-001/cores/collection1
   [junit4]   2> 9650376 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/shard-1-001
   [junit4]   2> 9650387 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.e.j.s.Server 
jetty-9.2.11.v20150529
   [junit4]   2> 9650464 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.e.j.s.h.ContextHandler Started 
o.e.j.s.ServletContextHandler@b0176c6{/ab_m/wo,null,AVAILABLE}
   [junit4]   2> 9650464 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.e.j.s.ServerConnector Started 
ServerConnector@d1cc6cf{HTTP/1.1}{127.0.0.1:48419}
   [junit4]   2> 9650464 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.e.j.s.Server 
Started @9653621ms
   [junit4]   2> 9650465 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.s.e.JettySolrRunner Jetty properties: 
{solr.data.dir=hdfs://localhost:43763/hdfs__localhost_43763__x1_jenkins_jenkins-slave_workspace_Lucene-Solr-NightlyTests-5.3_solr_build_solr-core_test_J0_temp_solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001_tempDir-002_jetty1,
 hostContext=/ab_m/wo, solrconfig=solrconfig.xml, hostPort=48419, 
coreRootDirectory=/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/shard-1-001/cores}
   [junit4]   2> 9650465 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.s.SolrDispatchFilter SolrDispatchFilter.init(): 
sun.misc.Launcher$AppClassLoader@7b3cb2c6
   [junit4]   2> 9650465 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.SolrResourceLoader new SolrResourceLoader for directory: 
'/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/shard-1-001/'
   [junit4]   2> 9650501 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient Using default ZkCredentialsProvider
   [junit4]   2> 9650524 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 9650542 INFO  (zkCallback-2856-thread-1) [    ] 
o.a.s.c.c.ConnectionManager Watcher 
org.apache.solr.common.cloud.ConnectionManager@730c916c 
name:ZooKeeperConnection Watcher:127.0.0.1:60619/solr got event WatchedEvent 
state:SyncConnected type:None path:null path:null type:None
   [junit4]   2> 9650542 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 9650543 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient Using default ZkACLProvider
   [junit4]   2> 9650544 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in 
ZooKeeper)
   [junit4]   2> 9650544 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.SolrXmlConfig 
Loading container configuration from 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/shard-1-001/solr.xml
   [junit4]   2> 9650558 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.CoresLocator 
Config-defined core root directory: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/shard-1-001/cores
   [junit4]   2> 9650559 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.CoreContainer 
New CoreContainer 693354047
   [junit4]   2> 9650559 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.CoreContainer 
Loading cores into CoreContainer 
[instanceDir=/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/shard-1-001/]
   [junit4]   2> 9650559 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.CoreContainer 
loading shared library: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/shard-1-001/lib
   [junit4]   2> 9650559 WARN  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.SolrResourceLoader Can't find (or read) directory to add to 
classloader: lib (resolved as: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/shard-1-001/lib).
   [junit4]   2> 9650579 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.h.c.HttpShardHandlerFactory created with socketTimeout : 90000,urlScheme 
: ,connTimeout : 15000,maxConnectionsPerHost : 20,maxConnections : 
10000,corePoolSize : 0,maximumPoolSize : 2147483647,maxThreadIdleTime : 
5,sizeOfQueue : -1,fairnessPolicy : false,useRetries : false,
   [junit4]   2> 9650580 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.u.UpdateShardHandler Creating UpdateShardHandler HTTP client with params: 
socketTimeout=340000&connTimeout=45000&retry=true
   [junit4]   2> 9650581 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.l.LogWatcher 
SLF4J impl is org.slf4j.impl.Log4jLoggerFactory
   [junit4]   2> 9650581 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.l.LogWatcher 
Registering Log Listener [Log4j (org.slf4j.impl.Log4jLoggerFactory)]
   [junit4]   2> 9650581 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.ZkContainer 
Zookeeper client=127.0.0.1:60619/solr
   [junit4]   2> 9650582 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] o.a.s.c.ZkController 
zkHost includes chroot
   [junit4]   2> 9650582 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient Using default ZkCredentialsProvider
   [junit4]   2> 9650595 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 9650635 INFO  (zkCallback-2858-thread-1) [    ] 
o.a.s.c.c.ConnectionManager Watcher 
org.apache.solr.common.cloud.ConnectionManager@c61d75f name:ZooKeeperConnection 
Watcher:127.0.0.1:60619 got event WatchedEvent state:SyncConnected type:None 
path:null path:null type:None
   [junit4]   2> 9650636 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 9650637 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [    ] 
o.a.s.c.c.SolrZkClient Using default ZkACLProvider
   [junit4]   2> 9650667 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 9650683 INFO  
(zkCallback-2859-thread-1-processing-n:127.0.0.1:48419_ab_m%2Fwo) 
[n:127.0.0.1:48419_ab_m%2Fwo    ] o.a.s.c.c.ConnectionManager Watcher 
org.apache.solr.common.cloud.ConnectionManager@4a36ba70 
name:ZooKeeperConnection Watcher:127.0.0.1:60619/solr got event WatchedEvent 
state:SyncConnected type:None path:null path:null type:None
   [junit4]   2> 9650684 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 9650687 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.c.c.ZkStateReader Updating cluster state from ZooKeeper... 
   [junit4]   2> 9651691 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.c.ZkController Register node as live in 
ZooKeeper:/live_nodes/127.0.0.1:48419_ab_m%2Fwo
   [junit4]   2> 9651691 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.c.c.SolrZkClient makePath: /live_nodes/127.0.0.1:48419_ab_m%2Fwo
   [junit4]   2> 9651695 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.c.Overseer Overseer (id=null) closing
   [junit4]   2> 9651695 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.c.LeaderElector Joined leadership election with path: 
/overseer_elect/election/95255424840040456-127.0.0.1:48419_ab_m%2Fwo-n_0000000001
   [junit4]   2> 9651696 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.c.LeaderElector Watching path 
/overseer_elect/election/95255424840040452-127.0.0.1:37899_ab_m%2Fwo-n_0000000000
 to know if I could be the leader
   [junit4]   2> 9651733 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.c.CoreContainer Security conf doesn't exist. Skipping setup for 
authorization module.
   [junit4]   2> 9651733 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.c.CoreContainer No authentication plugin used.
   [junit4]   2> 9651739 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.c.CoresLocator Looking for core definitions underneath 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/shard-1-001/cores
   [junit4]   2> 9651739 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.c.SolrCore Created CoreDescriptor: {shard=, config=solrconfig.xml, 
schema=schema.xml, transient=false, 
absoluteInstDir=/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/shard-1-001/cores/collection1/,
 loadOnStartup=true, coreNodeName=, dataDir=data/, 
configSetProperties=configsetprops.json, 
instanceDir=/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/shard-1-001/cores/collection1,
 name=collection1, collection=collection1}
   [junit4]   2> 9651740 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.c.CoresLocator Found core collection1 in 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.StressHdfsTest_7141961A233E85D7-001/shard-1-001/cores/collection1/
   [junit4]   2> 9651740 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.c.CoresLocator Found 1 core definitions
   [junit4]   2> 9651741 INFO  
(coreLoadExecutor-10825-thread-1-processing-n:127.0.0.1:48419_ab_m%2Fwo) 
[n:127.0.0.1:48419_ab_m%2Fwo c:collection1   x:collection1] 
o.a.s.c.ZkController publishing state=down
   [junit4]   2> 9651741 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.s.SolrDispatchFilter 
user.dir=/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/J0
   [junit4]   2> 9651741 INFO  
(TEST-StressHdfsTest.test-seed#[7141961A233E85D7]) [n:127.0.0.1:48419_ab_m%2Fwo 
   ] o.a.s.s.SolrDispatchFilter SolrDispatchFilter.init() done
   [junit4]   2> 9651741 INFO  
(coreLoadExecutor-10825-thread-1-processing-n:127.0.0.1:48419_ab_m%2Fwo) 
[n:127.0.0.1:48419_ab_m%2Fwo c:collection1   x:collection1] 
o.a.s.c.ZkController numShards not found on descriptor - reading it from system 
property
   [junit4]   2> 9651742 INFO  
(zkCallback-2853-thread-1-processing-n:127.0.0.1:37899_ab_m%2Fwo) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.DistributedQueue NodeChildrenChanged 
fired on path /overseer/queue state SyncConnected
   [junit4]   2> 9651742 INFO  
(coreLoadExecutor-10825-thread-1-processing-n:127.0.0.1:48419_ab_m%2Fwo) 
[n:127.0.0.1:48419_ab_m%2Fwo c:collection1   x:collection1] 
o.a.s.c.ZkController look for our core node name
   [junit4]   2> 9651743 INFO  
(OverseerStateUpdate-95255424840040452-127.0.0.1:37899_ab_m%2Fwo-n_0000000000) 
[n:127.0.0.1:37899_ab_m%2Fwo    ] o.a.s.c.Overseer processMessage: queueSize: 
1, message = {
   [junit4]   2>   "state":"down",
   [junit4]   2>  

[...truncated too long message...]

ar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/test-lib/commons-math3-3.4.1.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/test-lib/easymock-3.0.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/test-lib/ehcache-core-2.4.4.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/test-lib/hadoop-common-2.6.0-tests.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/test-lib/hadoop-hdfs-2.6.0-tests.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/test-lib/hadoop-minikdc-2.6.0.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/test-lib/jackson-annotations-2.5.4.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/test-lib/jackson-databind-2.5.4.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/test-lib/jcl-over-slf4j-1.7.7.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/test-lib/jersey-core-1.9.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/test-lib/jersey-server-1.9.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/test-lib/jetty-6.1.26.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/test-lib/jetty-util-6.1.26.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/test-lib/mina-core-2.0.0-M5.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/core/test-lib/objenesis-1.2.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/lucene/build/analysis/icu/lucene-analyzers-icu-5.3.2-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/contrib/solr-analysis-extras/classes/java:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/contrib/analysis-extras/lib/icu4j-54.1.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-launcher.jar:/x1/jenkins/.ant/lib/ivy-2.3.0.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-antlr.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-apache-regexp.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-apache-resolver.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-commons-net.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-jmf.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-apache-oro.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-apache-xalan2.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-commons-logging.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-jsch.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-junit4.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-javamail.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-jai.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-jdepend.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-junit.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/a
nt-1.8.2/lib/ant-apache-bsf.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-swing.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-netrexx.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-apache-bcel.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-testutil.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Ant_AntInstallation/ant-1.8.2/lib/ant-apache-log4j.jar:/x1/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.7/lib/tools.jar:/x1/jenkins/.ivy2/cache/com.carrotsearch.randomizedtesting/junit4-ant/jars/junit4-ant-2.1.13.jar
 com.carrotsearch.ant.tasks.junit4.slave.SlaveMainSafe -eventsfile 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/temp/junit4-J2-20160122_142625_686.events
 
@/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build/solr-core/test/temp/junit4-J2-20160122_142625_686.suites
   [junit4] ERROR: JVM J2 ended with an exception: Quit event not received from 
the forked process? This may indicate JVM crash or runner bugs.
   [junit4]     at 
com.carrotsearch.ant.tasks.junit4.JUnit4.executeSlave(JUnit4.java:1504)
   [junit4]     at 
com.carrotsearch.ant.tasks.junit4.JUnit4.access$000(JUnit4.java:133)
   [junit4]     at 
com.carrotsearch.ant.tasks.junit4.JUnit4$2.call(JUnit4.java:964)
   [junit4]     at 
com.carrotsearch.ant.tasks.junit4.JUnit4$2.call(JUnit4.java:961)
   [junit4]     at java.util.concurrent.FutureTask.run(FutureTask.java:262)
   [junit4]     at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
   [junit4]     at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
   [junit4]     at java.lang.Thread.run(Thread.java:745)

BUILD FAILED
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/build.xml:733: 
The following error occurred while executing this line:
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/build.xml:670: 
The following error occurred while executing this line:
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/build.xml:59: 
The following error occurred while executing this line:
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/build.xml:230:
 The following error occurred while executing this line:
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/solr/common-build.xml:524:
 The following error occurred while executing this line:
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/lucene/common-build.xml:1449:
 The following error occurred while executing this line:
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3/lucene/common-build.xml:1003:
 At least one slave process threw an exception, first: Quit event not received 
from the forked process? This may indicate JVM crash or runner bugs.

Total time: 376 minutes 18 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
No prior successful build to compare, so performing full copy of artifacts
ERROR: Failed to archive {README.txt=README.txt, 
solr/build/solr-core/test/temp/junit4-J2-20160122_142625_686.events=solr/build/solr-core/test/temp/junit4-J2-20160122_142625_686.events,
 heapdumps/java_pid2447.hprof=heapdumps/java_pid2447.hprof, 
solr/build/solr-core/test/temp/junit4-J1-20160122_142625_681.events=solr/build/solr-core/test/temp/junit4-J1-20160122_142625_681.events,
 
solr/build/solr-core/test/temp/junit4-J0-20160122_142625_681.events=solr/build/solr-core/test/temp/junit4-J0-20160122_142625_681.events}
 due to internal error; falling back to full archiving
java.lang.IllegalArgumentException: Negative time
        at java.io.File.setLastModified(File.java:1421)
        at hudson.FilePath.readFromTar(FilePath.java:2289)
        at hudson.FilePath.copyRecursiveTo(FilePath.java:2208)
        at 
jenkins.model.StandardArtifactManager.archive(StandardArtifactManager.java:61)
        at 
com.cloudbees.jenkins.plugins.jsync.archiver.JSyncArtifactManager.remoteSync(JSyncArtifactManager.java:111)
        at 
com.cloudbees.jenkins.plugins.jsync.archiver.JSyncArtifactManager.archive(JSyncArtifactManager.java:72)
        at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:219)
        at 
hudson.tasks.BuildStepCompatibilityLayer.perform(BuildStepCompatibilityLayer.java:74)
        at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:776)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:723)
        at hudson.model.Build$BuildExecution.post2(Build.java:183)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:670)
        at hudson.model.Run.execute(Run.java:1763)
        at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
        at hudson.model.ResourceController.execute(ResourceController.java:98)
        at hudson.model.Executor.run(Executor.java:381)
ERROR: Build step failed with exception
java.lang.IllegalArgumentException: Negative time
        at java.io.File.setLastModified(File.java:1421)
        at hudson.FilePath.readFromTar(FilePath.java:2289)
        at hudson.FilePath.copyRecursiveTo(FilePath.java:2208)
        at 
jenkins.model.StandardArtifactManager.archive(StandardArtifactManager.java:61)
        at 
com.cloudbees.jenkins.plugins.jsync.archiver.JSyncArtifactManager.archive(JSyncArtifactManager.java:76)
        at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:219)
        at 
hudson.tasks.BuildStepCompatibilityLayer.perform(BuildStepCompatibilityLayer.java:74)
        at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:776)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:723)
        at hudson.model.Build$BuildExecution.post2(Build.java:183)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:670)
        at hudson.model.Run.execute(Run.java:1763)
        at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
        at hudson.model.ResourceController.execute(ResourceController.java:98)
        at hudson.model.Executor.run(Executor.java:381)
Build step 'Archive the artifacts' marked build as failure
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
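
(Note on the archiving failure reported twice above: java.io.File.setLastModified is specified to throw IllegalArgumentException for a negative argument, and the JDK's message is exactly "Negative time", which matches hudson.FilePath.readFromTar applying a tar entry's apparently pre-epoch timestamp. A minimal sketch reproducing just that exception, assuming nothing about Jenkins itself; NegativeTimeSketch is an invented name.)

import java.io.File;
import java.io.IOException;

// Minimal sketch reproducing the "Negative time" IllegalArgumentException seen
// in hudson.FilePath.readFromTar; this only demonstrates the java.io.File
// contract, not Jenkins' archiving code.
public class NegativeTimeSketch {
    public static void main(String[] args) throws IOException {
        File f = File.createTempFile("tar-entry", ".tmp");
        f.deleteOnExit();
        try {
            // A tar entry carrying a negative modification time would lead here.
            f.setLastModified(-1L);
        } catch (IllegalArgumentException e) {
            System.out.println("Reproduced: " + e.getMessage()); // prints "Reproduced: Negative time"
        }
    }
}
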

