Build: http://jenkins.thetaphi.de/job/Lucene-Solr-trunk-Linux/14401/
Java: 32bit/jdk1.9.0-ea-b78 -client -XX:+UseConcMarkSweepGC
4 tests failed.
FAILED: junit.framework.TestSuite.org.apache.solr.cloud.SaslZkACLProviderTest
Error Message:
5 threads leaked from SUITE scope at org.apache.solr.cloud.SaslZkACLProviderTest:
   1) Thread[id=11306, name=apacheds, state=WAITING, group=TGRP-SaslZkACLProviderTest] at java.lang.Object.wait(Native Method) at java.lang.Object.wait(Object.java:516) at java.util.TimerThread.mainLoop(Timer.java:526) at java.util.TimerThread.run(Timer.java:505)
   2) Thread[id=11308, name=kdcReplayCache.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest] at sun.misc.Unsafe.park(Native Method) at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809) at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:746)
   3) Thread[id=11310, name=groupCache.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest] at sun.misc.Unsafe.park(Native Method) at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809) at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:746)
   4) Thread[id=11309, name=ou=system.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest] at sun.misc.Unsafe.park(Native Method) at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809) at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:746)
   5) Thread[id=11307, name=changePwdReplayCache.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest] at sun.misc.Unsafe.park(Native Method) at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809) at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:746)
Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: 5 threads leaked from SUITE scope at org.apache.solr.cloud.SaslZkACLProviderTest:
   1) Thread[id=11306, name=apacheds, state=WAITING, group=TGRP-SaslZkACLProviderTest]
        at java.lang.Object.wait(Native Method)
        at java.lang.Object.wait(Object.java:516)
        at java.util.TimerThread.mainLoop(Timer.java:526)
        at java.util.TimerThread.run(Timer.java:505)
   2) Thread[id=11308, name=kdcReplayCache.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest]
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
        at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:746)
   3) Thread[id=11310, name=groupCache.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest]
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
        at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:746)
   4) Thread[id=11309, name=ou=system.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest]
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
        at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:746)
   5) Thread[id=11307, name=changePwdReplayCache.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest]
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
        at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:746)
        at __randomizedtesting.SeedInfo.seed([E02AB809BB9D0705]:0)
FAILED: junit.framework.TestSuite.org.apache.solr.cloud.SaslZkACLProviderTest
Error Message:
There are still zombie threads that couldn't be terminated:
   1) Thread[id=11306, name=apacheds, state=WAITING, group=TGRP-SaslZkACLProviderTest] at java.lang.Object.wait(Native Method) at java.lang.Object.wait(Object.java:516) at java.util.TimerThread.mainLoop(Timer.java:526) at java.util.TimerThread.run(Timer.java:505)
   2) Thread[id=11308, name=kdcReplayCache.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest] at sun.misc.Unsafe.park(Native Method) at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809) at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:746)
   3) Thread[id=11310, name=groupCache.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest] at sun.misc.Unsafe.park(Native Method) at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809) at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:746)
   4) Thread[id=11309, name=ou=system.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest] at sun.misc.Unsafe.park(Native Method) at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809) at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:746)
   5) Thread[id=11307, name=changePwdReplayCache.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest] at sun.misc.Unsafe.park(Native Method) at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215) at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093) at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809) at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:746)
Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie threads that couldn't be terminated:
   1) Thread[id=11306, name=apacheds, state=WAITING, group=TGRP-SaslZkACLProviderTest]
        at java.lang.Object.wait(Native Method)
        at java.lang.Object.wait(Object.java:516)
        at java.util.TimerThread.mainLoop(Timer.java:526)
        at java.util.TimerThread.run(Timer.java:505)
   2) Thread[id=11308, name=kdcReplayCache.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest]
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
        at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:746)
   3) Thread[id=11310, name=groupCache.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest]
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
        at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:746)
   4) Thread[id=11309, name=ou=system.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest]
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
        at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:746)
   5) Thread[id=11307, name=changePwdReplayCache.data, state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest]
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
        at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
        at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:746)
        at __randomizedtesting.SeedInfo.seed([E02AB809BB9D0705]:0)
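Note: the two suite-level failures above are raised by the randomizedtesting thread-leak checker that appears in the trace (com.carrotsearch.randomizedtesting.ThreadLeakControl) when the ApacheDS/MiniKdc background threads (the apacheds timer and the *.data cache schedulers) outlive the suite. As a purely illustrative sketch, assuming the standard randomizedtesting annotations and not describing the fix actually applied in Solr, a suite can declare such known background threads acceptable roughly like this:

  import com.carrotsearch.randomizedtesting.RandomizedRunner;
  import com.carrotsearch.randomizedtesting.ThreadFilter;
  import com.carrotsearch.randomizedtesting.annotations.ThreadLeakFilters;
  import com.carrotsearch.randomizedtesting.annotations.ThreadLeakLingering;
  import org.junit.runner.RunWith;

  // Illustrative only: tolerate the ApacheDS/MiniKdc background threads listed
  // above instead of failing the suite with a ThreadLeakError.
  @RunWith(RandomizedRunner.class)
  @ThreadLeakLingering(linger = 10000) // give straggler threads up to 10s to exit
  @ThreadLeakFilters(defaultFilters = true, filters = { ExampleKdcSuite.ApacheDsThreadsFilter.class })
  public class ExampleKdcSuite {
    /** Returns true for the known ApacheDS threads so the leak checker ignores them. */
    public static class ApacheDsThreadsFilter implements ThreadFilter {
      @Override
      public boolean reject(Thread t) {
        String name = t.getName();
        return name.equals("apacheds")
            || name.endsWith("ReplayCache.data") // kdcReplayCache.data, changePwdReplayCache.data
            || name.equals("groupCache.data")
            || name.equals("ou=system.data");
      }
    }
  }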
FAILED: org.apache.solr.cloud.SaslZkACLProviderTest.testSaslZkACLProvider
Error Message:
org.apache.directory.api.ldap.model.exception.LdapOtherException: org.apache.directory.api.ldap.model.exception.LdapOtherException: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04473_NOT_VALID_VALUE Not a valid value '20090818022733Z' for the AttributeType 'ATTRIBUTE_TYPE ( 1.3.6.1.4.1.18060.0.4.1.2.35 NAME 'schemaModifyTimestamp' DESC time which schema was modified SUP modifyTimestamp EQUALITY generalizedTimeMatch ORDERING generalizedTimeOrderingMatch SYNTAX 1.3.6.1.4.1.1466.115.121.1.24 USAGE directoryOperation ) '
Stack Trace:
java.lang.RuntimeException: org.apache.directory.api.ldap.model.exception.LdapOtherException: org.apache.directory.api.ldap.model.exception.LdapOtherException: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04473_NOT_VALID_VALUE Not a valid value '20090818022733Z' for the AttributeType 'ATTRIBUTE_TYPE ( 1.3.6.1.4.1.18060.0.4.1.2.35
 NAME 'schemaModifyTimestamp'
 DESC time which schema was modified
 SUP modifyTimestamp
 EQUALITY generalizedTimeMatch
 ORDERING generalizedTimeOrderingMatch
 SYNTAX 1.3.6.1.4.1.1466.115.121.1.24
 USAGE directoryOperation
 )
'
        at org.apache.solr.cloud.SaslZkACLProviderTest$SaslZkTestServer.run(SaslZkACLProviderTest.java:211)
        at org.apache.solr.cloud.SaslZkACLProviderTest.setUp(SaslZkACLProviderTest.java:81)
        at sun.reflect.GeneratedMethodAccessor40.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:504)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1665)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:898)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:914)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
        at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
        at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:809)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:460)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:873)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:775)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:809)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:820)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
        at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
        at java.lang.Thread.run(Thread.java:746)
Caused by: org.apache.directory.api.ldap.model.exception.LdapOtherException: org.apache.directory.api.ldap.model.exception.LdapOtherException: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04473_NOT_VALID_VALUE Not a valid value '20090818022733Z' for the AttributeType 'ATTRIBUTE_TYPE ( 1.3.6.1.4.1.18060.0.4.1.2.35
 NAME 'schemaModifyTimestamp'
 DESC time which schema was modified
 SUP modifyTimestamp
 EQUALITY generalizedTimeMatch
 ORDERING generalizedTimeOrderingMatch
 SYNTAX 1.3.6.1.4.1.1466.115.121.1.24
 USAGE directoryOperation
 )
'
        at org.apache.directory.server.core.api.partition.AbstractPartition.initialize(AbstractPartition.java:84)
        at org.apache.directory.server.core.DefaultDirectoryService.initialize(DefaultDirectoryService.java:1808)
        at org.apache.directory.server.core.DefaultDirectoryService.startup(DefaultDirectoryService.java:1248)
        at org.apache.hadoop.minikdc.MiniKdc.initDirectoryService(MiniKdc.java:383)
        at org.apache.hadoop.minikdc.MiniKdc.start(MiniKdc.java:319)
        at org.apache.solr.cloud.SaslZkACLProviderTest$SaslZkTestServer.run(SaslZkACLProviderTest.java:204)
        ... 38 more
Caused by: java.lang.RuntimeException: org.apache.directory.api.ldap.model.exception.LdapOtherException: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04473_NOT_VALID_VALUE Not a valid value '20090818022733Z' for the AttributeType 'ATTRIBUTE_TYPE ( 1.3.6.1.4.1.18060.0.4.1.2.35
 NAME 'schemaModifyTimestamp'
 DESC time which schema was modified
 SUP modifyTimestamp
 EQUALITY generalizedTimeMatch
 ORDERING generalizedTimeOrderingMatch
 SYNTAX 1.3.6.1.4.1.1466.115.121.1.24
 USAGE directoryOperation
 )
'
        at org.apache.directory.server.core.api.schema.SchemaPartition.doInit(SchemaPartition.java:226)
        at org.apache.directory.server.core.api.partition.AbstractPartition.initialize(AbstractPartition.java:79)
        ... 43 more
Caused by: org.apache.directory.api.ldap.model.exception.LdapOtherException: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04473_NOT_VALID_VALUE Not a valid value '20090818022733Z' for the AttributeType 'ATTRIBUTE_TYPE ( 1.3.6.1.4.1.18060.0.4.1.2.35
 NAME 'schemaModifyTimestamp'
 DESC time which schema was modified
 SUP modifyTimestamp
 EQUALITY generalizedTimeMatch
 ORDERING generalizedTimeOrderingMatch
 SYNTAX 1.3.6.1.4.1.1466.115.121.1.24
 USAGE directoryOperation
 )
'
        at org.apache.directory.server.core.api.partition.AbstractPartition.initialize(AbstractPartition.java:84)
        at org.apache.directory.server.core.api.schema.SchemaPartition.doInit(SchemaPartition.java:219)
        ... 44 more
Caused by: org.apache.directory.api.ldap.model.exception.LdapInvalidAttributeValueException: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04473_NOT_VALID_VALUE Not a valid value '20090818022733Z' for the AttributeType 'ATTRIBUTE_TYPE ( 1.3.6.1.4.1.18060.0.4.1.2.35
 NAME 'schemaModifyTimestamp'
 DESC time which schema was modified
 SUP modifyTimestamp
 EQUALITY generalizedTimeMatch
 ORDERING generalizedTimeOrderingMatch
 SYNTAX 1.3.6.1.4.1.1466.115.121.1.24
 USAGE directoryOperation
 )
'
        at org.apache.directory.api.ldap.model.entry.AbstractValue.apply(AbstractValue.java:211)
        at org.apache.directory.api.ldap.model.entry.StringValue.<init>(StringValue.java:107)
        at org.apache.directory.api.ldap.model.entry.DefaultAttribute.<init>(DefaultAttribute.java:468)
        at org.apache.directory.api.ldap.model.entry.DefaultEntry.<init>(DefaultEntry.java:315)
        at org.apache.directory.server.core.partition.ldif.LdifPartition.loadEntries(LdifPartition.java:517)
        at org.apache.directory.server.core.partition.ldif.LdifPartition.loadEntries(LdifPartition.java:549)
        at org.apache.directory.server.core.partition.ldif.LdifPartition.doInit(LdifPartition.java:164)
        at org.apache.directory.server.core.api.partition.AbstractPartition.initialize(AbstractPartition.java:79)
        ... 45 more
Caused by: org.apache.directory.api.ldap.model.exception.LdapInvalidAttributeValueException: ERR_04473_NOT_VALID_VALUE Not a valid value '20090818022733Z' for the AttributeType 'ATTRIBUTE_TYPE ( 1.3.6.1.4.1.18060.0.4.1.2.35
 NAME 'schemaModifyTimestamp'
 DESC time which schema was modified
 SUP modifyTimestamp
 EQUALITY generalizedTimeMatch
 ORDERING generalizedTimeOrderingMatch
 SYNTAX 1.3.6.1.4.1.1466.115.121.1.24
 USAGE directoryOperation
 )
'
        at org.apache.directory.api.ldap.model.entry.AbstractValue.apply(AbstractValue.java:204)
        ... 52 more
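Note: the rejected value '20090818022733Z' has the ordinary LDAP GeneralizedTime shape (yyyyMMddHHmmss followed by 'Z' for UTC) that the generalizedTimeMatch rule named in the schema definition normally accepts, which points at a normalization problem during ApacheDS schema loading rather than at malformed data. A small, hypothetical sketch (plain java.time, not ApacheDS code, pattern chosen here for illustration) showing that a value of this shape parses cleanly:

  import java.time.OffsetDateTime;
  import java.time.format.DateTimeFormatter;

  // Hypothetical check: the value from the error above is a well-formed
  // GeneralizedTime, i.e. year/month/day/hour/minute/second plus a trailing 'Z'.
  public class GeneralizedTimeExample {
    private static final DateTimeFormatter GENERALIZED_TIME =
        DateTimeFormatter.ofPattern("uuuuMMddHHmmssX");

    public static void main(String[] args) {
      OffsetDateTime t = OffsetDateTime.parse("20090818022733Z", GENERALIZED_TIME);
      System.out.println(t); // prints 2009-08-18T02:27:33Z
    }
  }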
FAILED: org.apache.solr.cloud.ShardSplitTest.test
Error Message:
Wrong doc count on shard1_0. See SOLR-5309 expected:<310> but was:<309>
Stack Trace:
java.lang.AssertionError: Wrong doc count on shard1_0. See SOLR-5309 expected:<310> but was:<309>
        at __randomizedtesting.SeedInfo.seed([E02AB809BB9D0705:687E87D315616AFD]:0)
        at org.junit.Assert.fail(Assert.java:93)
        at org.junit.Assert.failNotEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:128)
        at org.junit.Assert.assertEquals(Assert.java:472)
        at org.apache.solr.cloud.ShardSplitTest.checkDocCountsAndShardStates(ShardSplitTest.java:433)
        at org.apache.solr.cloud.ShardSplitTest.splitByUniqueKeyTest(ShardSplitTest.java:215)
        at org.apache.solr.cloud.ShardSplitTest.test(ShardSplitTest.java:77)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:504)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1665)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:864)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:900)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:914)
        at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:963)
        at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:938)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
        at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
        at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:809)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:460)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:873)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:775)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:809)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:820)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
        at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
        at java.lang.Thread.run(Thread.java:746)
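Note: the "expected:<310> but was:<309>" wording is JUnit's standard mismatch format produced by org.junit.Assert.assertEquals, which is visible in the frames above. A minimal, self-contained sketch (hypothetical names, not the actual ShardSplitTest code) that reproduces an identical message:

  import static org.junit.Assert.assertEquals;

  // Hypothetical sketch: assertEquals(message, expected, actual) throws
  // java.lang.AssertionError: <message> expected:<310> but was:<309>
  public class DocCountAssertionExample {
    static void checkDocCount(long expectedDocs, long actualDocs) {
      assertEquals("Wrong doc count on shard1_0. See SOLR-5309", expectedDocs, actualDocs);
    }

    public static void main(String[] args) {
      checkDocCount(310, 309); // fails with the same message shape as the report above
    }
  }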
Build Log:
[...truncated 9714 lines...]
[junit4] Suite: org.apache.solr.cloud.ShardSplitTest
[junit4] 2> Creating dataDir:
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/init-core-data-001
[junit4] 2> 259917 INFO
(SUITE-ShardSplitTest-seed#[E02AB809BB9D0705]-worker) [ ]
o.a.s.BaseDistributedSearchTestCase Setting hostContext system property:
/_lph/su
[junit4] 2> 259919 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ] o.a.s.c.ZkTestServer
STARTING ZK TEST SERVER
[junit4] 2> 259919 INFO (Thread-772) [ ] o.a.s.c.ZkTestServer client
port:0.0.0.0/0.0.0.0:0
[junit4] 2> 259919 INFO (Thread-772) [ ] o.a.s.c.ZkTestServer Starting
server
[junit4] 2> 260019 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ] o.a.s.c.ZkTestServer
start zk server on port:38732
[junit4] 2> 260019 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient Using default ZkCredentialsProvider
[junit4] 2> 260019 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
[junit4] 2> 260021 INFO (zkCallback-359-thread-1) [ ]
o.a.s.c.c.ConnectionManager Watcher
org.apache.solr.common.cloud.ConnectionManager@d92d15 name:ZooKeeperConnection
Watcher:127.0.0.1:38732 got event WatchedEvent state:SyncConnected type:None
path:null path:null type:None
[junit4] 2> 260021 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
[junit4] 2> 260023 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient Using default ZkACLProvider
[junit4] 2> 260023 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath: /solr
[junit4] 2> 260025 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient Using default ZkCredentialsProvider
[junit4] 2> 260025 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
[junit4] 2> 260026 INFO (zkCallback-360-thread-1) [ ]
o.a.s.c.c.ConnectionManager Watcher
org.apache.solr.common.cloud.ConnectionManager@69677d name:ZooKeeperConnection
Watcher:127.0.0.1:38732/solr got event WatchedEvent state:SyncConnected
type:None path:null path:null type:None
[junit4] 2> 260026 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
[junit4] 2> 260026 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient Using default ZkACLProvider
[junit4] 2> 260026 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath: /collections/collection1
[junit4] 2> 260028 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath: /collections/collection1/shards
[junit4] 2> 260030 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath: /collections/control_collection
[junit4] 2> 260031 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath: /collections/control_collection/shards
[junit4] 2> 260033 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.AbstractZkTestCase put
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml
to /configs/conf1/solrconfig.xml
[junit4] 2> 260033 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/solrconfig.xml
[junit4] 2> 260035 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.AbstractZkTestCase put
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/core/src/test-files/solr/collection1/conf/schema15.xml
to /configs/conf1/schema.xml
[junit4] 2> 260035 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/schema.xml
[junit4] 2> 260037 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.AbstractZkTestCase put
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml
to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
[junit4] 2> 260037 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath:
/configs/conf1/solrconfig.snippet.randomindexconfig.xml
[junit4] 2> 260038 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.AbstractZkTestCase put
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/core/src/test-files/solr/collection1/conf/stopwords.txt
to /configs/conf1/stopwords.txt
[junit4] 2> 260038 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/stopwords.txt
[junit4] 2> 260039 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.AbstractZkTestCase put
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/core/src/test-files/solr/collection1/conf/protwords.txt
to /configs/conf1/protwords.txt
[junit4] 2> 260040 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/protwords.txt
[junit4] 2> 260041 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.AbstractZkTestCase put
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/core/src/test-files/solr/collection1/conf/currency.xml
to /configs/conf1/currency.xml
[junit4] 2> 260041 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/currency.xml
[junit4] 2> 260042 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.AbstractZkTestCase put
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml
to /configs/conf1/enumsConfig.xml
[junit4] 2> 260043 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/enumsConfig.xml
[junit4] 2> 260044 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.AbstractZkTestCase put
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json
to /configs/conf1/open-exchange-rates.json
[junit4] 2> 260044 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/open-exchange-rates.json
[junit4] 2> 260045 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.AbstractZkTestCase put
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt
to /configs/conf1/mapping-ISOLatin1Accent.txt
[junit4] 2> 260046 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/mapping-ISOLatin1Accent.txt
[junit4] 2> 260047 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.AbstractZkTestCase put
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt
to /configs/conf1/old_synonyms.txt
[junit4] 2> 260047 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/old_synonyms.txt
[junit4] 2> 260048 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.AbstractZkTestCase put
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/core/src/test-files/solr/collection1/conf/synonyms.txt
to /configs/conf1/synonyms.txt
[junit4] 2> 260048 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient makePath: /configs/conf1/synonyms.txt
[junit4] 2> 260176 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ] o.a.s.SolrTestCaseJ4
Writing core.properties file to
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/cores/collection1
[junit4] 2> 260178 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ] o.e.j.s.Server
jetty-9.2.13.v20150730
[junit4] 2> 260179 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.e.j.s.h.ContextHandler Started
o.e.j.s.ServletContextHandler@1741eea{/_lph/su,null,AVAILABLE}
[junit4] 2> 260181 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.e.j.s.ServerConnector Started
ServerConnector@d6b59f{HTTP/1.1}{127.0.0.1:43203}
[junit4] 2> 260181 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ] o.e.j.s.Server
Started @261682ms
[junit4] 2> 260181 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.s.e.JettySolrRunner Jetty properties:
{solr.data.dir=/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/tempDir-001/control/data,
hostContext=/_lph/su, hostPort=43203,
coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/cores}
[junit4] 2> 260181 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.s.SolrDispatchFilter SolrDispatchFilter.init():
sun.misc.Launcher$AppClassLoader@530c12
[junit4] 2> 260181 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.SolrResourceLoader new SolrResourceLoader for directory:
'/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/'
[junit4] 2> 260207 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient Using default ZkCredentialsProvider
[junit4] 2> 260207 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
[junit4] 2> 260210 INFO (zkCallback-361-thread-1) [ ]
o.a.s.c.c.ConnectionManager Watcher
org.apache.solr.common.cloud.ConnectionManager@10f3bf7 name:ZooKeeperConnection
Watcher:127.0.0.1:38732/solr got event WatchedEvent state:SyncConnected
type:None path:null path:null type:None
[junit4] 2> 260210 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
[junit4] 2> 260211 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient Using default ZkACLProvider
[junit4] 2> 260212 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in
ZooKeeper)
[junit4] 2> 260212 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ] o.a.s.c.SolrXmlConfig
Loading container configuration from
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/solr.xml
[junit4] 2> 260229 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ] o.a.s.c.CoresLocator
Config-defined core root directory:
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/cores
[junit4] 2> 260229 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ] o.a.s.c.CoreContainer
New CoreContainer 18429870
[junit4] 2> 260229 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ] o.a.s.c.CoreContainer
Loading cores into CoreContainer
[instanceDir=/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/]
[junit4] 2> 260229 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ] o.a.s.c.CoreContainer
loading shared library:
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/lib
[junit4] 2> 260230 WARN
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.SolrResourceLoader Can't find (or read) directory to add to
classloader: lib (resolved as:
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/lib).
[junit4] 2> 260238 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.h.c.HttpShardHandlerFactory created with socketTimeout : 90000,urlScheme
: ,connTimeout : 15000,maxConnectionsPerHost : 20,maxConnections :
10000,corePoolSize : 0,maximumPoolSize : 2147483647,maxThreadIdleTime :
5,sizeOfQueue : -1,fairnessPolicy : false,useRetries : false,
[junit4] 2> 260241 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.u.UpdateShardHandler Creating UpdateShardHandler HTTP client with params:
socketTimeout=340000&connTimeout=45000&retry=true
[junit4] 2> 260242 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ] o.a.s.l.LogWatcher
SLF4J impl is org.slf4j.impl.Log4jLoggerFactory
[junit4] 2> 260242 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ] o.a.s.l.LogWatcher
Registering Log Listener [Log4j (org.slf4j.impl.Log4jLoggerFactory)]
[junit4] 2> 260242 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ] o.a.s.c.ZkContainer
Zookeeper client=127.0.0.1:38732/solr
[junit4] 2> 260242 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ] o.a.s.c.ZkController
zkHost includes chroot
[junit4] 2> 260242 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient Using default ZkCredentialsProvider
[junit4] 2> 260242 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
[junit4] 2> 260244 INFO (zkCallback-363-thread-1) [ ]
o.a.s.c.c.ConnectionManager Watcher
org.apache.solr.common.cloud.ConnectionManager@2b9042 name:ZooKeeperConnection
Watcher:127.0.0.1:38732 got event WatchedEvent state:SyncConnected type:None
path:null path:null type:None
[junit4] 2> 260244 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
[junit4] 2> 260244 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient Using default ZkACLProvider
[junit4] 2> 260246 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
[junit4] 2> 260247 INFO
(zkCallback-364-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu ] o.a.s.c.c.ConnectionManager Watcher
org.apache.solr.common.cloud.ConnectionManager@daa3d1 name:ZooKeeperConnection
Watcher:127.0.0.1:38732/solr got event WatchedEvent state:SyncConnected
type:None path:null path:null type:None
[junit4] 2> 260247 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
[junit4] 2> 260248 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.SolrZkClient makePath: /overseer/queue
[junit4] 2> 260250 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.SolrZkClient makePath: /overseer/collection-queue-work
[junit4] 2> 260253 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.SolrZkClient makePath: /overseer/collection-map-running
[junit4] 2> 260255 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.SolrZkClient makePath: /overseer/collection-map-completed
[junit4] 2> 260256 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.SolrZkClient makePath: /overseer/collection-map-failure
[junit4] 2> 260258 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.SolrZkClient makePath: /live_nodes
[junit4] 2> 260259 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.SolrZkClient makePath: /aliases.json
[junit4] 2> 260260 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.SolrZkClient makePath: /clusterstate.json
[junit4] 2> 260261 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.SolrZkClient makePath: /security.json
[junit4] 2> 260262 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.ZkController Register node as live in
ZooKeeper:/live_nodes/127.0.0.1:43203__lph%2Fsu
[junit4] 2> 260262 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.SolrZkClient makePath: /live_nodes/127.0.0.1:43203__lph%2Fsu
[junit4] 2> 260263 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.SolrZkClient makePath: /overseer_elect
[junit4] 2> 260264 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.SolrZkClient makePath: /overseer_elect/election
[junit4] 2> 260265 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.Overseer Overseer (id=null) closing
[junit4] 2> 260266 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.LeaderElector Joined leadership election with path:
/overseer_elect/election/94629558195716100-127.0.0.1:43203__lph%2Fsu-n_0000000000
[junit4] 2> 260266 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.OverseerElectionContext I am going to be the leader
127.0.0.1:43203__lph%2Fsu
[junit4] 2> 260266 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.SolrZkClient makePath: /overseer_elect/leader
[junit4] 2> 260267 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.Overseer Overseer
(id=94629558195716100-127.0.0.1:43203__lph%2Fsu-n_0000000000) starting
[junit4] 2> 260273 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.SolrZkClient makePath: /overseer/queue-work
[junit4] 2> 260279 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.OverseerAutoReplicaFailoverThread Starting
OverseerAutoReplicaFailoverThread autoReplicaFailoverWorkLoopDelay=10000
autoReplicaFailoverWaitAfterExpiration=30000
autoReplicaFailoverBadNodeExpiration=60000
[junit4] 2> 260279 INFO
(OverseerCollectionConfigSetProcessor-94629558195716100-127.0.0.1:43203__lph%2Fsu-n_0000000000)
[n:127.0.0.1:43203__lph%2Fsu ] o.a.s.c.OverseerTaskProcessor Process
current queue of overseer operations
[junit4] 2> 260279 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.c.ZkStateReader Updating cluster state from ZooKeeper...
[junit4] 2> 260280 INFO
(OverseerStateUpdate-94629558195716100-127.0.0.1:43203__lph%2Fsu-n_0000000000)
[n:127.0.0.1:43203__lph%2Fsu ] o.a.s.c.Overseer Starting to work on the main
queue
[junit4] 2> 260334 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.CoreContainer Security conf doesn't exist. Skipping setup for
authorization module.
[junit4] 2> 260334 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.CoreContainer No authentication plugin used.
[junit4] 2> 260334 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.CoresLocator Looking for core definitions underneath
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/cores
[junit4] 2> 260335 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.SolrCore Created CoreDescriptor: {name=collection1,
config=solrconfig.xml, transient=false, schema=schema.xml, loadOnStartup=true,
configSetProperties=configsetprops.json,
instanceDir=/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/cores/collection1,
collection=control_collection,
absoluteInstDir=/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/cores/collection1/,
coreNodeName=, dataDir=data/, shard=}
[junit4] 2> 260335 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.CoresLocator Found core collection1 in
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/cores/collection1/
[junit4] 2> 260335 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.c.CoresLocator Found 1 core definitions
[junit4] 2> 260336 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.s.SolrDispatchFilter
user.dir=/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1
[junit4] 2> 260336 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [n:127.0.0.1:43203__lph%2Fsu
] o.a.s.s.SolrDispatchFilter SolrDispatchFilter.init() done
[junit4] 2> 260336 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.c.ZkController publishing state=down
[junit4] 2> 260336 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.c.ZkController numShards not found on descriptor - reading it from system
property
[junit4] 2> 260338 INFO
(OverseerStateUpdate-94629558195716100-127.0.0.1:43203__lph%2Fsu-n_0000000000)
[n:127.0.0.1:43203__lph%2Fsu ] o.a.s.c.Overseer processMessage: queueSize:
1, message = {
[junit4] 2> "core":"collection1",
[junit4] 2> "roles":null,
[junit4] 2> "base_url":"http://127.0.0.1:43203/_lph/su",
[junit4] 2> "node_name":"127.0.0.1:43203__lph%2Fsu",
[junit4] 2> "numShards":"1",
[junit4] 2> "state":"down",
[junit4] 2> "shard":null,
[junit4] 2> "collection":"control_collection",
[junit4] 2> "operation":"state"} current state version: 0
[junit4] 2> 260338 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.c.ZkController look for our core node name
[junit4] 2> 260338 INFO
(OverseerStateUpdate-94629558195716100-127.0.0.1:43203__lph%2Fsu-n_0000000000)
[n:127.0.0.1:43203__lph%2Fsu ] o.a.s.c.o.ReplicaMutator Update state
numShards=1 message={
[junit4] 2> "core":"collection1",
[junit4] 2> "roles":null,
[junit4] 2> "base_url":"http://127.0.0.1:43203/_lph/su",
[junit4] 2> "node_name":"127.0.0.1:43203__lph%2Fsu",
[junit4] 2> "numShards":"1",
[junit4] 2> "state":"down",
[junit4] 2> "shard":null,
[junit4] 2> "collection":"control_collection",
[junit4] 2> "operation":"state"}
[junit4] 2> 260338 INFO
(OverseerStateUpdate-94629558195716100-127.0.0.1:43203__lph%2Fsu-n_0000000000)
[n:127.0.0.1:43203__lph%2Fsu ] o.a.s.c.o.ClusterStateMutator building a new
cName: control_collection
[junit4] 2> 260338 INFO
(OverseerStateUpdate-94629558195716100-127.0.0.1:43203__lph%2Fsu-n_0000000000)
[n:127.0.0.1:43203__lph%2Fsu ] o.a.s.c.o.ReplicaMutator Assigning new node
to shard shard=shard1
[junit4] 2> 260339 INFO
(zkCallback-364-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu ] o.a.s.c.c.ZkStateReader A cluster state
change: WatchedEvent state:SyncConnected type:NodeDataChanged
path:/clusterstate.json, has occurred - updating... (live nodes size: 1)
[junit4] 2> 261338 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.c.ZkController waiting to find shard id in clusterstate for collection1
[junit4] 2> 261338 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.c.ZkController Check for collection zkNode:control_collection
[junit4] 2> 261339 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.c.ZkController Collection zkNode exists
[junit4] 2> 261339 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.c.c.ZkStateReader Load collection config
from:/collections/control_collection
[junit4] 2> 261339 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.c.c.ZkStateReader path=/collections/control_collection configName=conf1
specified config exists in ZooKeeper
[junit4] 2> 261339 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.c.SolrResourceLoader new SolrResourceLoader for directory:
'/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/cores/collection1/'
[junit4] 2> 261352 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.c.Config loaded config solrconfig.xml with version 0
[junit4] 2> 261360 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.c.SolrConfig current version of requestparams : -1
[junit4] 2> 261373 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.c.SolrConfig Using Lucene MatchVersion: 6.0.0
[junit4] 2> 261389 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.c.Config Loaded SolrConfig: solrconfig.xml
[junit4] 2> 261390 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.s.IndexSchema Reading Solr Schema from /configs/conf1/schema.xml
[junit4] 2> 261414 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.s.IndexSchema [collection1] Schema name=test
[junit4] 2> 261550 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.s.IndexSchema default search field in schema is text
[junit4] 2> 261551 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.s.IndexSchema unique key field: id
[junit4] 2> 261554 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.s.FileExchangeRateProvider Reloading exchange rates from file currency.xml
[junit4] 2> 261557 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.s.FileExchangeRateProvider Reloading exchange rates from file currency.xml
[junit4] 2> 261577 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.c.ConfigSetProperties Did not find ConfigSet properties, assuming default
properties: Can't find resource 'configsetprops.json' in classpath or
'/configs/conf1',
cwd=/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1
[junit4] 2> 261577 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection x:collection1]
o.a.s.c.CoreContainer Creating SolrCore 'collection1' using configuration from
collection control_collection
[junit4] 2> 261577 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SolrCore org.apache.solr.core.MockDirectoryFactory
[junit4] 2> 261577 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SolrCore [[collection1] ] Opening new SolrCore at
[/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/cores/collection1/],
dataDir=[null]
[junit4] 2> 261577 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.JmxMonitoredMap JMX monitoring is enabled. Adding Solr
mbeans to JMX Server: com.sun.jmx.mbeanserver.JmxMBeanServer@1d3444
[junit4] 2> 261578 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.CachingDirectoryFactory return new directory for
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/cores/collection1/data
[junit4] 2> 261578 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SolrCore New index directory detected: old=null
new=/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/cores/collection1/data/index/
[junit4] 2> 261578 WARN
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SolrCore [collection1] Solr index directory
'/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/cores/collection1/data/index'
doesn't exist. Creating new index...
[junit4] 2> 261578 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.CachingDirectoryFactory return new directory for
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/cores/collection1/data/index
[junit4] 2> 261579 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class
org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy:
maxMergeAtOnce=50, maxMergeAtOnceExplicit=39, maxMergedSegmentMB=51.208984375,
floorSegmentMB=1.8095703125, forceMergeDeletesPctAllowed=10.129753892129271,
segmentsPerTier=48.0, maxCFSSegmentSizeMB=8.796093022207999E12,
noCFSRatio=0.17595387922754668
[junit4] 2> 261579 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SolrCore SolrDeletionPolicy.onCommit: commits: num=1
[junit4] 2> commit{dir=MockDirectoryWrapper(RAMDirectory@e82f11
lockFactory=org.apache.lucene.store.SingleInstanceLockFactory@18d080e),segFN=segments_1,generation=1}
[junit4] 2> 261579 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SolrCore newest commit generation = 1
[junit4] 2> 261579 INFO
(OldIndexDirectoryCleanupThreadForCore-collection1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SolrCore Looking for old index directories to cleanup
for core collection1 in
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/cores/collection1/data/
[junit4] 2> 261580 WARN
(OldIndexDirectoryCleanupThreadForCore-collection1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.DirectoryFactory
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.ShardSplitTest_E02AB809BB9D0705-001/control-001/cores/collection1/data/
does not point to a valid data directory; skipping clean-up of old index
directories.
[junit4] 2> 261582 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain creating
updateRequestProcessorChain "nodistrib"
[junit4] 2> 261582 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain creating
updateRequestProcessorChain "dedupe"
[junit4] 2> 261582 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain inserting
DistributedUpdateProcessorFactory into updateRequestProcessorChain "dedupe"
[junit4] 2> 261582 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain creating
updateRequestProcessorChain "stored_sig"
[junit4] 2> 261582 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain inserting
DistributedUpdateProcessorFactory into updateRequestProcessorChain "stored_sig"
[junit4] 2> 261582 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain creating
updateRequestProcessorChain "distrib-dup-test-chain-explicit"
[junit4] 2> 261583 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain creating
updateRequestProcessorChain "distrib-dup-test-chain-implicit"
[junit4] 2> 261583 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.p.UpdateRequestProcessorChain inserting
DistributedUpdateProcessorFactory into updateRequestProcessorChain
"distrib-dup-test-chain-implicit"
[junit4] 2> 261583 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SolrCore no updateRequestProcessorChain defined as
default, creating implicit default
[junit4] 2> 261584 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.h.l.XMLLoader xsltCacheLifetimeSeconds=60
[junit4] 2> 261587 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.h.l.XMLLoader xsltCacheLifetimeSeconds=60
[junit4] 2> 261587 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.h.l.XMLLoader xsltCacheLifetimeSeconds=60
[junit4] 2> 261588 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.h.l.XMLLoader xsltCacheLifetimeSeconds=60
[junit4] 2> 261591 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.RequestHandlers Registered paths:
/admin/mbeans,standard,/update/csv,/update/json/docs,/admin/luke,/admin/segments,/get,/admin/system,/replication,/admin/properties,/config,/schema,/admin/plugins,/admin/logging,/update/json,/admin/threads,/admin/ping,/update,/admin/file
[junit4] 2> 261592 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SolrCore Using default statsCache cache:
org.apache.solr.search.stats.LocalStatsCache
[junit4] 2> 261592 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.UpdateHandler Using UpdateLog implementation:
org.apache.solr.update.UpdateLog
[junit4] 2> 261592 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=
defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10
numVersionBuckets=65536
[junit4] 2> 261594 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SolrCore Hard AutoCommit: disabled
[junit4] 2> 261594 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SolrCore Soft AutoCommit: disabled
[junit4] 2> 261594 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class
org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy:
maxMergeAtOnce=48, maxMergeAtOnceExplicit=33, maxMergedSegmentMB=55.201171875,
floorSegmentMB=0.3291015625, forceMergeDeletesPctAllowed=8.333188483138116,
segmentsPerTier=39.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0
[junit4] 2> 261595 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SolrCore SolrDeletionPolicy.onInit: commits: num=1
[junit4] 2> commit{dir=MockDirectoryWrapper(RAMDirectory@e82f11
lockFactory=org.apache.lucene.store.SingleInstanceLockFactory@18d080e),segFN=segments_1,generation=1}
[junit4] 2> 261595 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SolrCore newest commit generation = 1
[junit4] 2> 261595 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.s.SolrIndexSearcher Opening Searcher@4e4829[collection1]
main
[junit4] 2> 261596 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.c.ZkStateReader Load collection config
from:/collections/control_collection
[junit4] 2> 261596 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.c.ZkStateReader path=/collections/control_collection
configName=conf1 specified config exists in ZooKeeper
[junit4] 2> 261596 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.r.ManagedResourceStorage Setting up ZooKeeper-based
storage for the RestManager with znodeBase: /configs/conf1
[junit4] 2> 261597 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO
with znodeBase: /configs/conf1
[junit4] 2> 261597 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.r.RestManager Initializing RestManager with initArgs: {}
[junit4] 2> 261597 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.r.ManagedResourceStorage Reading _rest_managed.json using
ZooKeeperStorageIO:path=/configs/conf1
[junit4] 2> 261597 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.r.ManagedResourceStorage No data found for znode
/configs/conf1/_rest_managed.json
[junit4] 2> 261597 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.r.ManagedResourceStorage Loaded null at path
_rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
[junit4] 2> 261597 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.r.RestManager Initializing 0 registered ManagedResources
[junit4] 2> 261598 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.h.ReplicationHandler Commits will be reserved for 10000
[junit4] 2> 261598 INFO
(searcherExecutor-823-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SolrCore [collection1] Registered new searcher
Searcher@4e4829[collection1]
main{ExitableDirectoryReader(UninvertingDirectoryReader())}
[junit4] 2> 261598 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.UpdateLog Looking up max value of version field to seed
version buckets
[junit4] 2> 261598 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.VersionInfo Refreshing highest value of _version_ for
65536 version buckets from index
[junit4] 2> 261598 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.VersionInfo No terms found for _version_, cannot seed
version bucket highest value from index
[junit4] 2> 261598 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.UpdateLog Could not find max version in index or recent
updates, using new clock 1514072932890968064
[junit4] 2> 261600 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.u.UpdateLog Took 2.0ms to seed version buckets with
highest version 1514072932890968064
[junit4] 2> 261600 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.ZkController watch zkdir /configs/conf1
[junit4] 2> 261601 INFO
(coreLoadExecutor-822-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.CoreContainer registering core: collection1
[junit4] 2> 261601 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.ZkController Register replica - core:collection1
address:http://127.0.0.1:43203/_lph/su collection:control_collection
shard:shard1
[junit4] 2> 261602 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.c.SolrZkClient makePath:
/collections/control_collection/leader_elect/shard1/election
[junit4] 2> 261606 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.LeaderElector Joined leadership election with path:
/collections/control_collection/leader_elect/shard1/election/94629558195716100-core_node1-n_0000000000
[junit4] 2> 261606 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.ShardLeaderElectionContext Running the leader process
for shard shard1
[junit4] 2> 261608 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.ShardLeaderElectionContext Enough replicas found to
continue.
[junit4] 2> 261608 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try
and sync
[junit4] 2> 261608 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SyncStrategy Sync replicas to
http://127.0.0.1:43203/_lph/su/collection1/
[junit4] 2> 261608 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
[junit4] 2> 261608 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.SyncStrategy http://127.0.0.1:43203/_lph/su/collection1/
has no replicas
[junit4] 2> 261608 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.ShardLeaderElectionContext I am the new leader:
http://127.0.0.1:43203/_lph/su/collection1/ shard1
[junit4] 2> 261608 INFO
(OverseerStateUpdate-94629558195716100-127.0.0.1:43203__lph%2Fsu-n_0000000000)
[n:127.0.0.1:43203__lph%2Fsu ] o.a.s.c.Overseer processMessage: queueSize:
1, message = {
[junit4] 2> "operation":"leader",
[junit4] 2> "shard":"shard1",
[junit4] 2> "collection":"control_collection"} current state version: 1
[junit4] 2> 261608 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.c.SolrZkClient makePath:
/collections/control_collection/leaders/shard1
[junit4] 2> 261610 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.ShardLeaderElectionContextBase Creating leader
registration node
[junit4] 2> 261612 INFO
(OverseerStateUpdate-94629558195716100-127.0.0.1:43203__lph%2Fsu-n_0000000000)
[n:127.0.0.1:43203__lph%2Fsu ] o.a.s.c.Overseer processMessage: queueSize:
1, message = {
[junit4] 2> "operation":"leader",
[junit4] 2> "shard":"shard1",
[junit4] 2> "collection":"control_collection",
[junit4] 2> "base_url":"http://127.0.0.1:43203/_lph/su",
[junit4] 2> "core":"collection1",
[junit4] 2> "state":"active"} current state version: 1
[junit4] 2> 261715 INFO
(zkCallback-364-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu)
[n:127.0.0.1:43203__lph%2Fsu ] o.a.s.c.c.ZkStateReader A cluster state
change: WatchedEvent state:SyncConnected type:NodeDataChanged
path:/clusterstate.json, has occurred - updating... (live nodes size: 1)
[junit4] 2> 261763 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.ZkController We are
http://127.0.0.1:43203/_lph/su/collection1/ and leader is
http://127.0.0.1:43203/_lph/su/collection1/
[junit4] 2> 261763 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.ZkController No LogReplay needed for core=collection1
baseURL=http://127.0.0.1:43203/_lph/su
[junit4] 2> 261763 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.ZkController I am the leader, no recovery necessary
[junit4] 2> 261763 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.ZkController publishing state=active
[junit4] 2> 261763 INFO
(coreZkRegister-816-thread-1-processing-n:127.0.0.1:43203__lph%2Fsu
x:collection1 s:shard1 c:control_collection r:core_node1)
[n:127.0.0.1:43203__lph%2Fsu c:control_collection s:shard1 r:core_node1
x:collection1] o.a.s.c.ZkController numShards not found on descriptor - reading
it from system property
[junit4] 2> 261764 INFO
(OverseerStateUpdate-94629558195716100-127.0.0.1:43203__lph%2Fsu-n_0000000000)
[n:127.0.0.1:43203__lph%2Fsu ] o.a.s.c.Overseer processMessage: queueSize:
1, message = {
[junit4] 2> "core":"collection1",
[junit4] 2> "core_node_name":"core_node1",
[junit4] 2> "roles":null,
[junit4] 2> "base_url":"http://127.0.0.1:43203/_lph/su",
[junit4] 2> "node_name":"127.0.0.1:43203__lph%2Fsu",
[junit4] 2> "numShards":"1",
[junit4] 2> "state":"active",
[junit4] 2> "shard":"shard1",
[junit4] 2> "collection":"control_collection",
[junit4] 2> "operation":"state"} current state version: 2
[junit4] 2> 261765 INFO
(OverseerStateUpdate-94629558195716100-127.0.0.1:43203__lph%2Fsu-n_0000000000)
[n:127.0.0.1:43203__lph%2Fsu ] o.a.s.c.o.ReplicaMutator Update state
numShards=1 message={
[junit4] 2> "core":"collection1",
[junit4] 2> "core_node_name":"core_node1",
[junit4] 2> "roles":null,
[junit4] 2> "base_url":"http://127.0.0.1:43203/_lph/su",
[junit4] 2> "node_name":"127.0.0.1:43203__lph%2Fsu",
[junit4] 2> "numShards":"1",
[junit4] 2> "state":"active",
[junit4] 2> "shard":"shard1",
[junit4] 2> "collection":"control_collection",
[junit4] 2> "operation":"state"}
[junit4] 2> 261838 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient Using default ZkCredentialsProvider
[junit4] 2> 261838 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
[junit4] 2> 261839 INFO (zkCallback-366-thread-1) [ ]
o.a.s.c.c.ConnectionManager Watcher
org.apache.solr.common.cloud.ConnectionManager@2c2616 name:ZooKeeperConnection
Watcher:127.0.0.1:38732/solr got event WatchedEvent state:SyncConnected
type:None path:null path:null type:None
[junit4] 2> 261839 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
[junit4] 2> 261840 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient Using default ZkACLProvider
[junit4] 2> 261840 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.ZkStateReader Updating cluster state from ZooKeeper...
[junit4] 2> 261842 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ] o.a.s.c.ChaosMonkey
monkey: init - expire sessions:false cause connection loss:false
[junit4] 2> 261842 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.AbstractFullDistribZkTestBase Creating collection1 with stateFormat=2
[junit4] 2> 261842 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient Using default ZkCredentialsProvider
[junit4] 2> 261842 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
[junit4] 2> 261843 INFO (zkCallback-367-thread-1) [ ]
o.a.s.c.c.ConnectionManager Watcher
org.apache.solr.common.cloud.ConnectionManager@2aad19 name:ZooKeeperConnection
Watcher:127.0.0.1:38732/solr got event WatchedEvent state:SyncConnected
type:None path:null path:null type:None
[junit4] 2> 261843 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
[junit4] 2> 261843 INFO
(TEST-ShardSplitTest.test-seed#[E02AB809BB9D0705]) [ ]
o.a.s.c.c.SolrZkClient Using default ZkACLProvider
[junit4] 2> 261845 INFO
(OverseerStateUpdate-94629558195716100-127.0.0.1:43203__lph%2Fsu-n_0000000000)
[n:127.0.0.1:43203__lph%2Fsu ] o.a.s.c.Overseer processMessage: queueSize:
1, message = {
[junit4] 2> "operation":"create",
[junit4] 2> "name":"collection1",
[junit4] 2> "numShards":"2",
[junit4] 2> "stateFormat":"2"} current state version: 2
[junit
[...truncated too long message...]
etTask(ThreadPoolExecutor.java:1067)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[junit4] > at java.lang.Thread.run(Thread.java:746)
[junit4] > 3) Thread[id=11310, name=groupCache.data,
state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest]
[junit4] > at sun.misc.Unsafe.park(Native Method)
[junit4] > at
java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
[junit4] > at
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
[junit4] > at
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
[junit4] > at
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[junit4] > at java.lang.Thread.run(Thread.java:746)
[junit4] > 4) Thread[id=11309, name=ou=system.data,
state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest]
[junit4] > at sun.misc.Unsafe.park(Native Method)
[junit4] > at
java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
[junit4] > at
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
[junit4] > at
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
[junit4] > at
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[junit4] > at java.lang.Thread.run(Thread.java:746)
[junit4] > 5) Thread[id=11307, name=changePwdReplayCache.data,
state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest]
[junit4] > at sun.misc.Unsafe.park(Native Method)
[junit4] > at
java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
[junit4] > at
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
[junit4] > at
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
[junit4] > at
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[junit4] > at java.lang.Thread.run(Thread.java:746)
[junit4] > at
__randomizedtesting.SeedInfo.seed([E02AB809BB9D0705]:0)Throwable #2:
com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie
threads that couldn't be terminated:
[junit4] > 1) Thread[id=11306, name=apacheds, state=WAITING,
group=TGRP-SaslZkACLProviderTest]
[junit4] > at java.lang.Object.wait(Native Method)
[junit4] > at java.lang.Object.wait(Object.java:516)
[junit4] > at java.util.TimerThread.mainLoop(Timer.java:526)
[junit4] > at java.util.TimerThread.run(Timer.java:505)
[junit4] > 2) Thread[id=11308, name=kdcReplayCache.data,
state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest]
[junit4] > at sun.misc.Unsafe.park(Native Method)
[junit4] > at
java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
[junit4] > at
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
[junit4] > at
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
[junit4] > at
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[junit4] > at java.lang.Thread.run(Thread.java:746)
[junit4] > 3) Thread[id=11310, name=groupCache.data,
state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest]
[junit4] > at sun.misc.Unsafe.park(Native Method)
[junit4] > at
java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
[junit4] > at
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
[junit4] > at
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
[junit4] > at
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[junit4] > at java.lang.Thread.run(Thread.java:746)
[junit4] > 4) Thread[id=11309, name=ou=system.data,
state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest]
[junit4] > at sun.misc.Unsafe.park(Native Method)
[junit4] > at
java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
[junit4] > at
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
[junit4] > at
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
[junit4] > at
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[junit4] > at java.lang.Thread.run(Thread.java:746)
[junit4] > 5) Thread[id=11307, name=changePwdReplayCache.data,
state=TIMED_WAITING, group=TGRP-SaslZkACLProviderTest]
[junit4] > at sun.misc.Unsafe.park(Native Method)
[junit4] > at
java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
[junit4] > at
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
[junit4] > at
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
[junit4] > at
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
[junit4] > at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[junit4] > at java.lang.Thread.run(Thread.java:746)
[junit4] > at
__randomizedtesting.SeedInfo.seed([E02AB809BB9D0705]:0)
[junit4] Completed [416/546] on J0 in 25.82s, 1 test, 3 errors <<< FAILURES!
[...truncated 408 lines...]
BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/build.xml:775: The following
error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/build.xml:719: The following
error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/build.xml:59: The following
error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/build.xml:233: The
following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/solr/common-build.xml:516: The
following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/lucene/common-build.xml:1432:
The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-trunk-Linux/lucene/common-build.xml:987:
There were test failures: 546 suites (8 ignored), 2145 tests, 2 suite-level
errors, 1 error, 1 failure, 379 ignored (35 assumptions) [seed:
E02AB809BB9D0705]
Total time: 48 minutes 19 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any