justinborromeo opened a new issue #7193: 
KafkaSupervisorTest#testCheckpointForUnknownTaskGroup() is flaky
URL: https://github.com/apache/incubator-druid/issues/7193
 
 
   # Affected Version
   
   `master` branch.
   
   # Description
   
   Transient failures have been observed by both me and @clintropolis, on Travis CI and when running the test locally in IntelliJ.
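   The failure pattern in the log below (a `CheckpointNotice` for taskGroup [0] rejected because `activelyReadingTaskGroups` is still empty) suggests an ordering/race issue: the checkpoint is handled before any supervisor run has registered the task group. The sketch below is a hypothetical, stripped-down model of that ordering — `CheckpointRaceSketch` and `registerTaskGroup` are illustrative names, not Druid's actual implementation — showing why the same checkpoint is invalid before registration and valid after it.

   ```java
   import java.util.Map;
   import java.util.concurrent.ConcurrentHashMap;

   // Hypothetical, simplified model of the suspected ordering problem.
   // In the real code, SeekableStreamSupervisor$CheckpointNotice.isValidTaskGroup
   // throws an ISE when the group is absent; here we just return false.
   public class CheckpointRaceSketch
   {
       // Stand-in for the supervisor's activelyReadingTaskGroups map.
       private final Map<Integer, Object> activelyReadingTaskGroups = new ConcurrentHashMap<>();

       // A checkpoint for an unknown task group is rejected.
       boolean isValidTaskGroup(int taskGroupId)
       {
           return activelyReadingTaskGroups.containsKey(taskGroupId);
       }

       // Stand-in for whatever a supervisor run does to start reading a group.
       void registerTaskGroup(int taskGroupId)
       {
           activelyReadingTaskGroups.put(taskGroupId, new Object());
       }

       public static void main(String[] args)
       {
           CheckpointRaceSketch supervisor = new CheckpointRaceSketch();

           // Checkpoint arrives before any run has populated the map: rejected,
           // matching the "cannot find taskGroup [0]" ISE in the log.
           System.out.println(supervisor.isValidTaskGroup(0)); // prints "false"

           // After a run registers the group, the same checkpoint is accepted.
           supervisor.registerTaskGroup(0);
           System.out.println(supervisor.isValidTaskGroup(0)); // prints "true"
       }
   }
   ```

   If this model matches what the test exercises, the flakiness would come down to whether the notice-handling thread or the registering run wins the race on any given execution.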
   
   # Test Output
   
   ```
   2019-03-05T01:49:07,109 WARN [Time-limited test] 
org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 
'myCustomKey' was supplied but isn't a known config.
   2019-03-05T01:49:07,109 INFO [Time-limited test] 
org.apache.kafka.common.utils.AppInfoParser - Kafka version : 2.1.0
   2019-03-05T01:49:07,109 INFO [Time-limited test] 
org.apache.kafka.common.utils.AppInfoParser - Kafka commitId : eec43959745f444f
   2019-03-05T01:49:07,110 INFO [Time-limited test] 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor - 
Started SeekableStreamSupervisor[testDS], first run in [PT86400S], with spec: 
[KafkaSupervisorSpec{dataSchema=DataSchema{dataSource='testDS', 
parser={type=string, parseSpec={format=json, timestampSpec={column=timestamp, 
format=iso, missingValue=null}, dimensionsSpec={dimensions=[{type=string, 
name=dim1, multiValueHandling=SORTED_ARRAY, createBitmapIndex=true}, 
{type=string, name=dim2, multiValueHandling=SORTED_ARRAY, 
createBitmapIndex=true}], dimensionExclusions=[]}, 
flattenSpec={useFieldDiscovery=true, fields=[]}, featureSpec={}}, 
encoding=UTF-8}, aggregators=[CountAggregatorFactory{name='rows'}], 
granularitySpec=UniformGranularitySpec{segmentGranularity={type=period, 
period=PT1H, timeZone=UTC, origin=null}, queryGranularity=NoneGranularity, 
rollup=true, inputIntervals=[], 
wrappedSpec=ArbitraryGranularitySpec{intervals=[], 
queryGranularity=NoneGranularity, rollup=true}}, 
transformSpec=TransformSpec{filter=null, transforms=[]}}, 
tuningConfig=KafkaSupervisorTuningConfig{maxRowsInMemory=1000, 
maxRowsPerSegment=50000, maxTotalRows=null, maxBytesInMemory=233046016, 
intermediatePersistPeriod=P1Y, basePersistDirectory=/test, 
maxPendingPersists=0, 
indexSpec=IndexSpec{bitmapSerdeFactory=ConciseBitmapSerdeFactory{}, 
dimensionCompression=lz4, metricCompression=lz4, longEncoding=longs}, 
reportParseExceptions=false, handoffConditionTimeout=0, 
resetOffsetAutomatically=false, segmentWriteOutMediumFactory=null, 
workerThreads=8, chatThreads=3, chatRetries=9, httpTimeout=PT10S, 
shutdownTimeout=PT80S, offsetFetchPeriod=PT30S, 
intermediateHandoffPeriod=P2147483647D, logParseExceptions=false, 
maxParseExceptions=2147483647, maxSavedParseExceptions=0}, 
ioConfig=KafkaSupervisorIOConfig{topic='testTopic68', replicas=2, taskCount=1, 
taskDuration=PT1S, consumerProperties={myCustomKey=myCustomValue, 
isolation.level=read_committed, bootstrap.servers=localhost:13818}, 
pollTimeout=100, startDelay=PT86400S, period=PT30S, useEarliestOffset=true, 
completionTimeout=PT1800S, earlyMessageRejectionPeriod=Optional.absent(), 
lateMessageRejectionPeriod=Optional.absent()}, context=null, suspend=false}]
   2019-03-05T01:49:07,110 INFO [Time-limited test] 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor - 
Checkpointing 
[SeekableStreamDataSourceMetadata{SeekableStreamPartitions=SeekableStreamPartitions{stream/topic='testTopic68',
 partitionSequenceNumberMap/partitionOffsetMap={}}}] for taskGroup [0]
   2019-03-05T01:49:07,110 ERROR [KafkaSupervisor-testDS] 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor - 
SeekableStreamSupervisor[testDS] failed to handle notice: 
{class=org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor,
 exceptionType=class org.apache.druid.java.util.common.ISE, 
exceptionMessage=WTH?! cannot find taskGroup [0] among all 
activelyReadingTaskGroups [{}], noticeClass=CheckpointNotice}
   org.apache.druid.java.util.common.ISE: WTH?! cannot find taskGroup [0] among 
all activelyReadingTaskGroups [{}]
        at 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor$CheckpointNotice.isValidTaskGroup(SeekableStreamSupervisor.java:417)
 ~[classes/:?]
        at 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor$CheckpointNotice.handle(SeekableStreamSupervisor.java:371)
 ~[classes/:?]
        at 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor.lambda$tryInit$3(SeekableStreamSupervisor.java:724)
 ~[classes/:?]
        at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
[?:1.8.0_191]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
[?:1.8.0_191]
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) 
[?:1.8.0_191]
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) 
[?:1.8.0_191]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_191]
   2019-03-05T01:49:07,184 WARN [KafkaSupervisor-testDS] 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor - 
Could not fetch partitions for topic/stream [testTopic57]
   2019-03-05T01:49:07,184 DEBUG [KafkaSupervisor-testDS] 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor - 
full stack trace
   org.apache.druid.java.util.common.ISE: Topic [testTopic57] is not found in 
KafkaConsumer's list of topics
        at 
org.apache.druid.indexing.kafka.KafkaRecordSupplier.getPartitionIds(KafkaRecordSupplier.java:156)
 ~[classes/:?]
        at 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor.updatePartitionDataFromStream(SeekableStreamSupervisor.java:1703)
 ~[classes/:?]
        at 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor.runInternal(SeekableStreamSupervisor.java:1003)
 ~[classes/:?]
        at 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor$RunNotice.handle(SeekableStreamSupervisor.java:265)
 ~[classes/:?]
        at 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor.lambda$tryInit$3(SeekableStreamSupervisor.java:724)
 ~[classes/:?]
        at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
[?:1.8.0_191]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
[?:1.8.0_191]
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) 
[?:1.8.0_191]
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) 
[?:1.8.0_191]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_191]
   2019-03-05T01:49:07,185 DEBUG [KafkaSupervisor-testDS] 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor - 
Found [3] seekablestream indexing tasks for dataSource [testDS]
   2019-03-05T01:49:07,185 ERROR [KafkaSupervisor-testDS] 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor - 
SeekableStreamSupervisor[testDS] failed to handle notice: 
{class=org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor,
 exceptionType=class java.lang.NullPointerException, exceptionMessage=null, 
noticeClass=RunNotice}
   java.lang.NullPointerException
        at 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor.checkPendingCompletionTasks(SeekableStreamSupervisor.java:2132)
 ~[classes/:?]
        at 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor.runInternal(SeekableStreamSupervisor.java:1007)
 ~[classes/:?]
        at 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor$RunNotice.handle(SeekableStreamSupervisor.java:265)
 ~[classes/:?]
        at 
org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisor.lambda$tryInit$3(SeekableStreamSupervisor.java:724)
 ~[classes/:?]
        at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
[?:1.8.0_191]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
[?:1.8.0_191]
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) 
[?:1.8.0_191]
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) 
[?:1.8.0_191]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_191]
   
   java.lang.AssertionError
        at org.junit.Assert.fail(Assert.java:86)
        at org.junit.Assert.assertTrue(Assert.java:41)
        at org.junit.Assert.assertTrue(Assert.java:52)
        at 
org.apache.druid.indexing.kafka.supervisor.KafkaSupervisorTest.testCheckpointForUnknownTaskGroup(KafkaSupervisorTest.java:2334)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
        at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
        at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
        at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
        at 
org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:298)
        at 
org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:292)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.lang.Thread.run(Thread.java:748)
   ```
