dsalgos opened a new issue #11323:
URL: https://github.com/apache/druid/issues/11323
### Affected Version
0.20.2, 0.21.0
### Description
The ingestion task fails and stops processing all further events as soon as it
encounters a single unparseable JSON record.
- Steps to reproduce the problem
Push any event that has a bad timestamp or is not properly formatted JSON;
any unparseable record causes the entire ingestion task to fail.
- The error message and stack trace encountered:
```
2021-06-02T08:55:28,782 ERROR [task-runner-0-priority-0] org.apache.druid.indexing.seekablestream.SeekableStreamIndexTaskRunner - Encountered exception while running task.
org.apache.druid.java.util.common.parsers.ParseException: Unable to parse row [hi]
	at org.apache.druid.java.util.common.parsers.JSONPathParser.parseToMap(JSONPathParser.java:74) ~[druid-core-0.21.0.jar:0.21.0]
	at org.apache.druid.data.input.impl.StringInputRowParser.parseString(StringInputRowParser.java:155) ~[druid-core-0.21.0.jar:0.21.0]
	at org.apache.druid.data.input.impl.StringInputRowParser.buildStringKeyMap(StringInputRowParser.java:119) ~[druid-core-0.21.0.jar:0.21.0]
	at org.apache.druid.data.input.impl.StringInputRowParser.parseBatch(StringInputRowParser.java:80) ~[druid-core-0.21.0.jar:0.21.0]
	at org.apache.druid.segment.transform.TransformingStringInputRowParser.parseBatch(TransformingStringInputRowParser.java:50) ~[druid-processing-0.21.0.jar:0.21.0]
	at org.apache.druid.segment.transform.TransformingStringInputRowParser.parseBatch(TransformingStringInputRowParser.java:31) ~[druid-processing-0.21.0.jar:0.21.0]
	at org.apache.druid.indexing.seekablestream.StreamChunkParser.lambda$parseWithParser$0(StreamChunkParser.java:111) ~[druid-indexing-service-0.21.0.jar:0.21.0]
	at com.google.common.collect.Iterators$8.transform(Iterators.java:794) ~[guava-16.0.1.jar:?]
	at com.google.common.collect.TransformedIterator.next(TransformedIterator.java:48) ~[guava-16.0.1.jar:?]
	at com.google.common.collect.TransformedIterator.next(TransformedIterator.java:48) ~[guava-16.0.1.jar:?]
	at com.google.common.collect.Iterators$5.hasNext(Iterators.java:543) ~[guava-16.0.1.jar:?]
	at org.apache.druid.java.util.common.CloseableIterators$1.hasNext(CloseableIterators.java:71) ~[druid-core-0.21.0.jar:0.21.0]
	at org.apache.druid.indexing.common.task.FilteringCloseableInputRowIterator.hasNext(FilteringCloseableInputRowIterator.java:62) ~[druid-indexing-service-0.21.0.jar:0.21.0]
	at com.google.common.collect.Iterators.addAll(Iterators.java:356) ~[guava-16.0.1.jar:?]
	at com.google.common.collect.Lists.newArrayList(Lists.java:147) ~[guava-16.0.1.jar:?]
	at org.apache.druid.indexing.seekablestream.StreamChunkParser.parseWithParser(StreamChunkParser.java:119) ~[druid-indexing-service-0.21.0.jar:0.21.0]
	at org.apache.druid.indexing.seekablestream.StreamChunkParser.parse(StreamChunkParser.java:102) ~[druid-indexing-service-0.21.0.jar:0.21.0]
	at org.apache.druid.indexing.seekablestream.SeekableStreamIndexTaskRunner.runInternal(SeekableStreamIndexTaskRunner.java:625) ~[druid-indexing-service-0.21.0.jar:0.21.0]
	at org.apache.druid.indexing.seekablestream.SeekableStreamIndexTaskRunner.run(SeekableStreamIndexTaskRunner.java:268) [druid-indexing-service-0.21.0.jar:0.21.0]
	at org.apache.druid.indexing.seekablestream.SeekableStreamIndexTask.run(SeekableStreamIndexTask.java:146) [druid-indexing-service-0.21.0.jar:0.21.0]
	at org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner$SingleTaskBackgroundRunnerCallable.call(SingleTaskBackgroundRunner.java:451) [druid-indexing-service-0.21.0.jar:0.21.0]
	at org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner$SingleTaskBackgroundRunnerCallable.call(SingleTaskBackgroundRunner.java:423) [druid-indexing-service-0.21.0.jar:0.21.0]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_152]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_152]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_152]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_152]
2021-06-02T08:55:28,791 INFO [task-runner-0-priority-0] org.apache.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
  "id" : "index_kafka_EPC_Raw_9fc0aedb01ba417_jfnkmgfb",
  "status" : "FAILED",
  "duration" : 846,
  "errorMsg" : "org.apache.druid.java.util.common.parsers.ParseException: Unable to parse row [hi]\n\tat org.apache.dr...",
  "location" : {
    "host" : null,
    "port" : -1,
    "tlsPort" : -1
  }
}
```
- Any debugging that you have already done
No.
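A possible mitigation worth ruling out first: the Kafka supervisor `tuningConfig` documents `logParseExceptions`, `maxParseExceptions`, and `maxSavedParseExceptions` for tolerating bad rows. This fragment is only illustrative (values are arbitrary, not a confirmed fix for this code path):

```
{
  "type": "kafka",
  "tuningConfig": {
    "type": "kafka",
    "logParseExceptions": true,
    "maxParseExceptions": 100,
    "maxSavedParseExceptions": 10
  }
}
```

If the task still fails with these set, that would suggest the parse-exception limits are not applied on the legacy `parser` path shown in the stack trace, making this a bug rather than a configuration issue.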
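For context, the row `[hi]` in the stack trace is simply not valid JSON. A minimal sketch (in Python, outside Druid) of the kind of validity check the parser effectively performs; the `is_parseable` helper is hypothetical and only illustrates why this input is rejected:

```python
import json

def is_parseable(record: str) -> bool:
    """Return True if the record is valid JSON -- roughly the condition
    that fails inside Druid's JSON parser for the row above."""
    try:
        json.loads(record)
        return True
    except json.JSONDecodeError:
        return False

# A well-formed event parses; the literal "hi" from the stack trace does not.
print(is_parseable('{"timestamp": "2021-06-02T08:55:28Z", "metric": 1}'))  # True
print(is_parseable('hi'))  # False
```

Any such record pushed to the topic reproduces the failure.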