DockerLive opened a new issue #10134:
URL: https://github.com/apache/druid/issues/10134


   ### Affected Version
   
   Druid version 0.18.0
   
   ### Description
   Compaction task spec (the same behavior occurs with `"interval": "2020-07-02T00:00:00+0800/2020-07-03T00:00:00+0800"`):
   {
     "type": "compact",
     "dataSource": "dis_take_hour",
     "interval": "2020-07-01T16:00:00/2020-07-02T16:00:00",
     "inputSegmentSizeBytes": 52914560,
     "segmentGranularity": {"type": "period", "period": "P1D", "timeZone": "Asia/Shanghai"},
     "maxNumSegmentsToCompact": 30,
     "skipOffsetFromLatest": "P1D",
     "taskPriority": 100,
     "tuningConfig": {
       "type": "index",
       "maxRowsPerSegment": 8000000,
       "maxRowsInMemory": 1000000
     }
   }
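
   For reference, the two interval spellings above denote the same instants, assuming the offset-less interval is interpreted as UTC (Druid's default). A quick sanity check (plain Python, not Druid code; Asia/Shanghai is modeled as a fixed +08:00 offset):

```python
from datetime import datetime, timezone, timedelta

# 2020-07-01T16:00Z and 2020-07-02T00:00+08:00 are the same instant.
utc_start = datetime(2020, 7, 1, 16, 0, tzinfo=timezone.utc)
cst = timezone(timedelta(hours=8))  # Asia/Shanghai, no DST
cst_start = datetime(2020, 7, 2, 0, 0, tzinfo=cst)
print(utc_start == cst_start)  # True
```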
   
   1. The task status is SUCCESS.
   2. No segment metadata was written to the metadata store.
   3. The segment file was pushed to deep storage.
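
   To confirm symptom 2, one can look up the segment by its identifier, which Druid forms as `<dataSource>_<start>_<end>_<version>` (a `_<partitionNum>` suffix is appended only when the partition number is nonzero). A sketch (a hypothetical helper, not a Druid API) that rebuilds the ID appearing later in the log:

```python
# Hypothetical helper: rebuild a Druid segment ID string,
# formed as <dataSource>_<start>_<end>_<version>.
def segment_id(data_source: str, start: str, end: str, version: str) -> str:
    return f"{data_source}_{start}_{end}_{version}"

sid = segment_id(
    "dis_take_hour",
    "2020-07-02T00:00:00.000+08:00",
    "2020-07-03T00:00:00.000+08:00",
    "2020-07-03T10:22:01.861Z",
)
print(sid)
```

   If the publish had actually been committed, this ID should show up in the Coordinator's metadata endpoint (`GET /druid/coordinator/v1/metadata/datasources/dis_take_hour/segments`).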
   
   The important task log entries are shown below:
   
   2020-07-03T18:22:09,902 INFO [task-runner-0-priority-0] org.apache.druid.indexing.common.task.batch.parallel.ParallelIndexSupervisorTask - Found chat handler of class[org.apache.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider]
   2020-07-03T18:22:09,902 WARN [task-runner-0-priority-0] org.apache.druid.indexing.common.task.batch.parallel.ParallelIndexSupervisorTask - maxNumConcurrentSubTasks[1] is less than or equal to 1. Running sequentially. Please set maxNumConcurrentSubTasks to something higher than 1 if you want to run in parallel ingestion mode.
   2020-07-03T18:22:09,913 INFO [task-runner-0-priority-0] org.apache.druid.indexing.common.task.AbstractBatchIndexTask - [forceTimeChunkLock] is set to true in task context. Use timeChunk lock
   2020-07-03T18:22:09,919 WARN [task-runner-0-priority-0] org.apache.druid.indexing.common.task.IndexTask - Chat handler is already registered. Skipping chat handler registration.
   2020-07-03T18:22:09,925 INFO [task-runner-0-priority-0] org.apache.druid.indexing.common.task.IndexTask - Skipping determine partition scan
   2020-07-03T18:22:10,012 INFO [task-runner-0-priority-0] org.apache.druid.storage.hdfs.HdfsDataSegmentPuller - **Unzipped 1132862 bytes from [hdfs://nameservice2/druid/segments/dis_take_hour/20200702T000000.000+0800_20200703T000000.000+0800/2020-07-03T10_09_06.061Z/0_index.zip] to [/app/druid/var/tmp/task/compact_dis_take_hour_ghhghijg_2020-07-03T10:22:01.826Z/work/indexing-tmp/dis_take_hour/2020-07-01T16:00:00.000Z_2020-07-02T16:00:00.000Z/2020-07-03T10:09:06.061Z/0]**
   2020-07-03T18:22:10,057 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.BaseAppenderatorDriver - New segment[dis_take_hour_2020-07-02T00:00:00.000+08:00_2020-07-03T00:00:00.000+08:00_2020-07-03T10:22:01.861Z] for sequenceName[compact_dis_take_hour_ghhghijg_2020-07-03T10:22:01.826Z_0].
   2020-07-03T18:22:10,832 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.BaseAppenderatorDriver - Pushing [1] segments in background
   2020-07-03T18:22:10,836 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.BaseAppenderatorDriver - Pushing segments: [dis_take_hour_2020-07-02T00:00:00.000+08:00_2020-07-03T00:00:00.000+08:00_2020-07-03T10:22:01.861Z]
   2020-07-03T18:22:11,396 INFO [[compact_dis_take_hour_ghhghijg_2020-07-03T10:22:01.826Z]-appenderator-persist] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Flushed in-memory data for segment[dis_take_hour_2020-07-02T00:00:00.000+08:00_2020-07-03T00:00:00.000+08:00_2020-07-03T10:22:01.861Z] spill[0] to disk in [558] ms (101,989 rows).
   2020-07-03T18:22:11,450 INFO [[compact_dis_take_hour_ghhghijg_2020-07-03T10:22:01.826Z]-appenderator-persist] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Flushed in-memory data with commit metadata [null] for segments: dis_take_hour_2020-07-02T00:00:00.000+08:00_2020-07-03T00:00:00.000+08:00_2020-07-03T10:22:01.861Z
   2020-07-03T18:22:12,018 INFO [[compact_dis_take_hour_ghhghijg_2020-07-03T10:22:01.826Z]-appenderator-merge] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Segment[dis_take_hour_2020-07-02T00:00:00.000+08:00_2020-07-03T00:00:00.000+08:00_2020-07-03T10:22:01.861Z] of 1,132,862 bytes built from 1 incremental persist(s) in 387ms; pushed to deep storage in 168ms. Load spec is: {"type":"hdfs","path":"hdfs://nameservice2/druid/segments/dis_take_hour/20200702T000000.000+0800_20200703T000000.000+0800/2020-07-03T10_22_01.861Z/0_index.zip"}
   2020-07-03T18:22:12,033 INFO [[compact_dis_take_hour_ghhghijg_2020-07-03T10:22:01.826Z]-appenderator-persist] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Dropped segment[dis_take_hour_2020-07-02T00:00:00.000+08:00_2020-07-03T00:00:00.000+08:00_2020-07-03T10:22:01.861Z].
   2020-07-03T18:22:12,037 WARN [task-runner-0-priority-0] org.apache.druid.indexing.input.DruidSegmentInputEntity - Could not clean temporary segment file: var/tmp/task/compact_dis_take_hour_ghhghijg_2020-07-03T10:22:01.826Z/work/indexing-tmp/dis_take_hour/2020-07-01T16:00:00.000Z_2020-07-02T16:00:00.000Z/2020-07-03T10:09:06.061Z/0
   2020-07-03T18:22:12,106 INFO [[compact_dis_take_hour_ghhghijg_2020-07-03T10:22:01.826Z]-publish] org.apache.druid.segment.realtime.appenderator.BaseAppenderatorDriver - **Published [1] segments with commit metadata [null]**
   2020-07-03T18:22:12,108 INFO [[compact_dis_take_hour_ghhghijg_2020-07-03T10:22:01.826Z]-publish] org.apache.druid.segment.realtime.appenderator.BaseAppenderatorDriver - Published segments: [dis_take_hour_2020-07-02T00:00:00.000+08:00_2020-07-03T00:00:00.000+08:00_2020-07-03T10:22:01.861Z]
   2020-07-03T18:22:12,109 INFO [task-runner-0-priority-0] org.apache.druid.indexing.common.task.IndexTask - Processed[101,989] events, unparseable[0], thrownAway[0].
   2020-07-03T18:22:12,109 INFO [task-runner-0-priority-0] org.apache.druid.indexing.common.task.IndexTask - Published [1] segments
   2020-07-03T18:22:12,120 WARN [task-runner-0-priority-0] org.apache.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider - handler[compact_dis_take_hour_ghhghijg_2020-07-03T10:22:01.826Z_0] not currently registered, ignoring.
   2020-07-03T18:22:12,120 INFO [task-runner-0-priority-0] org.apache.druid.indexing.common.task.CompactionTask - Run [1] specs, [1] succeeded, [0] failed
   2020-07-03T18:22:12,124 INFO [task-runner-0-priority-0] org.apache.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
     "id" : "compact_dis_take_hour_ghhghijg_2020-07-03T10:22:01.826Z",
     "status" : "SUCCESS",
     "duration" : 2940,
     "errorMsg" : null,
     "location" : {
       "host" : null,
       "port" : -1,
       "tlsPort" : -1
     }
   }
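
   The +08:00-aligned segment interval in the log follows directly from the `segmentGranularity` in the spec: with `"period": "P1D"` in `Asia/Shanghai`, rows are bucketed into local midnight-to-midnight days. A minimal sketch of that bucketing (plain Python, not Druid's PeriodGranularity; Asia/Shanghai modeled as fixed +08:00):

```python
from datetime import datetime, time, timedelta, timezone

SHANGHAI = timezone(timedelta(hours=8))  # Asia/Shanghai has had no DST since 1991

def day_bucket(ts_utc: datetime, tz: timezone):
    # Floor a UTC timestamp to the start of its local day, P1D-style,
    # and return the [start, end) interval of that local day.
    local = ts_utc.astimezone(tz)
    start = datetime.combine(local.date(), time(0), tzinfo=tz)
    return start, start + timedelta(days=1)

# A row at 2020-07-02T03:00Z falls in the local Shanghai day 2020-07-02.
start, end = day_bucket(datetime(2020, 7, 2, 3, 0, tzinfo=timezone.utc), SHANGHAI)
print(start.isoformat(), end.isoformat())
# 2020-07-02T00:00:00+08:00 2020-07-03T00:00:00+08:00
```

   That local-day interval matches the interval embedded in the published segment ID above.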
   
   

