zhangyue19921010 commented on pull request #10688:
URL: https://github.com/apache/druid/pull/10688#issuecomment-754393991


   Here are the full logs from the point where the query failed and triggered the unload action.
   ```
   2021-01-05T03:19:42,622 INFO [ZKCoordinator--2] 
org.apache.druid.server.coordination.ZkCoordinator - Completed request [LOAD: 
todate_ad_historical_hc_regtest_2021-01-04T00:00:00.000Z_2021-01-05T00:00:00.000Z_2021-01-05T02:57:11.749Z_1]
   2021-01-05T03:19:42,622 INFO [ZkCoordinator] 
org.apache.druid.server.coordination.ZkCoordinator - 
zNode[/druid/loadQueue/druid-dev-8-historical-0.druid-dev-8-historical.druid-dev-8.svc.cluster.local:8083/todate_ad_historical_hc_regtest_2021-01-04T00:00:00.000Z_2021-01-05T00:00:00.000Z_2021-01-05T02:57:11.749Z_1]
 was removed
   2021-01-05T03:21:14,592 WARN 
[segmentMetadata_traffic__ops_feed__realtime__second__dev__big__segment_[2020-12-01T00:00:00.000Z/2020-12-02T00:00:00.000Z]]
 org.apache.druid.segment.IndexIO - Exception when deserialize Column country
   com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot 
construct instance of `org.apache.druid.segment.column.ColumnDescriptor` 
(although at least one Creator exists): no String-argument constructor/factory 
method to deserialize from String value ('valueType')
    at [Source: (StringReader); line: 1, column: 1]
        at 
com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
 ~[jackson-databind-2.10.1.jar:2.10.1]
        at 
com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1429)
 ~[jackson-databind-2.10.1.jar:2.10.1]
        at 
com.fasterxml.jackson.databind.DeserializationContext.handleMissingInstantiator(DeserializationContext.java:1059)
 ~[jackson-databind-2.10.1.jar:2.10.1]
        at 
com.fasterxml.jackson.databind.deser.ValueInstantiator._createFromStringFallbacks(ValueInstantiator.java:371)
 ~[jackson-databind-2.10.1.jar:2.10.1]
        at 
com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.createFromString(StdValueInstantiator.java:323)
 ~[jackson-databind-2.10.1.jar:2.10.1]
        at 
com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromString(BeanDeserializerBase.java:1373)
 ~[jackson-databind-2.10.1.jar:2.10.1]
        at 
com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeOther(BeanDeserializer.java:171)
 ~[jackson-databind-2.10.1.jar:2.10.1]
        at 
com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:161)
 ~[jackson-databind-2.10.1.jar:2.10.1]
        at 
com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4202)
 ~[jackson-databind-2.10.1.jar:2.10.1]
        at 
com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3205) 
~[jackson-databind-2.10.1.jar:2.10.1]
        at 
com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3173) 
~[jackson-databind-2.10.1.jar:2.10.1]
        at 
org.apache.druid.segment.IndexIO$V9IndexLoader.deserializeColumn(IndexIO.java:657)
 ~[druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.segment.IndexIO$V9IndexLoader.lambda$load$0(IndexIO.java:603) 
~[druid-processing-0.17.1.jar:0.17.1]
        at 
com.google.common.base.Suppliers$MemoizingSupplier.get(Suppliers.java:125) 
~[guava-16.0.1.jar:?]
        at 
org.apache.druid.segment.SimpleQueryableIndex.getColumnHolder(SimpleQueryableIndex.java:163)
 ~[druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.metadata.SegmentAnalyzer.analyze(SegmentAnalyzer.java:103)
 ~[druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.metadata.SegmentMetadataQueryRunnerFactory$1.run(SegmentMetadataQueryRunnerFactory.java:92)
 ~[druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.ReferenceCountingSegmentQueryRunner.run(ReferenceCountingSegmentQueryRunner.java:51)
 ~[druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.MetricsEmittingQueryRunner.lambda$run$0(MetricsEmittingQueryRunner.java:97)
 ~[druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.LazySequence.accumulate(LazySequence.java:40)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence$1.get(WrappingSequence.java:50)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.SequenceWrapper.wrap(SequenceWrapper.java:55)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence.accumulate(WrappingSequence.java:45)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.MappedSequence.accumulate(MappedSequence.java:43)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence$1.get(WrappingSequence.java:50)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.SequenceWrapper.wrap(SequenceWrapper.java:55)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence.accumulate(WrappingSequence.java:45)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.LazySequence.accumulate(LazySequence.java:40)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence$1.get(WrappingSequence.java:50)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.SequenceWrapper.wrap(SequenceWrapper.java:55)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence.accumulate(WrappingSequence.java:45)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.spec.SpecificSegmentQueryRunner$1.accumulate(SpecificSegmentQueryRunner.java:79)
 [druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence$1.get(WrappingSequence.java:50)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.spec.SpecificSegmentQueryRunner.doNamed(SpecificSegmentQueryRunner.java:163)
 [druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.spec.SpecificSegmentQueryRunner.access$100(SpecificSegmentQueryRunner.java:42)
 [druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.spec.SpecificSegmentQueryRunner$2.wrap(SpecificSegmentQueryRunner.java:145)
 [druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence.accumulate(WrappingSequence.java:45)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence$1.get(WrappingSequence.java:50)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.CPUTimeMetricQueryRunner$1.wrap(CPUTimeMetricQueryRunner.java:74)
 [druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence.accumulate(WrappingSequence.java:45)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.Sequence.toList(Sequence.java:85) 
[druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.metadata.SegmentMetadataQueryRunnerFactory$2$1$1.call(SegmentMetadataQueryRunnerFactory.java:217)
 [druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.metadata.SegmentMetadataQueryRunnerFactory$2$1$1.call(SegmentMetadataQueryRunnerFactory.java:213)
 [druid-processing-0.17.1.jar:0.17.1]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
[?:1.8.0_221]
        at 
org.apache.druid.query.PrioritizedListenableFutureTask.run(PrioritizedExecutorService.java:247)
 [druid-processing-0.17.1.jar:0.17.1]
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) 
[?:1.8.0_221]
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) 
[?:1.8.0_221]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_221]
   2021-01-05T03:21:14,595 INFO 
[segmentMetadata_traffic__ops_feed__realtime__second__dev__big__segment_[2020-12-01T00:00:00.000Z/2020-12-02T00:00:00.000Z]]
 org.apache.druid.server.coordination.BatchDataSegmentAnnouncer - Unannouncing 
segment[traffic__ops_feed__realtime__second__dev__big__segment_2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z_2020-12-02T02:33:58.519Z]
 at 
path[/druid/segments/druid-dev-8-historical-0.druid-dev-8-historical.druid-dev-8.svc.cluster.local:8083/druid-dev-8-historical-0.druid-dev-8-historical.druid-dev-8.svc.cluster.local:8083_historical__default_tier_2021-01-05T03:09:36.189Z_bc91ea1098a749438ad4654b430c5cf61474]
   2021-01-05T03:21:14,599 INFO 
[segmentMetadata_traffic__ops_feed__realtime__second__dev__big__segment_[2020-12-01T00:00:00.000Z/2020-12-02T00:00:00.000Z]]
 org.apache.druid.server.SegmentManager - Attempting to close segment 
traffic__ops_feed__realtime__second__dev__big__segment_2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z_2020-12-02T02:33:58.519Z
   2021-01-05T03:21:14,599 INFO 
[segmentMetadata_traffic__ops_feed__realtime__second__dev__big__segment_[2020-12-01T00:00:00.000Z/2020-12-02T00:00:00.000Z]]
 org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager - Deleting 
directory[/var/druid/segment-cache/traffic__ops_feed__realtime__second__dev__big__segment/2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z/2020-12-02T02:33:58.519Z/0]
   2021-01-05T03:21:14,601 WARN 
[segmentMetadata_traffic__ops_feed__realtime__second__dev__big__segment_[2020-12-01T00:00:00.000Z/2020-12-02T00:00:00.000Z]]
 org.apache.druid.segment.loading.StorageLocation - 
SegmentDir[/var/druid/segment-cache/traffic__ops_feed__realtime__second__dev__big__segment/2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z/2020-12-02T02:33:58.519Z/0]
 is not found under this location[/var/druid/segment-cache]
   2021-01-05T03:21:14,748 WARN 
[segmentMetadata_traffic__ops_feed__realtime__second__dev__big__segment_[2020-12-01T00:00:00.000Z/2020-12-02T00:00:00.000Z]]
 org.apache.druid.segment.IndexIO - Exception when deserialize Column country
   java.nio.BufferUnderflowException: null
        at java.nio.DirectByteBuffer.get(DirectByteBuffer.java:271) 
~[?:1.8.0_221]
        at java.nio.ByteBuffer.get(ByteBuffer.java:715) ~[?:1.8.0_221]
        at 
org.apache.druid.common.utils.SerializerUtils.readBytes(SerializerUtils.java:68)
 ~[druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.common.utils.SerializerUtils.readString(SerializerUtils.java:62)
 ~[druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.segment.IndexIO$V9IndexLoader.deserializeColumn(IndexIO.java:658)
 ~[druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.segment.IndexIO$V9IndexLoader.lambda$load$0(IndexIO.java:603) 
~[druid-processing-0.17.1.jar:0.17.1]
        at 
com.google.common.base.Suppliers$MemoizingSupplier.get(Suppliers.java:125) 
~[guava-16.0.1.jar:?]
        at 
org.apache.druid.segment.SimpleQueryableIndex.getColumnHolder(SimpleQueryableIndex.java:163)
 ~[druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.metadata.SegmentAnalyzer.analyze(SegmentAnalyzer.java:103)
 ~[druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.metadata.SegmentMetadataQueryRunnerFactory$1.run(SegmentMetadataQueryRunnerFactory.java:92)
 ~[druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.ReferenceCountingSegmentQueryRunner.run(ReferenceCountingSegmentQueryRunner.java:51)
 ~[druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.MetricsEmittingQueryRunner.lambda$run$0(MetricsEmittingQueryRunner.java:97)
 ~[druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.LazySequence.accumulate(LazySequence.java:40)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence$1.get(WrappingSequence.java:50)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.SequenceWrapper.wrap(SequenceWrapper.java:55)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence.accumulate(WrappingSequence.java:45)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.MappedSequence.accumulate(MappedSequence.java:43)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence$1.get(WrappingSequence.java:50)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.SequenceWrapper.wrap(SequenceWrapper.java:55)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence.accumulate(WrappingSequence.java:45)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.LazySequence.accumulate(LazySequence.java:40)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence$1.get(WrappingSequence.java:50)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.SequenceWrapper.wrap(SequenceWrapper.java:55)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence.accumulate(WrappingSequence.java:45)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.spec.SpecificSegmentQueryRunner$1.accumulate(SpecificSegmentQueryRunner.java:79)
 [druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence$1.get(WrappingSequence.java:50)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.spec.SpecificSegmentQueryRunner.doNamed(SpecificSegmentQueryRunner.java:163)
 [druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.spec.SpecificSegmentQueryRunner.access$100(SpecificSegmentQueryRunner.java:42)
 [druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.spec.SpecificSegmentQueryRunner$2.wrap(SpecificSegmentQueryRunner.java:145)
 [druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence.accumulate(WrappingSequence.java:45)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence$1.get(WrappingSequence.java:50)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.CPUTimeMetricQueryRunner$1.wrap(CPUTimeMetricQueryRunner.java:74)
 [druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.WrappingSequence.accumulate(WrappingSequence.java:45)
 [druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.java.util.common.guava.Sequence.toList(Sequence.java:85) 
[druid-core-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.metadata.SegmentMetadataQueryRunnerFactory$2$1$1.call(SegmentMetadataQueryRunnerFactory.java:217)
 [druid-processing-0.17.1.jar:0.17.1]
        at 
org.apache.druid.query.metadata.SegmentMetadataQueryRunnerFactory$2$1$1.call(SegmentMetadataQueryRunnerFactory.java:213)
 [druid-processing-0.17.1.jar:0.17.1]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
[?:1.8.0_221]
        at 
org.apache.druid.query.PrioritizedListenableFutureTask.run(PrioritizedExecutorService.java:247)
 [druid-processing-0.17.1.jar:0.17.1]
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) 
[?:1.8.0_221]
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) 
[?:1.8.0_221]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_221]
   2021-01-05T03:21:14,756 WARN 
[segmentMetadata_traffic__ops_feed__realtime__second__dev__big__segment_[2020-12-01T00:00:00.000Z/2020-12-02T00:00:00.000Z]]
 org.apache.druid.server.coordination.BatchDataSegmentAnnouncer - No path to 
unannounce 
segment[traffic__ops_feed__realtime__second__dev__big__segment_2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z_2020-12-02T02:33:58.519Z]
   2021-01-05T03:21:14,756 INFO 
[segmentMetadata_traffic__ops_feed__realtime__second__dev__big__segment_[2020-12-01T00:00:00.000Z/2020-12-02T00:00:00.000Z]]
 org.apache.druid.server.SegmentManager - Told to delete a queryable on 
dataSource[traffic__ops_feed__realtime__second__dev__big__segment] for 
interval[2020-12-01T00:00:00.000Z/2020-12-02T00:00:00.000Z] and 
version[2020-12-02T02:33:58.519Z] that I don't have.
   2021-01-05T03:21:14,760 WARN 
[segmentMetadata_traffic__ops_feed__realtime__second__dev__big__segment_[2020-12-01T00:00:00.000Z/2020-12-02T00:00:00.000Z]]
 org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager - Asked to 
cleanup something[DataSegment{binaryVersion=9, 
id=traffic__ops_feed__realtime__second__dev__big__segment_2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z_2020-12-02T02:33:58.519Z,
 loadSpec={type=>s3_zip, bucket=>pqm-druid-dev, 
key=>rtstorage/segments/traffic__ops_feed__realtime__second__dev__big__segment/2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z/2020-12-02T02:33:58.519Z/0/index.zip,
 S3Schema=>s3n}, dimensions=[video_cro_network_id, video_cro_network_name, 
distributor_network_id, distributor_network_name, profile_id, profile_name, 
is_active_device, is_filtered, service_type, platform, ad_unit_type, country, 
country_name, state, state_name, dma, dma_name, syscode, syscode_name, 
tv_network_id, tv_network_name, linear_campaign_type, spot_type], metrics=[ack_ad_click, ack_ad_complete, ack_ad_first_quartile, ack_ad_impression, 
ack_ad_mid_point, ack_ad_third_quartile, ack_err_adm_e_3p_comp, 
ack_err_adm_e_device_limit, ack_err_adm_e_io, ack_err_adm_e_no_ad, 
ack_err_adm_e_no_render, ack_err_adm_e_parse, ack_err_adm_e_render_init, 
ack_err_adm_e_security, ack_err_adm_e_timeout, ack_err_adm_e_unknown, 
ack_err_psn_abnormal_termination_of_playout, ack_err_psn_asset_info_invalid, 
ack_err_psn_bit_rate_mismatch, ack_err_psn_insertion_point_time_exceeded, 
ack_err_psn_message_validation_failed, ack_err_psn_timeout, 
ack_err_psn_unknown_message_reference, ack_err_vast_100, ack_err_vast_202, 
ack_err_vast_300, ack_err_vast_301, ack_err_vast_302, ack_err_vast_303, 
ack_err_vast_400, ack_err_vast_402, ack_err_vast_403, ack_err_vast_900, 
ad_delivered_ad, ad_delivered_ad_fallback, ad_delivered_ad_primary, 
ad_err_fallback_to_evergreen, ad_err_full_avail_no_variant_segment, 
ad_err_inactive_addressable_order, count, req_ad_request, 
req_ad_request_with_midroll_slot, req_ad_request_with_video_slot, req_empty_response, 
req_empty_response_with_midroll_slot, req_empty_response_with_video_slot, 
req_err_break_duration_invalid, req_err_break_no_schedule_ad, 
req_err_no_mac_address, req_err_no_profile, req_err_no_signal_id, 
req_err_schedule_creative_validation_failed, req_err_schedule_not_found, 
req_err_signal_no_bind_break, req_err_station_not_found, 
req_err_syscode_not_found, req_resp_time_gt_1500ms, req_resp_time_lt_100ms, 
req_resp_time_lt_1500ms, req_resp_time_lt_300ms, req_resp_time_lt_500ms, 
slot_avails, slot_unfilled_avails], shardSpec=NumberedShardSpec{partitionNum=0, 
partitions=0}, lastCompactionState=null, size=3388721170}] that didn't exist.  
Skipping.
   2021-01-05T03:21:14,761 WARN 
[segmentMetadata_traffic__ops_feed__realtime__second__dev__big__segment_[2020-12-01T00:00:00.000Z/2020-12-02T00:00:00.000Z]]
 org.apache.druid.server.coordination.SegmentLoadDropHandler - Unable to delete 
segmentInfoCacheFile[/var/druid/segment-cache/info_dir/traffic__ops_feed__realtime__second__dev__big__segment_2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z_2020-12-02T02:33:58.519Z]
   2021-01-05T03:21:14,761 ERROR 
[qtp450476243-159[segmentMetadata_[traffic__ops_feed__realtime__second__dev__big__segment]_7571e94d-4e59-4486-bc0f-bd4d447d2e18]]
 org.apache.druid.server.QueryResource - Exception handling request: 
{class=org.apache.druid.server.QueryResource, exceptionType=class 
java.lang.RuntimeException, 
exceptionMessage=java.util.concurrent.ExecutionException: 
java.lang.RuntimeException: 
com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot construct 
instance of `org.apache.druid.segment.column.ColumnDescriptor` (although at 
least one Creator exists): no String-argument constructor/factory method to 
deserialize from String value ('valueType')
    at [Source: (StringReader); line: 1, column: 1], 
query={"queryType":"segmentMetadata","dataSource":{"type":"table","name":"traffic__ops_feed__realtime__second__dev__big__segment"},"intervals":{"type":"segments","segments":[{"itvl":"2020-12-01T00:00:00.000Z/2020-12-02T00:00:00.000Z","ver":"2020-12-02T02:33:58.519Z","part":0},{"itvl":"2020-12-01T00:00:00.000Z/2020-12-02T00:00:00.000Z","ver":"2020-12-02T02:33:58.519Z","part":1}]},"toInclude":{"type":"all"},"merge":false,"context":{"defaultTimeout":60000,"finalize":false,"maxQueuedBytes":50000000,"maxScatterGatherBytes":9223372036854775807,"queryFailTime":1609816934586,"queryId":"7571e94d-4e59-4486-bc0f-bd4d447d2e18","timeout":60000},"analysisTypes":[],"usingDefaultInterval":false,"lenientAggregatorMerge":false,"descending":false,"granularity":{"type":"all"}},
 peer=10.23.44.3} (java.lang.RuntimeException: 
java.util.concurrent.ExecutionException: java.lang.RuntimeException: 
com.fasterxml.jackson.databind.exc.MismatchedInputException: 
 Cannot construct instance of 
`org.apache.druid.segment.column.ColumnDescriptor` (although at least one 
Creator exists): no String-argument constructor/factory method to deserialize 
from String value ('valueType')
    at [Source: (StringReader); line: 1, column: 1])
   2021-01-05T03:21:14,809 ERROR 
[qtp450476243-146[segmentMetadata_[traffic__ops_feed__realtime__second__dev__big__segment]_e94db4a3-21ed-42de-9471-6b1a83e74071]]
 org.apache.druid.server.QueryResource - Exception handling request: 
{class=org.apache.druid.server.QueryResource, exceptionType=class 
java.lang.RuntimeException, 
exceptionMessage=java.util.concurrent.ExecutionException: 
java.nio.BufferUnderflowException, 
query={"queryType":"segmentMetadata","dataSource":{"type":"table","name":"traffic__ops_feed__realtime__second__dev__big__segment"},"intervals":{"type":"segments","segments":[{"itvl":"2020-12-01T00:00:00.000Z/2020-12-02T00:00:00.000Z","ver":"2020-12-02T02:33:58.519Z","part":0},{"itvl":"2020-12-01T00:00:00.000Z/2020-12-02T00:00:00.000Z","ver":"2020-12-02T02:33:58.519Z","part":1}]},"toInclude":{"type":"all"},"merge":false,"context":{"defaultTimeout":60000,"finalize":false,"maxQueuedBytes":50000000,"maxScatterGatherBytes":9223372036854775807,"queryFailTime":1609816934586,"queryId":"e94db4a3-21ed-42de-9471-6b1a83e74071","timeout":60000},"analysisTypes":[],"usingDefaultInterval":false,"lenientAggregatorMerge":false,"descending":false,"granularity":{"type":"all"}},
 peer=10.23.43.148} (java.lang.RuntimeException: 
java.util.concurrent.ExecutionException: java.nio.BufferUnderflowException)
   2021-01-05T03:21:27,962 INFO [ZKCoordinator--3] 
org.apache.druid.server.coordination.SegmentLoadDropHandler - Loading segment 
traffic__ops_feed__realtime__second__dev__big__segment_2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z_2020-12-02T02:33:58.519Z
   2021-01-05T03:21:27,962 INFO [ZKCoordinator--3] 
org.apache.druid.storage.s3.S3DataSegmentPuller - Pulling index at 
path[CloudObjectLocation{bucket='pqm-druid-dev', 
path='rtstorage/segments/traffic__ops_feed__realtime__second__dev__big__segment/2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z/2020-12-02T02:33:58.519Z/0/index.zip'}]
 to 
outDir[/var/druid/segment-cache/traffic__ops_feed__realtime__second__dev__big__segment/2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z/2020-12-02T02:33:58.519Z/0]
   2021-01-05T03:22:14,707 WARN [ZKCoordinator--3] 
com.amazonaws.services.s3.internal.S3AbortableInputStream - Not all bytes were 
read from the S3ObjectInputStream, aborting HTTP connection. This is likely an 
error and may result in sub-optimal behavior. Request only the bytes you need 
via a ranged GET or drain the input stream after use.
   2021-01-05T03:22:14,707 WARN [ZKCoordinator--3] 
com.amazonaws.services.s3.internal.S3AbortableInputStream - Not all bytes were 
read from the S3ObjectInputStream, aborting HTTP connection. This is likely an 
error and may result in sub-optimal behavior. Request only the bytes you need 
via a ranged GET or drain the input stream after use.
   2021-01-05T03:22:14,707 INFO [ZKCoordinator--3] 
org.apache.druid.storage.s3.S3DataSegmentPuller - Loaded 3388721170 bytes from 
[CloudObjectLocation{bucket='pqm-druid-dev', 
path='rtstorage/segments/traffic__ops_feed__realtime__second__dev__big__segment/2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z/2020-12-02T02:33:58.519Z/0/index.zip'}]
 to 
[/var/druid/segment-cache/traffic__ops_feed__realtime__second__dev__big__segment/2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z/2020-12-02T02:33:58.519Z/0]
   2021-01-05T03:22:14,789 INFO [ZKCoordinator--3] 
org.apache.druid.server.coordination.BatchDataSegmentAnnouncer - Announcing 
segment[traffic__ops_feed__realtime__second__dev__big__segment_2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z_2020-12-02T02:33:58.519Z]
 at existing 
path[/druid/segments/druid-dev-8-historical-0.druid-dev-8-historical.druid-dev-8.svc.cluster.local:8083/druid-dev-8-historical-0.druid-dev-8-historical.druid-dev-8.svc.cluster.local:8083_historical__default_tier_2021-01-05T03:09:36.189Z_bc91ea1098a749438ad4654b430c5cf61474]
   2021-01-05T03:22:14,794 INFO [ZKCoordinator--3] 
org.apache.druid.server.coordination.ZkCoordinator - Completed request [LOAD: 
traffic__ops_feed__realtime__second__dev__big__segment_2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z_2020-12-02T02:33:58.519Z]
   2021-01-05T03:22:14,794 INFO [ZkCoordinator] 
org.apache.druid.server.coordination.ZkCoordinator - 
zNode[/druid/loadQueue/druid-dev-8-historical-0.druid-dev-8-historical.druid-dev-8.svc.cluster.local:8083/traffic__ops_feed__realtime__second__dev__big__segment_2020-12-01T00:00:00.000Z_2020-12-02T00:00:00.000Z_2020-12-02T02:33:58.519Z]
 was removed
   ```
   
   Damaged segment files -> lazy load on start -> query failed -> segment unloaded 
automatically -> segment reloaded per the coordinator -> query succeeded.
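   For reference, the `java.nio.BufferUnderflowException` in the second stack trace is what `SerializerUtils.readBytes` hits when a column blob is truncated: the length prefix promises more bytes than the buffer still holds. A minimal stdlib-only sketch of that failure mode (`UnderflowDemo` is a hypothetical class name, not Druid code):

   ```java
   import java.nio.BufferUnderflowException;
   import java.nio.ByteBuffer;

   // Hypothetical demo class, not part of Druid.
   public class UnderflowDemo {
       // Simulates reading a length-prefixed blob from a truncated buffer:
       // the prefix claims 100 bytes of payload, but only 4 bytes remain.
       static boolean underflows() {
           ByteBuffer buf = ByteBuffer.allocate(8);
           buf.putInt(100);        // length prefix written by the serializer
           buf.putInt(0xCAFEBABE); // only 4 bytes of payload survived
           buf.flip();

           int length = buf.getInt();     // reads the prefix: 100
           byte[] payload = new byte[length];
           try {
               buf.get(payload);          // asks for 100 bytes, only 4 remain
               return false;
           } catch (BufferUnderflowException e) {
               return true;               // same exception as in the trace above
           }
       }

       public static void main(String[] args) {
           System.out.println("underflow thrown: " + underflows());
       }
   }
   ```

   So the error is not necessarily in the reader: any truncation or corruption of the on-disk column data produces this exception at deserialization time, which is why unloading and re-pulling the segment from deep storage fixes the query.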

