glasser commented on issue #5789: Add stringLast and stringFirst aggregators extension

URL: https://github.com/apache/incubator-druid/pull/5789#issuecomment-470404056

OK, I attempted a trivial reproduction by working through the Kafka stream tutorial, removing `channel` from the dimensionsSpec and changing `metricsSpec` to:

```
"metricsSpec": [{
  "name": "channel",
  "fieldName": "channel",
  "type": "stringFirst",
  "maxStringBytes": 100
}],
```

This worked just fine (including actually publishing), so it's unclear what happened when I ran it in our cluster.

Below is the ingestion spec we used in our cluster. Note that we use a custom parser implementation which parses some protobufs into `MapBasedInputRow`, but it should just end up mapping `sampled_trace_id` to a String.

```
{
  "type": "kafka",
  "dataSchema": {
    "dataSource": "trace_refs",
    "parser": {
      "type": "mdg_trace_refs",
      "parseSpec": {
        "format": "json",
        "timestampSpec": { "column": "timestamp", "format": "auto" },
        "dimensionsSpec": {
          "dimensions": [
            { "name": "gcs_bucket", "type": "string" },
            { "name": "duration_bucket", "type": "long" },
            { "name": "client_reference_id", "type": "string" },
            { "name": "client_name", "type": "string" },
            { "name": "client_version", "type": "string" },
            { "name": "query_id", "type": "string" },
            { "name": "query_name", "type": "string" },
            { "name": "query_signature", "type": "string" },
            { "name": "schema_hash", "type": "string" },
            { "name": "schema_tag", "type": "string" },
            { "name": "service_id", "type": "string" },
            { "name": "service_version", "type": "string" },
            { "name": "trace_id", "type": "string" }
          ]
        }
      }
    },
    "metricsSpec": [
      {
        "name": "sampled_trace_id",
        "fieldName": "sampled_trace_id",
        "type": "stringFirst",
        "maxStringBytes": 100
      },
      {
        "name": "total_trace_size_bytes",
        "fieldName": "total_trace_size_bytes",
        "type": "longSum"
      }
    ],
    "granularitySpec": {
      "type": "uniform",
      "segmentGranularity": "HOUR",
      "queryGranularity": "MINUTE",
      "rollup": true
    }
  },
  "ioConfig": {
    "topic": "engine-reports-trace-processed",
    "consumerProperties": {
      "bootstrap.servers": "kafka:9092",
      "max.poll.records": 10000,
      "max.partition.fetch.bytes": 33554432
    },
    "taskCount": 1,
    "replicas": 1,
    "taskDuration": "PT1H",
    "lateMessageRejectionPeriod": "P31D",
    "earlyMessageRejectionPeriod": "PT1H",
    "useEarliestOffset": false
  },
  "tuningConfig": {
    "type": "kafka",
    "logParseExceptions": true,
    "maxParseExceptions": 0,
    "maxSavedParseExceptions": 1,
    "basePersistDirectory": "/tmp/ignored"
  }
}
```

I'll keep investigating, but definitely curious to hear if there's anything obviously strange here!
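For what it's worth, while debugging I found it useful to lint specs like this before submitting them. A minimal sketch in plain Python (this is not a Druid tool, and the checks are my own assumptions about what could go wrong: a metric name colliding with a dimension, or a `stringFirst`/`stringLast` metric missing `maxStringBytes`):

```python
# Pared-down stand-in for the full spec above, just enough to exercise the linter.
spec = {
    "dataSchema": {
        "parser": {"parseSpec": {"dimensionsSpec": {"dimensions": [
            {"name": "gcs_bucket", "type": "string"},
            {"name": "trace_id", "type": "string"},
        ]}}},
        "metricsSpec": [
            {"name": "sampled_trace_id", "fieldName": "sampled_trace_id",
             "type": "stringFirst", "maxStringBytes": 100},
            {"name": "total_trace_size_bytes",
             "fieldName": "total_trace_size_bytes", "type": "longSum"},
        ],
    }
}

def lint(spec):
    """Return a list of human-readable problems found in a kafka ingestion spec."""
    schema = spec["dataSchema"]
    dims = {d["name"] for d in
            schema["parser"]["parseSpec"]["dimensionsSpec"]["dimensions"]}
    problems = []
    for m in schema["metricsSpec"]:
        # A metric whose name shadows a dimension is almost certainly a mistake.
        if m["name"] in dims:
            problems.append(f"metric {m['name']!r} collides with a dimension")
        # stringFirst/stringLast aggregators need a string-size bound.
        if m["type"] in ("stringFirst", "stringLast") and "maxStringBytes" not in m:
            problems.append(f"metric {m['name']!r} is missing maxStringBytes")
    return problems

print(lint(spec))  # [] for the spec above
```

This comes back clean for the spec above, which is consistent with the problem being somewhere other than the spec itself.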
