xiangqiao123 commented on issue #9689: groupBy query: limit push down to segment scan is poor performance
URL: https://github.com/apache/druid/issues/9689#issuecomment-613267976

> Thanks for the report @xiangqiao123, do you have any additional details on the query you issued, such as what the limit was, any extractionFn or lookups, and any details about segment size, column type, if string the approximate cardinality, or anything else you think might be useful in helping us determine why the performance degraded?

1. Query statement (as shown below, with the context flag sketch after it):

```json
{
  "dimensions": ["host", "type", "status"],
  "aggregations": [
    { "type": "longSum", "fieldName": "__count__", "name": "SUM(__count__)" }
  ],
  "intervals": "2020-04-08T00:00:00+08:00/2020-04-09T00:00:00+08:00",
  "limitSpec": { "type": "default", "limit": 50000, "columns": [] },
  "granularity": {
    "type": "period",
    "period": "PT1M",
    "timeZone": "Asia/Shanghai",
    "origin": "1970-01-01T00:00:00+08:00"
  },
  "queryType": "groupBy",
  "dataSource": "druid_ds",
  "context": {
    "useCache": false,
    "populateCache": false
  }
}
```

2. The query scans about 50 segments, roughly 1 GB per segment.
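For comparison of the fast and slow paths mentioned in the issue title, one option is to re-run the same query with segment-level limit push down explicitly toggled in the query context. This is a sketch only; it assumes the cluster runs a Druid version that supports the groupBy context flag `applyLimitPushDownToSegment` (the flag name may differ or be unavailable on older versions):

```json
{
  "queryType": "groupBy",
  "dataSource": "druid_ds",
  "intervals": "2020-04-08T00:00:00+08:00/2020-04-09T00:00:00+08:00",
  "granularity": {
    "type": "period",
    "period": "PT1M",
    "timeZone": "Asia/Shanghai",
    "origin": "1970-01-01T00:00:00+08:00"
  },
  "dimensions": ["host", "type", "status"],
  "aggregations": [
    { "type": "longSum", "fieldName": "__count__", "name": "SUM(__count__)" }
  ],
  "limitSpec": { "type": "default", "limit": 50000, "columns": [] },
  "context": {
    "useCache": false,
    "populateCache": false,
    "applyLimitPushDownToSegment": false
  }
}
```

If the timing changes only when that flag is flipped to `true`, that would help confirm the slowdown is in the segment-scan push down path rather than in the broker-side merge and limit step.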
