ashwintumma23 commented on issue #17116: URL: https://github.com/apache/druid/issues/17116#issuecomment-2683651710
Hi @carltal , thanks for sharing the sample data. I loaded it into my local cluster and I do see the correct results. The only changes I had to make to your CSV above were to remove the quotes (`"`) before loading and to specify the schema explicitly.

### Some screenshots for Data Loading

* Here's the ingestion spec that was used:

```json
{
  "type": "index_parallel",
  "spec": {
    "ioConfig": {
      "type": "index_parallel",
      "inputSource": {
        "type": "local",
        "baseDir": "sqlIssueDebug",
        "filter": "druid_data.csv"
      },
      "inputFormat": {
        "type": "csv",
        "skipHeaderRows": 1,
        "findColumnsFromHeader": false,
        "columns": [
          "ts",
          "merchant_id",
          "payment_card_type"
        ]
      }
    },
    "tuningConfig": {
      "type": "index_parallel",
      "partitionsSpec": {
        "type": "dynamic"
      }
    },
    "dataSchema": {
      "dataSource": "sample_payment_data_csv_format",
      "timestampSpec": {
        "column": "ts",
        "format": "iso"
      },
      "dimensionsSpec": {
        "dimensions": [
          {
            "type": "long",
            "name": "merchant_id"
          },
          "payment_card_type"
        ]
      },
      "granularitySpec": {
        "queryGranularity": "none",
        "rollup": false,
        "segmentGranularity": "hour"
      }
    }
  }
}
```

### Querying

* The query shows only the `not null` and non-`AMEX` values.

We can also have a screen-sharing session to discuss further. You can find me on the Apache Druid Community Slack.
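As a side note, the quote-stripping step mentioned above can be automated rather than done by hand. Below is a minimal sketch (not part of Druid; the function name and sample rows are hypothetical) that rewrites a quoted CSV as an unquoted one using Python's standard `csv` module. It assumes the field values themselves contain no commas or quote characters, which holds for the sample data in this issue:

```python
import csv
import io

def strip_csv_quotes(raw: str) -> str:
    """Rewrite quoted CSV text as unquoted CSV.

    Hypothetical pre-processing helper: parses each row with the
    quote-aware reader, then re-emits it with quoting disabled.
    Assumes no field contains a comma or a quote character.
    """
    reader = csv.reader(io.StringIO(raw))
    out = io.StringIO()
    writer = csv.writer(out, quoting=csv.QUOTE_NONE, lineterminator="\n")
    for row in reader:
        writer.writerow(row)
    return out.getvalue()

# Example with made-up rows shaped like the sample data in this issue:
raw = (
    '"ts","merchant_id","payment_card_type"\n'
    '"2024-01-01T00:00:00Z","42","VISA"\n'
)
print(strip_csv_quotes(raw))
```

With the quotes removed, `skipHeaderRows: 1` plus the explicit `columns` list in the spec above parses the file as intended.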
