techdocsmith commented on code in PR #12946: URL: https://github.com/apache/druid/pull/12946#discussion_r957947700
########## docs/querying/nested-columns.md: ########## @@ -0,0 +1,503 @@ +--- +id: nested-columns +title: "Nested columns" +sidebar_label: Nested columns +--- + +<!-- + ~ Licensed to the Apache Software Foundation (ASF) under one + ~ or more contributor license agreements. See the NOTICE file + ~ distributed with this work for additional information + ~ regarding copyright ownership. The ASF licenses this file + ~ to you under the Apache License, Version 2.0 (the + ~ "License"); you may not use this file except in compliance + ~ with the License. You may obtain a copy of the License at + ~ + ~ http://www.apache.org/licenses/LICENSE-2.0 + ~ + ~ Unless required by applicable law or agreed to in writing, + ~ software distributed under the License is distributed on an + ~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + ~ KIND, either express or implied. See the License for the + ~ specific language governing permissions and limitations + ~ under the License. + --> + +> Nested columns are an experimental feature available starting in Apache Druid 24.0. Like most experimental features, functionality documented on this page is subject to change in future releases. However, the COMPLEX column type includes versioning to provide backward-compatible support in future releases. We strongly encourage you to experiment with nested columns in your development environment to evaluate whether they meet your use case. If so, you can use them in production scenarios. Review the release notes and this page to stay up to date with changes. + +Apache Druid supports directly storing nested data structures in `COMPLEX<json>` columns. `COMPLEX<json>` columns store a copy of the structured data in JSON format, along with specialized internal columns and indexes for nested 'literal' values—STRING, LONG, and DOUBLE types. 
An optimized [virtual column](./virtual-columns.md#nested-field-virtual-column) allows Druid to read and filter these values at speeds consistent with standard Druid LONG, DOUBLE, and STRING columns. + +Druid [SQL JSON functions](./sql-json-functions.md) allow you to extract, transform, and create `COMPLEX<json>` values in SQL queries, using the specialized virtual columns where appropriate. You can use the [JSON nested columns functions](../misc/math-expr.md#nested-columns-functions) in [native queries](./querying.md) using [expression virtual columns](./virtual-columns.md#expression-virtual-column), and in native ingestion with a [`transformSpec`](../ingestion/ingestion-spec.md#transformspec). + +You can use the JSON functions in INSERT and REPLACE statements in SQL-based ingestion, or in a `transformSpec` in native ingestion. This is an alternative to using a [`flattenSpec`](../ingestion/data-formats.md#flattenspec) object to 'flatten' nested data for ingestion. + +### Example nested data + +The examples in this topic use the data in [nested_example_data.json](https://static.imply.io/data/nested_example_data.json). The file contains a simple facsimile of an order tracking and shipping table. 
+ +When pretty-printed, a sample row in `nested_example_data` looks like this: + +```json +{ + "time":"2022-6-14T10:32:08Z", + "product":"Keyboard", + "department":"Computers", + "shipTo":{ + "firstName": "Sandra", + "lastName": "Beatty", + "address": { + "street": "293 Grant Well", + "city": "Loischester", + "state": "FL", + "country": "TV", + "postalCode": "88845-0066" + }, + "phoneNumbers": [ + {"type":"primary","number":"1-788-771-7028 x8627" }, + {"type":"secondary","number":"1-460-496-4884 x887"} + ] + }, + "details":{"color":"plum","price":"40.00"} +} +``` + +## Native batch ingestion + +For native batch ingestion, you can use the [JSON nested columns functions](../misc/math-expr.md#nested-columns-functions) to extract nested data as an alternative to using a [flattenSpec](../ingestion/data-formats.md#flattenspec) object. + +To configure a dimension as a nested data type, include a JSON-type dimension object in the `dimensions` list of the `dimensionsSpec` property of your ingestion spec. + +For example, the following ingestion spec instructs Druid to ingest `shipTo` and `details` as JSON-type nested dimensions: + +```json +{ + "type": "index_parallel", + "spec": { + "ioConfig": { + "type": "index_parallel", + "inputSource": { + "type": "http", + "uris": [ + "https://static.imply.io/data/nested_example_data.json" + ] + }, + "inputFormat": { + "type": "json" + } + }, + "dataSchema": { + "granularitySpec": { + "segmentGranularity": "day", + "queryGranularity": "none", + "rollup": false + }, + "dataSource": "nested_data_example", + "timestampSpec": { + "column": "time", + "format": "auto" + }, + "dimensionsSpec": { + "dimensions": [ + "product", + "department", + { + "type": "json", + "name": "shipTo" + }, + { + "type": "json", + "name": "details" + } + ] + }, + "transformSpec": {} + }, + "tuningConfig": { + "type": "index_parallel", + "partitionsSpec": { + "type": "dynamic" + } + } + } +} +``` + +### Transform data during batch ingestion + +You can use the [JSON nested columns 
functions](../misc/math-expr.md#nested-columns-functions) to transform JSON data and reference the transformed data in your ingestion spec. + +To do this, include a `transforms` list in the `transformSpec` property of your ingestion spec. + +For example, the following ingestion spec extracts `firstName`, `lastName`, and `address` from `shipTo` and creates a composite JSON object containing `product`, `details`, and `department`. + +```json +{ + "type": "index_parallel", + "spec": { + "ioConfig": { + "type": "index_parallel", + "inputSource": { + "type": "http", + "uris": [ + "https://static.imply.io/data/nested_example_data.json" + ] + }, + "inputFormat": { + "type": "json" + } + }, + "dataSchema": { + "granularitySpec": { + "segmentGranularity": "day", + "queryGranularity": "none", + "rollup": false + }, + "dataSource": "nested_data_transform_example", + "timestampSpec": { + "column": "time", + "format": "auto" + }, + "dimensionsSpec": { + "dimensions": [ + "firstName", + "lastName", + { + "type": "json", + "name": "address" + }, + { + "type": "json", + "name": "productDetails" + } + ] + }, + "transformSpec": { + "transforms":[ + { "type":"expression", "name":"firstName", "expression":"json_value(shipTo, '$.firstName')"}, + { "type":"expression", "name":"lastName", "expression":"json_value(shipTo, '$.lastName')"}, + { "type":"expression", "name":"address", "expression":"json_query(shipTo, '$.address')"}, + { "type":"expression", "name":"productDetails", "expression":"json_object('product', product, 'details', details, 'department', department)"} + ] + } + }, + "tuningConfig": { + "type": "index_parallel", + "partitionsSpec": { + "type": "dynamic" + } + } + } +} +``` + +## SQL-based ingestion + +To ingest nested data using the multi-stage query architecture, specify `COMPLEX<json>` as the column `type` when you define the row signature—`shipTo` and `details` in the following example ingestion spec: + + + +```sql +REPLACE INTO msq_nested_data_example OVERWRITE ALL +SELECT + 
TIME_PARSE("time") as __time, + product, + department, + shipTo, + details +FROM ( + SELECT * FROM + TABLE( + EXTERN( + '{"type":"http","uris":["https://static.imply.io/data/nested_example_data.json"]}', + '{"type":"json"}', + '[{"name":"time","type":"string"},{"name":"product","type":"string"},{"name":"department","type":"string"},{"name":"shipTo","type":"COMPLEX<json>"},{"name":"details","type":"COMPLEX<json>"}]' + ) + ) +) +PARTITIONED BY ALL +``` + +### Transform data during SQL-based ingestion + +You can use the [JSON nested columns functions](./sql-json-functions.md) to transform JSON data in your ingestion query. + +For example, the following ingestion query is the SQL-based version of the [batch example above](#transform-data-during-batch-ingestion)—it extracts `firstName`, `lastName`, and `address` from `shipTo` and creates a composite JSON object containing `product`, `details`, and `department`. + + + +```sql +REPLACE INTO msq_nested_data_transform_example OVERWRITE ALL +SELECT + TIME_PARSE("time") as __time, + JSON_VALUE(shipTo, '$.firstName') as firstName, + JSON_VALUE(shipTo, '$.lastName') as lastName, + JSON_QUERY(shipTo, '$.address') as address, + JSON_OBJECT('product':product,'details':details, 'department':department) as productDetails +FROM ( + SELECT * FROM + TABLE( + EXTERN( + '{"type":"http","uris":["https://static.imply.io/data/nested_example_data.json"]}', + '{"type":"json"}', + '[{"name":"time","type":"string"},{"name":"product","type":"string"},{"name":"department","type":"string"},{"name":"shipTo","type":"COMPLEX<json>"},{"name":"details","type":"COMPLEX<json>"}]' + ) + ) +) +PARTITIONED BY ALL +``` + +## Querying nested columns + +Once ingested, Druid stores the JSON-typed columns as native JSON objects and presents them as `COMPLEX<json>`. + +See the [Nested columns functions reference](./sql-json-functions.md) for information on the functions in the examples below. 
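To build intuition for how `JSON_VALUE`-style path extraction resolves against a row, here is a minimal Python sketch. The `json_value` helper is hypothetical and handles only simple `$.key1.key2` paths for illustration; it is not Druid's implementation.

```python
import json

def json_value(obj, path):
    # Hypothetical helper: resolves only simple '$.key1.key2' paths,
    # not the full JSONPath subset that Druid supports.
    assert path.startswith("$.")
    for key in path[2:].split("."):
        obj = obj.get(key) if isinstance(obj, dict) else None
    return obj

# A trimmed-down version of the sample row from this page.
row = json.loads("""
{"shipTo": {"firstName": "Sandra",
            "address": {"city": "Loischester", "state": "FL"}}}
""")
print(json_value(row, "$.shipTo.firstName"))     # Sandra
print(json_value(row, "$.shipTo.address.city"))  # Loischester
print(json_value(row, "$.shipTo.missing"))       # None
```

As in Druid, a path that points to a missing field yields a null rather than an error.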
+ +Druid supports a small, simplified subset of the [JSONPath syntax](https://github.com/json-path/JsonPath/blob/master/README.md) operators, primarily limited to extracting individual values from nested data structures. See the [SQL JSON functions](./sql-json-functions.md#jsonpath-syntax) page for details. + +### Displaying data types + +The following example illustrates how you can display the data types for your columns. Note that `details` and `shipTo` display as `COMPLEX<json>`. + +#### Example query: Display data types + + + +```sql +SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE +FROM INFORMATION_SCHEMA.COLUMNS +WHERE TABLE_NAME = 'nested_data_example' +``` + +Example query results: + +```json Review Comment: Torn here. I think that the reader may want to see the output the way it is delivered. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: [email protected] For queries about this service, please contact Infrastructure at: [email protected]
