skuldshao commented on code in PR #18027:
URL: https://github.com/apache/druid/pull/18027#discussion_r2136619777
##########
docs/multi-stage-query/reference.md:
##########

```diff
@@ -111,14 +111,15 @@ s3://export-bucket/export/query-6564a32f-2194-423a-912e-eead470a37c4-worker0-par

 Keep the following in mind when using EXTERN to export rows:

 - Only INSERT statements are supported.
 - Only `CSV` format is supported as an export format.
-- Partitioning (`PARTITIONED BY`) and clustering (`CLUSTERED BY`) aren't supported with EXTERN statements.
+- Partitioning (PARTITIONED BY) and clustering (CLUSTERED BY) aren't supported with EXTERN statements.
 - You can export to Amazon S3, Google GCS, or local storage.
 - The destination provided should contain no other files or directories.

-When you export data, use the `rowsPerPage` context parameter to restrict the size of exported files.
-When the number of rows in the result set exceeds the value of the parameter, Druid splits the output into multiple files.
+When you export data, use SET to restrict `rowsPerPage` to control the size of exported files. For example:
```

Review Comment:
   Updated!
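For context, a sketch of the kind of export query the changed paragraph describes, combining a `SET` statement for `rowsPerPage` with an `INSERT INTO EXTERN` export. The bucket name, prefix, and the value `1000000` are illustrative placeholders, not values from the diff; the actual example in the docs may differ.

```sql
-- Hypothetical export query: cap each exported CSV file at roughly
-- one million rows by setting rowsPerPage before the INSERT.
SET rowsPerPage = 1000000;

INSERT INTO
  EXTERN(S3(bucket => 'export-bucket', prefix => 'export'))
AS CSV
SELECT *
FROM my_datasource;
```

When the result set exceeds `rowsPerPage` rows, Druid splits the output across multiple files under the given destination.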
