GaneshPatil7517 opened a new pull request, #2557: URL: https://github.com/apache/iggy/pull/2557
## Summary

Implements Issue #2540: a Redshift sink connector with S3 staging support.

## Features

- ✅ S3 staging with automatic CSV file upload
- ✅ Redshift `COPY` command execution via the PostgreSQL wire protocol
- ✅ IAM role authentication (recommended) or access key credentials
- ✅ Configurable batch size and compression (gzip, lzop, bzip2, zstd)
- ✅ Automatic table creation with a customizable schema
- ✅ Retry logic with exponential backoff for transient failures
- ✅ Automatic cleanup of staged S3 files

## Configuration Options

| Option | Description | Default |
|--------|-------------|---------|
| `connection_string` | Redshift cluster connection URL | Required |
| `target_table` | Destination table name | Required |
| `iam_role` | IAM role ARN for S3 access | Optional* |
| `s3_bucket` | S3 bucket for staging | Required |
| `s3_region` | AWS region | Required |
| `batch_size` | Messages per batch | 10000 |
| `compression` | `COPY` compression format | None |
| `delete_staged_files` | Auto-cleanup toggle | true |
| `auto_create_table` | Create table if missing | true |

\* Either `iam_role` or the `aws_access_key_id` + `aws_secret_access_key` pair is required.

## Testing

- 14 unit tests passing
- Clippy clean with `-D warnings`

Closes #2540

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at: [email protected]
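To make the option table concrete, here is a hypothetical configuration sketch. The option names come from the table; the section name, file layout, and all values are illustrative and may differ from the actual iggy connector runtime format:

```toml
# Illustrative sink configuration; section name and values are assumptions.
[sink.redshift]
connection_string = "postgres://user:pass@my-cluster.redshift.amazonaws.com:5439/dev"
target_table = "events"
iam_role = "arn:aws:iam::123456789012:role/redshift-copy"  # or access key pair instead
s3_bucket = "my-staging-bucket"
s3_region = "us-east-1"
batch_size = 10000
compression = "gzip"          # one of: gzip, lzop, bzip2, zstd
delete_staged_files = true    # remove staged S3 objects after a successful COPY
auto_create_table = true      # create target_table if it does not exist
```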

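As a rough sketch of the two mechanisms named in the feature list, the snippet below shows (a) assembling a Redshift `COPY` statement from the staged S3 path, credentials, and compression option, and (b) computing capped exponential-backoff delays for transient failures. All function names, the SQL clause layout, and the backoff parameters are illustrative assumptions, not the connector's actual code:

```rust
// Hypothetical sketch of the COPY statement the connector could issue
// after staging a CSV batch in S3. Names and clause order are illustrative.
fn build_copy_sql(
    table: &str,
    s3_path: &str,
    region: &str,
    iam_role: Option<&str>,
    compression: Option<&str>,
) -> String {
    // IAM role is the recommended auth path; access keys are the alternative.
    let creds = match iam_role {
        Some(arn) => format!("IAM_ROLE '{}'", arn),
        None => String::from("ACCESS_KEY_ID '...' SECRET_ACCESS_KEY '...'"),
    };
    let mut sql = format!(
        "COPY {} FROM '{}' {} REGION '{}' FORMAT AS CSV",
        table, s3_path, creds, region
    );
    // Compression maps to a COPY keyword (GZIP, LZOP, BZIP2, ZSTD).
    if let Some(c) = compression {
        sql.push(' ');
        sql.push_str(&c.to_uppercase());
    }
    sql
}

// Capped exponential backoff: base * 2^attempt, never exceeding cap.
fn backoff_ms(attempt: u32, base_ms: u64, cap_ms: u64) -> u64 {
    base_ms.saturating_mul(1u64 << attempt.min(20)).min(cap_ms)
}

fn main() {
    let sql = build_copy_sql(
        "events",
        "s3://my-staging-bucket/batches/batch-0001.csv.gz",
        "us-east-1",
        Some("arn:aws:iam::123456789012:role/redshift-copy"),
        Some("gzip"),
    );
    println!("{}", sql);
    assert!(sql.starts_with("COPY events FROM"));
    assert!(sql.ends_with("GZIP"));

    // Delays grow 100 -> 200 -> 400 -> 800 ms, then saturate at the cap.
    assert_eq!(backoff_ms(0, 100, 30_000), 100);
    assert_eq!(backoff_ms(3, 100, 30_000), 800);
    assert_eq!(backoff_ms(20, 100, 30_000), 30_000);
}
```

Keeping statement construction as a pure function like this is what makes the "14 unit tests passing" claim cheap to sustain: the SQL and backoff logic can be tested without a live cluster.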