nicusX commented on code in PR #179: URL: https://github.com/apache/flink-connector-aws/pull/179#discussion_r1831393631
########## docs/content/docs/connectors/datastream/dynamodb.md: ##########
@@ -23,16 +23,140 @@
 KIND, either express or implied. See the License for the
 specific language governing permissions and limitations
 under the License.
 -->
+# Amazon DynamoDB Connector
+The DynamoDB connector allows users to read from and write to [Amazon DynamoDB](https://aws.amazon.com/dynamodb/).
-# Amazon DynamoDB Sink
+To read from DynamoDB, the connector allows users to read from the change data capture stream in [Amazon DynamoDB Streams](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Streams.html).
-The DynamoDB sink writes to [Amazon DynamoDB](https://aws.amazon.com/dynamodb) using the [AWS v2 SDK for Java](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/home.html). Follow the instructions from the [Amazon DynamoDB Developer Guide](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/getting-started-step-1.html)
-to setup a table.
+To write to DynamoDB, the connector allows users to write directly to Amazon DynamoDB using the [BatchWriteItem API](https://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_BatchWriteItem.html).
+
+## Dependency
+
+Apache Flink ships the connector for users to utilize. To use the connector, add the following Maven dependency to your project:
+
+{{< connector_artifact flink-connector-dynamodb dynamodb >}}
+
+## Amazon DynamoDB Streams Source
+
+The DynamoDB Streams source reads from [Amazon DynamoDB Streams](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Streams.html) using the [AWS v2 SDK for Java](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/home.html).
+Follow the instructions from the [AWS docs](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Streams.html) to set up and configure the change data capture stream.
+
+### Usage
+
+The `DynamoDbStreamsSource` provides a fluent builder to construct an instance of the `DynamoDbStreamsSource`.
+The code snippet below illustrates how to do so.
+
+{{< tabs "ec24a4ae-6a47-11ed-a1eb-0242ac120002" >}}
+{{< tab "Java" >}}
+```java
+// Configure the DynamodbStreamsSource
+Configuration sourceConfig = new Configuration();
+sourceConfig.set(DynamodbStreamsSourceConfigConstants.STREAM_INITIAL_POSITION, DynamodbStreamsSourceConfigConstants.InitialPosition.TRIM_HORIZON); // This is optional; by default the connector will read from LATEST
+
+// Create a new DynamoDbStreamsSource to read from the specified DynamoDB Stream.
+DynamoDbStreamsSource<String> dynamoDbStreamsSource =
+        DynamoDbStreamsSource.<String>builder()
+                .setStreamArn("arn:aws:dynamodb:us-east-1:1231231230:table/test/stream/2024-04-11T07:14:19.380")
+                .setSourceConfig(sourceConfig)
+                // Users must implement their own deserialization schema to translate change data capture events into custom data types
+                .setDeserializationSchema(dynamodbDeserializationSchema)
+                .build();
+
+StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
+
+// Specify the watermarking strategy and the name of the DynamoDB Streams source operator.
+// Specify the return type using TypeInformation.
+// Also specify the UID of the operator, in line with Flink best practices.
+DataStream<String> cdcEventsWithEventTimeWatermarks = env.fromSource(dynamoDbStreamsSource, WatermarkStrategy.<String>forMonotonousTimestamps().withIdleness(Duration.ofSeconds(1)), "DynamoDB Streams source")
+        .returns(TypeInformation.of(String.class))
+        .uid("custom-uid");
+```
+{{< /tab >}}
+{{< tab "Scala" >}}
+```scala
+// Configure the DynamodbStreamsSource
+val sourceConfig = new Configuration()
+sourceConfig.set(DynamodbStreamsSourceConfigConstants.STREAM_INITIAL_POSITION, DynamodbStreamsSourceConfigConstants.InitialPosition.TRIM_HORIZON) // This is optional; by default the connector will read from LATEST
+
+// Create a new DynamoDbStreamsSource to read from the specified DynamoDB Stream.
+val dynamoDbStreamsSource = DynamoDbStreamsSource.builder[String]()
+  .setStreamArn("arn:aws:dynamodb:us-east-1:1231231230:table/test/stream/2024-04-11T07:14:19.380")
+  .setSourceConfig(sourceConfig)
+  // Users must implement their own deserialization schema to translate change data capture events into custom data types
+  .setDeserializationSchema(dynamodbDeserializationSchema)
+  .build()
+
+val env = StreamExecutionEnvironment.getExecutionEnvironment()
+
+// Specify the watermarking strategy and the name of the DynamoDB Streams source operator.
+// Also specify the UID of the operator, in line with Flink best practices.
+val cdcEventsWithEventTimeWatermarks = env.fromSource(dynamoDbStreamsSource, WatermarkStrategy.forMonotonousTimestamps[String]().withIdleness(Duration.ofSeconds(1)), "DynamoDB Streams source")
+  .uid("custom-uid")
+```
+{{< /tab >}}
+{{< /tabs >}}
+
+The above is a simple example of using the `DynamoDbStreamsSource`.
+- The DynamoDB Stream being read from is specified using the stream ARN.
+- Configuration for the `Source` is supplied using an instance of Flink's `Configuration` class.
+  The configuration keys can be taken from `AWSConfigOptions` (AWS-specific configuration) and `DynamodbStreamsSourceConfigConstants` (DynamoDB Streams Source configuration).
+- The example specifies the starting position as `TRIM_HORIZON` (see [Configuring Starting Position](#configuring-starting-position) for more information).
+- The deserialization format is `SimpleStringSchema` (see [Deserialization Schema](#deserialization-schema) for more information).
+- The distribution of shards across subtasks is controlled using the `UniformShardAssigner` (see [Shard Assignment Strategy](#shard-assignment-strategy) for more information).
+- The example also specifies an increasing `WatermarkStrategy`, which means each record will be tagged with the event time specified in `approximateCreationDateTime`.
+  Monotonically increasing watermarks will be generated, and subtasks will be considered idle if no record is emitted for 1 second.
+
+### Configuring Starting Position
+
+To specify the starting position of the `DynamodbStreamsSource`, users can set `DynamodbStreamsSourceConfigConstants.STREAM_INITIAL_POSITION` in the configuration.
+- `LATEST`: read all shards of the stream starting from the latest record.
+- `TRIM_HORIZON`: read all shards of the stream starting from the earliest record possible (data is trimmed by DynamoDB after 24 hours).

Review Comment:
   Does the user have any way of getting a "snapshot" of the entire DDB table? This will be required for many use cases, where the application keeps a copy of the DDB table in state and updates it by streaming in the changes. This is a common pattern when doing enrichment, for example.
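The pattern the review comment describes (keeping a copy of the table and updating it by streaming in the changes) can be sketched outside Flink as applying each change event to an in-memory map. The sketch below is illustrative only: `TableCopy`, `EventType`, and `applyCdcEvent` are hypothetical names, and a real Flink job would hold the copy in keyed state rather than a `HashMap`. The event names `INSERT`, `MODIFY`, and `REMOVE` match the event types DynamoDB Streams emits.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of the enrichment pattern mentioned in the review:
// keep a local copy of the table and update it by streaming in changes.
// TableCopy/applyCdcEvent are hypothetical; a real Flink job would keep
// the copy in keyed state and apply each CDC event as it arrives.
public class TableCopy {
    enum EventType { INSERT, MODIFY, REMOVE }

    private final Map<String, String> copy = new HashMap<>();

    void applyCdcEvent(EventType type, String key, String newValue) {
        switch (type) {
            case INSERT, MODIFY -> copy.put(key, newValue);   // upsert the new image
            case REMOVE -> copy.remove(key);                  // drop the deleted item
        }
    }

    int size() {
        return copy.size();
    }

    public static void main(String[] args) {
        TableCopy t = new TableCopy();
        t.applyCdcEvent(EventType.INSERT, "user-1", "alice");
        t.applyCdcEvent(EventType.MODIFY, "user-1", "alice-updated");
        t.applyCdcEvent(EventType.REMOVE, "user-1", null);
        System.out.println(t.size()); // prints 0
    }
}
```

Note that replaying from `TRIM_HORIZON` only recovers the last 24 hours of changes, which is why the reviewer's question about an initial snapshot of the table matters for this pattern.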

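The usage example in the diff leaves `dynamodbDeserializationSchema` up to the user. The translation step such a schema performs can be sketched in plain Java; the `OrderEvent` type and the string-valued attribute map below are assumptions made to keep the sketch self-contained (the real connector hands the schema an SDK record whose attributes are `AttributeValue` objects, not plain strings).

```java
import java.util.Map;

// Plain-Java sketch of the mapping a custom deserialization schema performs:
// translating a change data capture record's new image into a domain type.
// OrderEvent and the Map<String, String> image are hypothetical stand-ins
// for the SDK's Record/AttributeValue types.
public class CdcEventMapper {

    // Hypothetical domain type the CDC event is translated into.
    public record OrderEvent(String orderId, long quantity) {}

    public static OrderEvent fromNewImage(Map<String, String> newImage) {
        // A real schema would unwrap AttributeValue fields here.
        return new OrderEvent(
                newImage.get("orderId"),
                Long.parseLong(newImage.get("quantity")));
    }

    public static void main(String[] args) {
        OrderEvent e = fromNewImage(Map.of("orderId", "o-1", "quantity", "3"));
        System.out.println(e); // prints OrderEvent[orderId=o-1, quantity=3]
    }
}
```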