[GitHub] [flink-connector-aws] hlteoh37 commented on a diff in pull request #26: [FLINK-25859][Connectors/DynamoDB][docs] DynamoDB sink documentation

2022-11-28 Thread GitBox


hlteoh37 commented on code in PR #26:
URL: 
https://github.com/apache/flink-connector-aws/pull/26#discussion_r1033621971


##
docs/content.zh/docs/connectors/datastream/dynamodb.md:
##
@@ -0,0 +1,171 @@
+---
+title: DynamoDB
+weight: 5
+type: docs
+aliases:
+- /zh/dev/connectors/dynamodb.html
+---
+
+
+# Amazon DynamoDB Sink
+
+The DynamoDB sink writes to [Amazon DynamoDB](https://aws.amazon.com/dynamodb) 
using the [AWS v2 SDK for 
Java](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/home.html).
 Follow the instructions from the [Amazon DynamoDB Developer 
Guide](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/getting-started-step-1.html)
+to setup a table.
+
+To use the connector, add the following Maven dependency to your project:
+
+{{< connector_artifact flink-connector-dynamodb 3.0.0 >}}
+
+{{< tabs "ec24a4ae-6a47-11ed-a1eb-0242ac120002" >}}
+{{< tab "Java" >}}
+```java
+Properties sinkProperties = new Properties();
+// Required
+sinkProperties.put(AWSConfigConstants.AWS_REGION, "eu-west-1");
+// Optional, provide via alternative routes e.g. environment variables
+sinkProperties.put(AWSConfigConstants.AWS_ACCESS_KEY_ID, "aws_access_key_id");
+sinkProperties.put(AWSConfigConstants.AWS_SECRET_ACCESS_KEY, "aws_secret_access_key");
+
+ElementConverter<InputType, DynamoDbWriteRequest> elementConverter = new CustomElementConverter();
+
+DynamoDbSink<InputType> dynamoDbSink =
+    DynamoDbSink.<InputType>builder()
+        .setDynamoDbProperties(sinkProperties)              // Required
+        .setTableName("my-dynamodb-table")                  // Required
+        .setElementConverter(elementConverter)              // Required
+        .setOverwriteByPartitionKeys(singletonList("key"))  // Optional
+        .setFailOnError(false)                              // Optional
+        .setMaxBatchSize(25)                                // Optional
+        .setMaxInFlightRequests(50)                         // Optional
+        .setMaxBufferedRequests(10_000)                     // Optional
+        .setMaxTimeInBufferMS(5000)                         // Optional
+        .build();
+
+flinkStream.sinkTo(dynamoDbSink);
+```
+{{< /tab >}}
+{{< tab "Scala" >}}
+```scala
+val sinkProperties = new Properties()
+// Required
+sinkProperties.put(AWSConfigConstants.AWS_REGION, "eu-west-1")
+// Optional, provide via alternative routes e.g. environment variables
+sinkProperties.put(AWSConfigConstants.AWS_ACCESS_KEY_ID, "aws_access_key_id")
+sinkProperties.put(AWSConfigConstants.AWS_SECRET_ACCESS_KEY, "aws_secret_access_key")
+
+val elementConverter = new CustomElementConverter()
+
+val dynamoDbSink =
+    DynamoDbSink.builder()
+        .setDynamoDbProperties(sinkProperties)              // Required
+        .setTableName("my-dynamodb-table")                  // Required
+        .setElementConverter(elementConverter)              // Required
+        .setOverwriteByPartitionKeys(singletonList("key"))  // Optional
+        .setFailOnError(false)                              // Optional
+        .setMaxBatchSize(25)                                // Optional
+        .setMaxInFlightRequests(50)                         // Optional
+        .setMaxBufferedRequests(10_000)                     // Optional
+        .setMaxTimeInBufferMS(5000)                         // Optional
+        .build()
+
+flinkStream.sinkTo(dynamoDbSink)
+```
+{{< /tab >}}
+{{< /tabs >}}
+
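The Java and Scala tabs above reference a `CustomElementConverter` that the diff does not show. As a hedged illustration only — the class name comes from the snippets, while the single `your-key` attribute is a hypothetical partition key — such a converter could implement the async sink's `ElementConverter` along these lines:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.flink.api.connector.sink2.SinkWriter;
import org.apache.flink.connector.base.sink.writer.ElementConverter;
import org.apache.flink.connector.dynamodb.sink.DynamoDbWriteRequest;
import org.apache.flink.connector.dynamodb.sink.DynamoDbWriteRequestType;

import software.amazon.awssdk.services.dynamodb.model.AttributeValue;

/** Hypothetical converter mapping each String element to a DynamoDB PUT request. */
public class CustomElementConverter implements ElementConverter<String, DynamoDbWriteRequest> {

    @Override
    public DynamoDbWriteRequest apply(String element, SinkWriter.Context context) {
        // "your-key" stands in for the table's partition key attribute.
        Map<String, AttributeValue> item = new HashMap<>();
        item.put("your-key", AttributeValue.builder().s(element).build());

        return DynamoDbWriteRequest.builder()
                .setType(DynamoDbWriteRequestType.PUT)
                .setItem(item)
                .build();
    }
}
```

With a converter like this one, `InputType` in the builder snippets above would be `String`.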
+## Configurations
+
+Flink's DynamoDB sink is created by using the static builder `DynamoDbSink.builder()`.
+
+1. __setDynamoDbProperties(Properties sinkProperties)__
+* Required.
+* Supplies credentials, region and other parameters to the DynamoDB client.
+2. __setTableName(String tableName)__
+* Required.
+* Name of the table to sink to.
+3. __setElementConverter(ElementConverter<InputType, DynamoDbWriteRequest> elementConverter)__
+* Required.
+* Converts generic records of type `InputType` to `DynamoDbWriteRequest` (see the sketch above).
+4. _setOverwriteByPartitionKeys(List<String> partitionKeys)_
+* Optional. Default: [].
+* Used to deduplicate write requests within each batch pushed to DynamoDB (see the sketch below).
+5. _setFailOnError(boolean failOnError)_
+* Optional. Default: `false`.
+* Whether failed requests to write records are treated as fatal exceptions in the sink.
+6. _setMaxBatchSize(int maxBatchSize)_
+* Optional. Default: `500`.

Review Comment:
   This defaults to `25`


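As a side note on item 4 of the configuration list above: deduplication by partition keys matters because DynamoDB's BatchWriteItem API rejects batches containing more than one operation on the same item. A hedged sketch, reusing `sinkProperties` and `elementConverter` from the Java tab and assuming a hypothetical composite key of `user_id` and `item_id`:

```java
// Deduplicate on both key attributes so a single batch never carries
// two writes for the same (user_id, item_id) item.
DynamoDbSink<InputType> dedupedSink =
    DynamoDbSink.<InputType>builder()
        .setDynamoDbProperties(sinkProperties)
        .setTableName("my-dynamodb-table")
        .setElementConverter(elementConverter)
        .setOverwriteByPartitionKeys(java.util.Arrays.asList("user_id", "item_id"))
        .build();
```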




[GitHub] [flink-connector-aws] hlteoh37 commented on a diff in pull request #26: [FLINK-25859][Connectors/DynamoDB][docs] DynamoDB sink documentation

2022-11-23 Thread GitBox


hlteoh37 commented on code in PR #26:
URL: 
https://github.com/apache/flink-connector-aws/pull/26#discussion_r1030171950


##
docs/content.zh/docs/connectors/table/dynamodb.md:
##
@@ -0,0 +1,284 @@
+---
+title: DynamoDB
+weight: 5
+type: docs
+aliases:
+- /dev/table/connectors/dynamodb.html
+---
+
+
+
+# Amazon DynamoDB SQL Connector
+
+{{< label "Sink: Streaming Append & Upsert Mode" >}}
+
+The DynamoDB connector allows for writing data into [Amazon DynamoDB](https://aws.amazon.com/dynamodb).
+
+Dependencies
+------------
+
+{{< sql_download_table "dynamodb" >}}
+
+How to create a DynamoDB table
+------------------------------
+
+Follow the instructions from the [Amazon DynamoDB Developer Guide](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/getting-started-step-1.html)
+to set up a DynamoDB table. The following example shows how to create a Flink table backed by a DynamoDB table with the minimum required options:
+
+```sql
+CREATE TABLE DynamoDbTable (
+  `user_id` BIGINT,
+  `item_id` BIGINT,
+  `category_id` BIGINT,
+  `behavior` STRING
+)
+WITH (
+  'connector' = 'dynamodb',
+  'table-name' = 'user_behavior',
+  'aws.region' = 'us-east-2'
+);
+```
+
+Connector Options
+-----------------
+
+**Common Options**
+
+| Option                     | Required | Default | Type    | Description                                                    |
+|----------------------------|----------|---------|---------|----------------------------------------------------------------|
+| connector                  | required | (none)  | String  | Specify what connector to use. For DynamoDB use `'dynamodb'`.  |
+| table-name                 | required | (none)  | String  | Name of the DynamoDB table to use.                             |
+| aws.region                 | required | (none)  | String  | The AWS region where the DynamoDB table is defined.            |
+| aws.endpoint               | optional | (none)  | String  | The AWS endpoint for DynamoDB.                                 |
+| aws.trust.all.certificates | optional | false   | Boolean | If true accepts all SSL certificates.                          |
+
+**Authentication Options**
+
+| Option                            | Required | Default | Type   | Description                                                                                              |
+|-----------------------------------|----------|---------|--------|----------------------------------------------------------------------------------------------------------|
+| aws.credentials.provider          | optional | AUTO    | String | A credentials provider to use when authenticating against the DynamoDB endpoint. See Authentication for details. |
+| aws.credentials.basic.accesskeyid | optional | (none)  | String | The AWS access key ID to use when setting credentials provider type to BASIC.                            |
+| aws.credentials.basic.secretkey   | optional | (none)  | String | The AWS secret key to use when setting credentials provider type to BASIC.                               |
+| aws.credentials.profile.path      | optional | (none)  | String | Optional configuration for profile path if credential provider type is set to be PROFILE.                |
+| aws.credentials.profile.name      | optional | (none)  | String | Optional configuration for profile name if credential provider type is set to be PROFILE.                |
+| aws.credentials.role.arn          | optional | (none)  | String | The role ARN to use when credential provider type is set to ASSUME_ROLE or WEB_IDENTITY_TOKEN.           |
+| aws.credentials.role.sessionName  | optional | (none)  | String | The role session name to use when credential provider type is set to ASSUME_ROLE or WEB_IDENTITY_TOKEN.  |
+| aws.credentials.role.externalId   | optional | (none)  | String | The external ID to use when credential provider type is set to ASSUME_ROLE.                               |

Review Comment:
   we're missing `aws.credentials.role.stsEndpoint` here


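To make the assume-role path concrete — a hedged sketch where the role ARN, session name, and endpoint values are placeholders, and `aws.credentials.role.stsEndpoint` is the option the review comment above notes is missing from the table:

```sql
CREATE TABLE DynamoDbTable (
  `user_id` BIGINT,
  `item_id` BIGINT,
  `category_id` BIGINT,
  `behavior` STRING
)
WITH (
  'connector' = 'dynamodb',
  'table-name' = 'user_behavior',
  'aws.region' = 'us-east-2',
  'aws.credentials.provider' = 'ASSUME_ROLE',
  'aws.credentials.role.arn' = 'arn:aws:iam::123456789012:role/example-writer-role',
  'aws.credentials.role.sessionName' = 'flink-dynamodb-session',
  'aws.credentials.role.stsEndpoint' = 'https://sts.us-east-2.amazonaws.com'
);
```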

[GitHub] [flink-connector-aws] hlteoh37 commented on a diff in pull request #26: [FLINK-25859][Connectors/DynamoDB][docs] DynamoDB sink documentation

2022-11-22 Thread GitBox


hlteoh37 commented on code in PR #26:
URL: 
https://github.com/apache/flink-connector-aws/pull/26#discussion_r1029864477


##
docs/content.zh/docs/connectors/datastream/dynamodb.md:
##
@@ -0,0 +1,170 @@
+---
+title: DynamoDB
+weight: 5
+type: docs
+---
+
+
+# Amazon DynamoDB Sink
+
+The DynamoDB sink writes to [Amazon DynamoDB](https://aws.amazon.com/dynamodb) using the [AWS v2 SDK for Java](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/home.html).
+
+Follow the instructions from the [Amazon DynamoDB Developer Guide](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/getting-started-step-1.html)
+to set up a table.
+
+To use the connector, add the following Maven dependency to your project:
+
+{{< artifact flink-connector-dynamodb >}}

Review Comment:
   I meant more the connector - Flink version mapping!
   
   e.g.
   `1.0.0_1.15` or `1.1.0_1.15`. How will we know whether to include `1.1.0` or `1.0.0`?


