[ https://issues.apache.org/jira/browse/FLINK-13111?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jack Tuck updated FLINK-13111:
------------------------------
    Description: 
So far I have followed the instructions documented for Flink's kinesis 
connector to use a local Kinesis. (Using Flink 1.8 and Kinesis connector 1.8)

[https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kinesis.html#using-non-aws-kinesis-endpoints-for-testing]
{code:java}
Properties producerConfig = new Properties();
producerConfig.put(AWSConfigConstants.AWS_REGION, "us-east-1");
producerConfig.put(AWSConfigConstants.AWS_ACCESS_KEY_ID, "aws_access_key_id");
producerConfig.put(AWSConfigConstants.AWS_SECRET_ACCESS_KEY, "aws_secret_access_key");
producerConfig.put(AWSConfigConstants.AWS_ENDPOINT, "http://localhost:4567");
{code}

With a Flink producer, these instructions work and I can consume from the local Kinesis (I use Kinesalite).

However, with a Flink consumer, I get an exception that {{aws.region}} and {{aws.endpoint}} are not *both* allowed:
{code}
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: For FlinkKinesisConsumer either AWS region ('aws.region') or AWS endpoint ('aws.endpoint') must be set in the config.
{code}
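For reference, the consumer side of my job looks roughly like the sketch below (the stream name, schema, and environment setup are placeholders rather than my exact code); it is the combination of {{aws.region}} and {{aws.endpoint}} in the consumer properties that triggers the validation error:
{code:java}
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

Properties consumerConfig = new Properties();
consumerConfig.put(AWSConfigConstants.AWS_REGION, "us-east-1");
consumerConfig.put(AWSConfigConstants.AWS_ACCESS_KEY_ID, "aws_access_key_id");
consumerConfig.put(AWSConfigConstants.AWS_SECRET_ACCESS_KEY, "aws_secret_access_key");
consumerConfig.put(AWSConfigConstants.AWS_ENDPOINT, "http://localhost:4567");

// Constructing the consumer with both properties set is enough to hit the check;
// the exception is thrown before anything is read from the stream.
DataStream<String> stream = env.addSource(
    new FlinkKinesisConsumer<>("test-stream", new SimpleStringSchema(), consumerConfig));
{code}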
Is this a bug in the connector? I found the PR which fixed this error for the producer but not the consumer: [https://github.com/apache/flink/pull/6045].

I found a [workaround on Flink's mailing list|http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Can-t-get-the-FlinkKinesisProducer-to-work-against-Kinesalite-for-tests-td23438.html], but their issue is with the producer rather than the consumer. Perhaps they got that the wrong way around; it is, after all, weird that there are two code paths for the consumer and the producer.
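Given the "either ... or" wording of the validation message, a workaround sketch that should at least get past this particular check (untested beyond that, and assuming only one of the two properties is allowed) is to drop {{aws.region}} and keep only {{aws.endpoint}} on the consumer side:
{code:java}
Properties consumerConfig = new Properties();
// Deliberately omit AWS_REGION so that exactly one of region/endpoint is set,
// which is what the consumer validation appears to require.
consumerConfig.put(AWSConfigConstants.AWS_ACCESS_KEY_ID, "aws_access_key_id");
consumerConfig.put(AWSConfigConstants.AWS_SECRET_ACCESS_KEY, "aws_secret_access_key");
consumerConfig.put(AWSConfigConstants.AWS_ENDPOINT, "http://localhost:4567");
{code}
Whether the consumer then actually connects to Kinesalite with an endpoint-only configuration is a separate question; this only addresses the validation error above.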

  was:
So far I have followed the instructions documented for Flink's kinesis 
connector to use a local Kinesis. (Using Flink 1.8 and Kinesis connector 1.8)

[https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kinesis.html#using-non-aws-kinesis-endpoints-for-testing]
{code:java}
Properties producerConfig = new Properties();
producerConfig.put(AWSConfigConstants.AWS_REGION, "us-east-1");
producerConfig.put(AWSConfigConstants.AWS_ACCESS_KEY_ID, "aws_access_key_id");
producerConfig.put(AWSConfigConstants.AWS_SECRET_ACCESS_KEY, "aws_secret_access_key");
producerConfig.put(AWSConfigConstants.AWS_ENDPOINT, "http://localhost:4567");
{code}
With a Flink producer, these instructions work and I can consume from local 
kinesis (I use Kinesalite).

However, with a Flink consumer, I get an exception that `aws.region` and 
`aws.endpoint` are not *both* allowed.

Is this a bug in the connector? I found the PR which fixed this error for the producer but not the consumer: [https://github.com/apache/flink/pull/6045].

I found a [workaround on Flink's mailing list|http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Can-t-get-the-FlinkKinesisProducer-to-work-against-Kinesalite-for-tests-td23438.html], but their issue is with the producer rather than the consumer. Perhaps they got that the wrong way around; it is, after all, weird that there are two code paths for the consumer and the producer.


> FlinkKinesisConsumer fails with endpoint and region used together
> -----------------------------------------------------------------
>
>                 Key: FLINK-13111
>                 URL: https://issues.apache.org/jira/browse/FLINK-13111
>             Project: Flink
>          Issue Type: Bug
>            Reporter: Jack Tuck
>            Priority: Major
>
> So far I have followed the instructions documented for Flink's kinesis 
> connector to use a local Kinesis. (Using Flink 1.8 and Kinesis connector 1.8)
> [https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kinesis.html#using-non-aws-kinesis-endpoints-for-testing]
> {code:java}
> Properties producerConfig = new Properties();
> producerConfig.put(AWSConfigConstants.AWS_REGION, "us-east-1");
> producerConfig.put(AWSConfigConstants.AWS_ACCESS_KEY_ID, "aws_access_key_id");
> producerConfig.put(AWSConfigConstants.AWS_SECRET_ACCESS_KEY, "aws_secret_access_key");
> producerConfig.put(AWSConfigConstants.AWS_ENDPOINT, "http://localhost:4567");
> {code}
> With a Flink producer, these instructions work and I can consume from local 
> kinesis (I use Kinesalite).
> However, with a Flink consumer, I get an exception that `aws.region` and 
> `aws.endpoint` are not *both* allowed.
> org.apache.flink.client.program.ProgramInvocationException: The main method 
> caused an error: For FlinkKinesisConsumer either AWS region ('aws.region') or 
> AWS endpoint ('aws.endpoint') must be set in the config.
> Is this a bug in the connector? I found the PR which fixed this error for the producer but not the consumer: [https://github.com/apache/flink/pull/6045].
> I found a [workaround on Flink's mailing list|http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Can-t-get-the-FlinkKinesisProducer-to-work-against-Kinesalite-for-tests-td23438.html], but their issue is with the producer rather than the consumer. Perhaps they got that the wrong way around; it is, after all, weird that there are two code paths for the consumer and the producer.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
