[ https://issues.apache.org/jira/browse/FLINK-21062?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
xiaozilong updated FLINK-21062:
-------------------------------
Description:
The program throws an NPE when using a dynamic index in the Elasticsearch
connector.
The DDL looks like this:
{code:java}
create table bigoflow_logs_output(
  jobName VARCHAR,
  userName VARCHAR,
  proctime TIMESTAMP
) with (
  'connector' = 'elasticsearch-7',
  'hosts' = 'http://127.0.0.1:9400',
  'index' = 'flink2es-{proctime|yyyy-MM-dd}'
);
{code}
The problem may be that `AbstractTimeIndexGenerator#open()`, which creates the
`DateTimeFormatter`, is never called after the `AbstractTimeIndexGenerator` is
instantiated, so the formatter is still null when the first record is processed.
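For illustration, here is a minimal sketch of the suspected lifecycle bug. It is not the actual Flink source; the class name and members are simplified stand-ins for `AbstractTimeIndexGenerator`:
{code:java}
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Simplified stand-in for AbstractTimeIndexGenerator: the formatter is
// only created in open(), so formatting fails if open() is never called.
abstract class TimeIndexGeneratorSketch {
    private final String dateTimeFormat;
    private transient DateTimeFormatter dateTimeFormatter; // null until open()

    TimeIndexGeneratorSketch(String dateTimeFormat) {
        this.dateTimeFormat = dateTimeFormat;
    }

    public void open() {
        this.dateTimeFormatter = DateTimeFormatter.ofPattern(dateTimeFormat);
    }

    protected String format(LocalDateTime time) {
        // If open() was skipped, dateTimeFormatter is still null here and
        // LocalDateTime#format throws "java.lang.NullPointerException: formatter"
        // (from Objects.requireNonNull), matching the stack trace below.
        return time.format(dateTimeFormatter);
    }
}
{code}
If this is indeed the cause, the fix would be to ensure the sink invokes the generator's `open()` once before the first `generate()` call.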
The exception stack is as follows:
{code:java}
java.lang.NullPointerException: formatter
    at java.util.Objects.requireNonNull(Objects.java:228) ~[?:1.8.0_60]
    at java.time.LocalDateTime.format(LocalDateTime.java:1751) ~[?:1.8.0_60]
    at org.apache.flink.streaming.connectors.elasticsearch.table.IndexGeneratorFactory.lambda$createFormatFunction$27972a5d$3(IndexGeneratorFactory.java:161) ~[flink-connector-elasticsearch-base_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.connectors.elasticsearch.table.IndexGeneratorFactory$1.generate(IndexGeneratorFactory.java:118) [flink-connector-elasticsearch-base_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.connectors.elasticsearch.table.RowElasticsearchSinkFunction.processUpsert(RowElasticsearchSinkFunction.java:103) [flink-connector-elasticsearch-base_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.connectors.elasticsearch.table.RowElasticsearchSinkFunction.process(RowElasticsearchSinkFunction.java:79) [flink-connector-elasticsearch-base_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.connectors.elasticsearch.table.RowElasticsearchSinkFunction.process(RowElasticsearchSinkFunction.java:44) [flink-connector-elasticsearch-base_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkBase.invoke(ElasticsearchSinkBase.java:310) [flink-connector-elasticsearch-base_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.table.runtime.operators.sink.SinkOperator.processElement(SinkOperator.java:86) [flink-table-blink_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:717) [flink-dist_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:692) [flink-dist_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:672) [flink-dist_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:52) [flink-dist_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:30) [flink-dist_2.12-1.11.0.jar:1.11.0]
    at StreamExecCalc$14.processElement(Unknown Source) [flink-table-blink_2.12-1.11.0.jar:?]
    at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:717) [flink-dist_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:692) [flink-dist_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:672) [flink-dist_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:52) [flink-dist_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:30) [flink-dist_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:104) [flink-dist_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collectWithTimestamp(StreamSourceContexts.java:111) [flink-dist_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.connectors.kafka.internals.AbstractFetcher.emitRecordsWithTimestamps(AbstractFetcher.java:352) [flink-connector-kafka-base_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.connectors.kafka.internal.KafkaFetcher.partitionConsumerRecordsHandler(KafkaFetcher.java:244) [flink-connector-kafka_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.connectors.kafka.internal.KafkaFetcher.runFetchLoop(KafkaFetcher.java:200) [flink-connector-kafka_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:813) [flink-connector-kafka-base_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:100) [flink-dist_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:63) [flink-dist_2.12-1.11.0.jar:1.11.0]
    at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:201) [flink-dist_2.12-1.11.0.jar:1.11.0]
{code}
> NPE when using a dynamic index in the Elasticsearch connector
> -------------------------------------------------------------
>
> Key: FLINK-21062
> URL: https://issues.apache.org/jira/browse/FLINK-21062
> Project: Flink
> Issue Type: Bug
> Components: Connectors / ElasticSearch
> Affects Versions: 1.12.0
> Reporter: xiaozilong
> Priority: Major
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)