AHeise commented on a change in pull request #17019:
URL: https://github.com/apache/flink/pull/17019#discussion_r698545675
##########
File path: flink-connectors/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/sink/KafkaWriter.java
##########
@@ -183,62 +181,83 @@ public void close() throws Exception {
         closer.close();
     }
-    private KafkaWriterState recoverAndInitializeState(List<KafkaWriterState> recoveredStates) {
-        final int subtaskId = kafkaSinkContext.getParallelInstanceId();
-        if (recoveredStates.isEmpty()) {
-            final KafkaWriterState state =
-                    new KafkaWriterState(transactionalIdPrefix, subtaskId, 0);
-            abortTransactions(getTransactionsToAbort(state, new ArrayList<>()));
-            return state;
+    private void abortLingeringTransactions(
+            List<KafkaWriterState> recoveredStates, long startCheckpointId) {
+        List<String> prefixesToAbort = Lists.newArrayList(transactionalIdPrefix);
+
+        if (!recoveredStates.isEmpty()) {
+            KafkaWriterState lastState = recoveredStates.get(0);
+            if (!lastState.getTransactionalIdPrefix().equals(transactionalIdPrefix)) {
+                prefixesToAbort.add(lastState.getTransactionalIdPrefix());
+                LOG.warn(
+                        "Transactional id prefix from previous execution {} has changed to {}.",
+                        lastState.getTransactionalIdPrefix(),
+                        transactionalIdPrefix);
+            }
         }
-        final Map<Integer, KafkaWriterState> taskOffsetMapping =
-                recoveredStates.stream()
-                        .collect(
-                                Collectors.toMap(
-                                        KafkaWriterState::getSubtaskId, Function.identity()));
-        checkState(
-                taskOffsetMapping.containsKey(subtaskId),
-                "Internal error: It is expected that state from previous executions is distributed to the same subtask id.");
-        final KafkaWriterState lastState = taskOffsetMapping.get(subtaskId);
-        taskOffsetMapping.remove(subtaskId);
-        abortTransactions(
-                getTransactionsToAbort(lastState, new ArrayList<>(taskOffsetMapping.values())));
-        if (!lastState.getTransactionalIdPrefix().equals(transactionalIdPrefix)) {
-            LOG.warn(
-                    "Transactional id prefix from previous execution {} has changed to {}.",
-                    lastState.getTransactionalIdPrefix(),
-                    transactionalIdPrefix);
-            return new KafkaWriterState(transactionalIdPrefix, subtaskId, 0);
+
+        final Properties properties = new Properties();
+        properties.putAll(kafkaProducerConfig);
+        properties.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "dummy");
Review comment:
Yes, but we can solve it in a better way where this dummy transactional id is not needed.
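
To make the suggestion concrete, here is a rough sketch of one possible alternative. It is not the final implementation; the `TransactionalIdFactory.buildTransactionalId` helper and a `FlinkKafkaInternalProducer` constructor that accepts the transactional id directly are assumptions about the connector's internals:

```java
// Sketch only: build the concrete transactional.id per prefix/subtask/checkpoint
// and pass it straight to the producer, so no "dummy" placeholder has to be put
// into a copy of the producer Properties. Helper names are illustrative assumptions.
private void abortLingeringTransactions(
        List<KafkaWriterState> recoveredStates, long startCheckpointId) {
    final List<String> prefixesToAbort = new ArrayList<>();
    prefixesToAbort.add(transactionalIdPrefix);
    if (!recoveredStates.isEmpty()
            && !recoveredStates.get(0).getTransactionalIdPrefix().equals(transactionalIdPrefix)) {
        prefixesToAbort.add(recoveredStates.get(0).getTransactionalIdPrefix());
    }

    final int subtaskId = kafkaSinkContext.getParallelInstanceId();
    for (String prefix : prefixesToAbort) {
        // A full implementation would probe checkpoint ids >= startCheckpointId
        // until no lingering transaction is found; only the first id is shown here.
        final String transactionalId =
                TransactionalIdFactory.buildTransactionalId(prefix, subtaskId, startCheckpointId);
        try (FlinkKafkaInternalProducer<byte[], byte[]> producer =
                new FlinkKafkaInternalProducer<>(kafkaProducerConfig, transactionalId)) {
            // initTransactions() with the same transactional.id fences the old
            // producer epoch and aborts its unfinished transaction.
            producer.initTransactions();
        }
    }
}
```

Probing newer checkpoint ids and reusing pooled producers are left out of the sketch for brevity.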
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]