[ https://issues.apache.org/jira/browse/KAFKA-13607?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17808526#comment-17808526 ]

Sergey Ivanov commented on KAFKA-13607:
---------------------------------------

Hello,

We also faced an issue with DefaultSslEngineFactory.java

In our case we use Kafka Connect with a config provider; the connector has the 
following properties:
{code:java}
  "producer.override.ssl.truststore.type" : "${saas:topicconfig:ssl.truststore.type}",
  "producer.override.bootstrap.servers" : "${saas:topicconfig:bootstrap.servers}",
  "producer.override.ssl.truststore.certificates" : "${saas:topicconfig:ssl.truststore.certificates}",
  "producer.override.security.protocol" : "${saas:topicconfig:security.protocol}",
  "producer.override.ssl.keystore.key" : "${saas:topicconfig:ssl.keystore.key}",
  "producer.override.ssl.keystore.type" : "${saas:topicconfig:ssl.keystore.type}",
  "producer.override.sasl.jaas.config" : "${saas:topicconfig:sasl.jaas.config}",
  "producer.override.sasl.mechanism" : "${saas:topicconfig:sasl.mechanism}",
  "producer.override.ssl.keystore.certificate.chain" : "${saas:topicconfig:ssl.keystore.certificate.chain}"{code}
Based on the actual Kafka connection properties, the Config Provider fills in 
the corresponding values. For example, for a Kafka cluster with TLS but without 
mTLS (server certificate only), it provides empty values for ssl.keystore.key 
and ssl.keystore.type (as you know, a Config Provider cannot remove a property 
from the connector entirely). But the code here:

[https://github.com/apache/kafka/blob/3.5/clients/src/main/java/org/apache/kafka/common/security/ssl/DefaultSslEngineFactory.java#L274]

it checks only for a null key, not for an empty one, so we got the following error:
{code:java}
Caused by: org.apache.kafka.common.errors.InvalidConfigurationException: SSL private key can be specified only for PEM, but key store type is . {code}
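For illustration, in the TLS-without-mTLS case the resolved producer overrides 
end up looking roughly like the following (the values are made-up placeholders, 
not our real configuration); the empty but non-null ssl.keystore.key is what 
triggers the error above:
{code:java}
  "producer.override.security.protocol" : "SASL_SSL",
  "producer.override.ssl.truststore.type" : "PEM",
  "producer.override.ssl.truststore.certificates" : "-----BEGIN CERTIFICATE-----...",
  "producer.override.ssl.keystore.type" : "",
  "producer.override.ssl.keystore.key" : "",
  "producer.override.ssl.keystore.certificate.chain" : ""{code}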
It looks like the proposed PR would help us as well: 
[https://github.com/apache/kafka/pull/11707]
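
For reference, here is a minimal sketch of the null-or-empty guard being 
discussed; it is a simplified, hypothetical illustration, not the actual code 
from DefaultSslEngineFactory or the PR:
{code:java}
// Simplified illustration only: treat both null and blank values as "not set",
// so an empty-string override behaves the same as an absent property.
public final class PemKeyCheckSketch {

    static boolean isAbsent(String value) {
        return value == null || value.trim().isEmpty();
    }

    static void validateKeystoreConfig(String keystoreType, String privateKeyPem) {
        // With a plain null check, an empty privateKeyPem would wrongly be treated
        // as a configured PEM key and fail this validation.
        if (!isAbsent(privateKeyPem) && !"PEM".equals(keystoreType)) {
            throw new IllegalArgumentException(
                    "SSL private key can be specified only for PEM, but key store type is " + keystoreType + ".");
        }
    }

    public static void main(String[] args) {
        validateKeystoreConfig("", "");   // passes: empty key treated as absent
        validateKeystoreConfig("PEM", "-----BEGIN PRIVATE KEY-----\n..."); // passes
    }
}{code}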

> Cannot use PEM certificate coding when parent defined file-based
> ----------------------------------------------------------------
>
>                 Key: KAFKA-13607
>                 URL: https://issues.apache.org/jira/browse/KAFKA-13607
>             Project: Kafka
>          Issue Type: Bug
>          Components: clients, config, connect
>    Affects Versions: 2.7.1, 3.0.0
>            Reporter: Piotr Smolinski
>            Priority: Major
>
> The problem applies to the situation where we create a Kafka client from a 
> prepopulated config. If we have only partial control over the input, we can 
> attempt to reset some values.
> KIP-651 added a nice new feature: PEM encoding of certificates as an 
> alternative to file-based stores. I have observed a problem in Confluent 
> Replicator. We have shifted the common configuration to the worker level and 
> assumed the connectors define only what is specific to them. The security 
> setup is mTLS, i.e. we need both a client cert and a trusted chain. Our 
> default configuration has both in PKCS#12 files, but we had to reverse the 
> replication direction and redefine the destination coordinates. For these we 
> have the certificates, and thanks to KIP-651 we could specify them as 
> connector params instead of changing the worker deployment.
> It turned out that we cannot override {*}ssl.keystore.location{*}, 
> {*}ssl.keystore.password{*}, etc. simply with empty values, because the code 
> in *DefaultSslEngineFactory* only checks whether the entry is null; we can 
> only override it to an empty string.
> *DefaultSslEngineFactory* should treat configuration entries as absent not 
> only when they are {*}null{*}, but also when the given entry is an empty 
> string.
> As a workaround, I have created a hacky patch that fixes the behaviour:
> [https://github.com/piotrsmolinski/kafka-ssl-fix]
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
