FrankYang0529 commented on code in PR #17844:
URL: https://github.com/apache/kafka/pull/17844#discussion_r1858577288


##########
docs/security.html:
##########
@@ -847,27 +847,27 @@ <h3 class="anchor-heading"><a id="security_sasl" class="anchor-link"></a><a href
     Kafka supports <a href="https://tools.ietf.org/html/rfc7677">SCRAM-SHA-256</a> and SCRAM-SHA-512 which
     can be used with TLS to perform secure authentication. Under the default implementation of <code>principal.builder.class</code>,
     the username is used as the authenticated <code>Principal</code> for configuration of ACLs etc. The default SCRAM implementation in Kafka
-    stores SCRAM credentials in Zookeeper and is suitable for use in Kafka installations where Zookeeper
-    is on a private network. Refer to <a href="#security_sasl_scram_security">Security Considerations</a>
+    stores SCRAM credentials in KRaft controllers. Refer to <a href="#security_sasl_scram_security">Security Considerations</a>
     for more details.</p>
     <ol>
     <li><h5 class="anchor-heading"><a id="security_sasl_scram_credentials" class="anchor-link"></a><a href="#security_sasl_scram_credentials">Creating SCRAM Credentials</a></h5>
-        <p>The SCRAM implementation in Kafka uses Zookeeper as credential store. Credentials can be created in
-        Zookeeper using <code>kafka-configs.sh</code>. For each SCRAM mechanism enabled, credentials must be created
+        <p>The SCRAM implementation in Kafka uses KRaft controllers as credential store. Credentials can be created in
+        KRaft controllers using <code>kafka-storage.sh</code> or <code>kafka-configs.sh</code>. For each SCRAM mechanism enabled, credentials must be created
         by adding a config with the mechanism name. Credentials for inter-broker communication must be created
-        before Kafka brokers are started. Client credentials may be created and updated dynamically and updated
-        credentials will be used to authenticate new connections.</p>
-        <p>Create SCRAM credentials for user <i>alice</i> with password <i>alice-secret</i>:
-        <pre><code class="language-bash">$ bin/kafka-configs.sh --zookeeper localhost:2182 --zk-tls-config-file zk_tls_config.properties --alter --add-config 'SCRAM-SHA-256=[iterations=8192,password=alice-secret],SCRAM-SHA-512=[password=alice-secret]' --entity-type users --entity-name alice</code></pre>
+        before Kafka brokers are started. <code>kafka-storage.sh</code> can format storage with initial credentials.
+        Client credentials may be created and updated dynamically and updated credentials will be used to authenticate new connections.
+        <code>kafka-configs.sh</code> can be used to create and update credentials after Kafka brokers are started.</p>
+        <p>Create initial SCRAM credentials for user <i>admin</i> with password <i>admin-secret</i>:
+        <pre><code class="language-bash">$ bin/kafka-storage.sh format -t $(bin/kafka-storage.sh random-uuid) -c config/kraft/server.properties --add-scram 'SCRAM-SHA-256=[name="admin",password="admin-secret"]'</code></pre>
+        <p>Create SCRAM credentials for user <i>alice</i> with password <i>alice-secret</i> (refer to <a href="#security_sasl_scram_clientconfig">Configuring Kafka Clients</a> for client configuration):
+        <pre><code class="language-bash">$ bin/kafka-configs.sh --bootstrap-server localhost:9092 --alter --add-config 'SCRAM-SHA-256=[iterations=8192,password=alice-secret]' --entity-type users --entity-name alice --command-config client.properties</code></pre>
     <p>The default iteration count of 4096 is used if iterations are not specified. A random salt is created

Review Comment:
   Yes, adding the condition to the sentence.
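The kafka-configs.sh command in the new docs text passes --command-config client.properties, so the admin client issuing the change must itself authenticate over SASL/SCRAM. A minimal sketch of what that file could contain, assuming a SASL_SSL listener and reusing the admin/admin-secret credentials bootstrapped by the kafka-storage.sh format step; the truststore path and password are placeholders, not part of this patch:

    # Authenticate the admin client over SASL/SCRAM
    security.protocol=SASL_SSL
    sasl.mechanism=SCRAM-SHA-256
    # Credentials created by the kafka-storage.sh format step above
    sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
        username="admin" \
        password="admin-secret";
    # Placeholder truststore; replace with one that trusts the broker certificate
    ssl.truststore.location=/path/to/kafka.client.truststore.jks
    ssl.truststore.password=truststore-secret

With such a file in place, the alice credentials can then be created against a running broker exactly as in the added kafka-configs.sh example.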