This is an automated email from the ASF dual-hosted git repository.

chia7712 pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/kafka.git


The following commit(s) were added to refs/heads/trunk by this push:
     new e303e9fdfb6 MINOR: Improve README.md formatting and consistency (#21327)
e303e9fdfb6 is described below

commit e303e9fdfb6bbaac66e9c8c4ab5e09cfb135a37c
Author: JeevanYewale <[email protected]>
AuthorDate: Tue Jan 20 21:35:19 2026 +0530

    MINOR: Improve README.md formatting and consistency (#21327)
    
    Converted indented code blocks to fenced code blocks (```bash) to enable
    syntax highlighting and the "copy" button feature.
    Removed redundant trailing # characters from headers to align with
    standard Markdown style.
    
    Reviewers: Sean Quah <[email protected]>, Chia-Ping Tsai <[email protected]>
---
 README.md | 255 ++++++++++++++++++++++++++++++++++++++------------------------
 1 file changed, 158 insertions(+), 97 deletions(-)

diff --git a/README.md b/README.md
index 49d1f9fd5cf..57b2b1a5b90 100644
--- a/README.md
+++ b/README.md
@@ -20,75 +20,100 @@ for the rest.
 
 Scala 2.13 is the only supported version in Apache Kafka.
 
-### Build a jar and run it ###
-    ./gradlew jar
+### Build a jar and run it
+```bash
+./gradlew jar
+```
 
 Follow instructions in https://kafka.apache.org/quickstart
 
-### Build source jar ###
-    ./gradlew srcJar
-
-### Build aggregated javadoc ###
-    ./gradlew aggregatedJavadoc --no-parallel
-
-### Build javadoc and scaladoc ###
-    ./gradlew javadoc
-    ./gradlew javadocJar # builds a javadoc jar for each module
-    ./gradlew scaladoc
-    ./gradlew scaladocJar # builds a scaladoc jar for each module
-    ./gradlew docsJar # builds both (if applicable) javadoc and scaladoc jars for each module
-
-### Run unit/integration tests ###
-    ./gradlew test  # runs both unit and integration tests
-    ./gradlew unitTest
-    ./gradlew integrationTest
-    ./gradlew test -Pkafka.test.run.flaky=true  # runs tests that are marked as flaky
-
-    
-### Force re-running tests without code change ###
-    ./gradlew test --rerun-tasks
-    ./gradlew unitTest --rerun-tasks
-    ./gradlew integrationTest --rerun-tasks
-
-### Running a particular unit/integration test ###
-    ./gradlew clients:test --tests RequestResponseTest
-    ./gradlew streams:integration-tests:test --tests RestoreIntegrationTest
-
-### Repeatedly running a particular unit/integration test with specific times by setting N ###
-    N=500; I=0; while [ $I -lt $N ] && ./gradlew clients:test --tests RequestResponseTest --rerun --fail-fast; do (( I=$I+1 )); echo "Completed run: $I"; sleep 1; done
-
-### Running a particular test method within a unit/integration test ###
-    ./gradlew core:test --tests kafka.api.ProducerFailureHandlingTest.testCannotSendToInternalTopic
-    ./gradlew clients:test --tests org.apache.kafka.clients.MetadataTest.testTimeToNextUpdate
-    ./gradlew streams:integration-tests:test --tests org.apache.kafka.streams.integration.RestoreIntegrationTest.shouldRestoreNullRecord
-
-### Running a particular unit/integration test with log4j output ###
+### Build source jar
+```bash
+./gradlew srcJar
+```
+
+### Build aggregated javadoc
+```bash
+./gradlew aggregatedJavadoc --no-parallel
+```
+
+### Build javadoc and scaladoc
+```bash
+./gradlew javadoc
+./gradlew javadocJar # builds a javadoc jar for each module
+./gradlew scaladoc
+./gradlew scaladocJar # builds a scaladoc jar for each module
+./gradlew docsJar # builds both (if applicable) javadoc and scaladoc jars for each module
+```
+
+### Run unit/integration tests
+```bash
+./gradlew test  # runs both unit and integration tests
+./gradlew unitTest
+./gradlew integrationTest
+./gradlew test -Pkafka.test.run.flaky=true  # runs tests that are marked as flaky
+```
+
+### Force re-running tests without code change
+```bash
+./gradlew test --rerun-tasks
+./gradlew unitTest --rerun-tasks
+./gradlew integrationTest --rerun-tasks
+```
+
+### Running a particular unit/integration test
+```bash
+./gradlew clients:test --tests RequestResponseTest
+./gradlew streams:integration-tests:test --tests RestoreIntegrationTest
+```
+
+### Repeatedly running a particular unit/integration test with specific times by setting N
+```bash
+N=500; I=0; while [ $I -lt $N ] && ./gradlew clients:test --tests RequestResponseTest --rerun --fail-fast; do (( I=$I+1 )); echo "Completed run: $I"; sleep 1; done
+```
+
+### Running a particular test method within a unit/integration test
+```bash
+./gradlew core:test --tests kafka.api.ProducerFailureHandlingTest.testCannotSendToInternalTopic
+./gradlew clients:test --tests org.apache.kafka.clients.MetadataTest.testTimeToNextUpdate
+./gradlew streams:integration-tests:test --tests org.apache.kafka.streams.integration.RestoreIntegrationTest.shouldRestoreNullRecord
+```
+
+### Running a particular unit/integration test with log4j output
 By default, there will be only small number of logs output while testing. You can adjust it by changing the `log4j2.yaml` file in the module's `src/test/resources` directory.
 
 For example, if you want to see more logs for clients project tests, you can modify [the line](https://github.com/apache/kafka/blob/trunk/clients/src/test/resources/log4j2.yaml#L35) in `clients/src/test/resources/log4j2.yaml` to `level: INFO` and then run:
-    
-    ./gradlew cleanTest clients:test --tests NetworkClientTest   
+
+```bash
+./gradlew cleanTest clients:test --tests NetworkClientTest
+```
 
 And you should see `INFO` level logs in the file under the `clients/build/test-results/test` directory.
 
-### Specifying test retries ###
+### Specifying test retries
 Retries are disabled by default, but you can set maxTestRetryFailures and maxTestRetries to enable retries.
 
 The following example declares -PmaxTestRetries=1 and -PmaxTestRetryFailures=3 to enable a failed test to be retried once, with a total retry limit of 3.
 
-    ./gradlew test -PmaxTestRetries=1 -PmaxTestRetryFailures=3
+```bash
+./gradlew test -PmaxTestRetries=1 -PmaxTestRetryFailures=3
+```
 
 See [Test Retry Gradle Plugin](https://github.com/gradle/test-retry-gradle-plugin) and [build.yml](.github/workflows/build.yml) for more details.
 
-### Generating test coverage reports ###
+### Generating test coverage reports
 Generate coverage reports for the whole project:
 
-    ./gradlew reportCoverage -PenableTestCoverage=true -Dorg.gradle.parallel=false
+```bash
+./gradlew reportCoverage -PenableTestCoverage=true -Dorg.gradle.parallel=false
+```
 
 Generate coverage for a single module, i.e.: 
 
-    ./gradlew clients:reportCoverage -PenableTestCoverage=true -Dorg.gradle.parallel=false
+```bash
+./gradlew clients:reportCoverage -PenableTestCoverage=true -Dorg.gradle.parallel=false
+```
 
 Coverage reports are located within the module's build directory, categorized by module type:
 
@@ -96,16 +121,20 @@ Core Module (:core): `core/build/reports/scoverageTest/index.html`
 
 Other Modules: `<module>/build/reports/jacoco/test/html/index.html`
 
-### Building a binary release gzipped tar ball ###
-    ./gradlew clean releaseTarGz
+### Building a binary release gzipped tar ball
+```bash
+./gradlew clean releaseTarGz
+```
 
 The release file can be found inside `./core/build/distributions/`.
 
-### Building auto generated messages ###
+### Building auto generated messages
 Sometimes it is only necessary to rebuild the RPC auto-generated message data when switching between branches, as they could
 fail due to code changes. You can just run:
- 
-    ./gradlew processMessages processTestMessages
+
+```bash
+./gradlew processMessages processTestMessages
+```
 
 See [Apache Kafka Message Definitions](clients/src/main/resources/common/message/README.md) for details on Apache Kafka message protocol.
 
@@ -113,33 +142,45 @@ See [Apache Kafka Message Definitions](clients/src/main/resources/common/message
 
 Using compiled files:
 
-    KAFKA_CLUSTER_ID="$(./bin/kafka-storage.sh random-uuid)"
-    ./bin/kafka-storage.sh format --standalone -t $KAFKA_CLUSTER_ID -c config/server.properties
-    ./bin/kafka-server-start.sh config/server.properties
+```bash
+KAFKA_CLUSTER_ID="$(./bin/kafka-storage.sh random-uuid)"
+./bin/kafka-storage.sh format --standalone -t $KAFKA_CLUSTER_ID -c config/server.properties
+./bin/kafka-server-start.sh config/server.properties
+```
 
 Using docker image:
 
-    docker run -p 9092:9092 apache/kafka:latest
+```bash
+docker run -p 9092:9092 apache/kafka:latest
+```
 
 See [docker/README.md](docker/README.md) for detailed information.
 
-### Cleaning the build ###
-    ./gradlew clean
+### Cleaning the build
+```bash
+./gradlew clean
+```
 
-### Running a task for a specific project ###
+### Running a task for a specific project
 This is for `core`, `examples` and `clients`
 
-    ./gradlew core:jar
-    ./gradlew core:test
+```bash
+./gradlew core:jar
+./gradlew core:test
+```
 
 Streams has multiple sub-projects, but you can run all the tests:
 
-    ./gradlew :streams:testAll
+```bash
+./gradlew :streams:testAll
+```
 
-### Listing all gradle tasks ###
-    ./gradlew tasks
+### Listing all gradle tasks
+```bash
+./gradlew tasks
+```
 
-### Building IDE project ####
+### Building IDE project
 *Note Please ensure that JDK17 is used when developing Kafka.*
 
 IntelliJ supports Gradle natively and it will automatically check Java syntax and compatibility for each module, even if
@@ -147,17 +188,21 @@ the Java version shown in the `Structure > Project Settings > Modules` may not b
 
 When it comes to Eclipse, run:
 
-    ./gradlew eclipse
+```bash
+./gradlew eclipse
+```
 
 The `eclipse` task has been configured to use `${project_dir}/build_eclipse` as Eclipse's build directory. Eclipse's default
 build directory (`${project_dir}/bin`) clashes with Kafka's scripts directory and we don't use Gradle's build directory
 to avoid known issues with this configuration.
 
-### Publishing the streams quickstart archetype artifact to maven ###
+### Publishing the streams quickstart archetype artifact to maven
 For the Streams archetype project, one cannot use gradle to upload to maven; instead the `mvn deploy` command needs to be called at the quickstart folder:
 
-    cd streams/quickstart
-    mvn deploy
+```bash
+cd streams/quickstart
+mvn deploy
+```
 
 Please note for this to work you should create/update user maven settings (typically, `${USER_HOME}/.m2/settings.xml`) to assign the following variables
 
@@ -182,63 +227,79 @@ Please note for this to work you should create/update user maven settings (typic
      </servers>
      ...
 
-### Installing all projects to the local Maven repository ###
+### Installing all projects to the local Maven repository
+
+```bash
+./gradlew -PskipSigning=true publishToMavenLocal
+```
 
-    ./gradlew -PskipSigning=true publishToMavenLocal
+### Installing specific projects to the local Maven repository
 
-### Installing specific projects to the local Maven repository ###
+```bash
+./gradlew -PskipSigning=true :streams:publishToMavenLocal
+```
 
-    ./gradlew -PskipSigning=true :streams:publishToMavenLocal
-    
-### Building the test jar ###
-    ./gradlew testJar
+### Building the test jar
+```bash
+./gradlew testJar
+```
 
-### Running code quality checks ###
+### Running code quality checks
 There are two code quality analysis tools that we regularly run, spotbugs and checkstyle.
 
-#### Checkstyle ####
+#### Checkstyle
 Checkstyle enforces a consistent coding style in Kafka.
 You can run checkstyle using:
 
-    ./gradlew checkstyleMain checkstyleTest spotlessCheck
+```bash
+./gradlew checkstyleMain checkstyleTest spotlessCheck
+```
 
 The checkstyle warnings will be found in `reports/checkstyle/reports/main.html` and `reports/checkstyle/reports/test.html` files in the
 subproject build directories. They are also printed to the console. The build will fail if Checkstyle fails.
 For experiments (or regression testing purposes) add `-PcheckstyleVersion=X.y.z` switch (to override project-defined checkstyle version).
 
-#### Spotless ####
+#### Spotless
 The import order is a part of static check. please call `spotlessApply` to optimize the imports of Java codes before filing pull request.
 
-    ./gradlew spotlessApply
+```bash
+./gradlew spotlessApply
+```
 
-#### Spotbugs ####
+#### Spotbugs
 Spotbugs uses static analysis to look for bugs in the code.
 You can run spotbugs using:
 
-    ./gradlew spotbugsMain spotbugsTest -x test
+```bash
+./gradlew spotbugsMain spotbugsTest -x test
+```
 
 The spotbugs warnings will be found in `reports/spotbugs/main.html` and `reports/spotbugs/test.html` files in the subproject build
 directories.  Use -PxmlSpotBugsReport=true to generate an XML report instead of an HTML one.
 
-### JMH microbenchmarks ###
+### JMH microbenchmarks
 We use [JMH](https://openjdk.java.net/projects/code-tools/jmh/) to write microbenchmarks that produce reliable results in the JVM.
-    
+
 See [jmh-benchmarks/README.md](https://github.com/apache/kafka/blob/trunk/jmh-benchmarks/README.md) for details on how to run the microbenchmarks.
 
-### Dependency Analysis ###
+### Dependency Analysis
 
 The gradle [dependency debugging documentation](https://docs.gradle.org/current/userguide/viewing_debugging_dependencies.html)
 mentions using the `dependencies` or `dependencyInsight` tasks to debug dependencies for the root project or individual subprojects.
 
 Alternatively, use the `allDeps` or `allDepInsight` tasks for recursively iterating through all subprojects:
 
-    ./gradlew allDeps
+```bash
+./gradlew allDeps
 
-    ./gradlew allDepInsight --configuration runtimeClasspath --dependency com.fasterxml.jackson.core:jackson-databind
+./gradlew allDepInsight --configuration runtimeClasspath --dependency com.fasterxml.jackson.core:jackson-databind
+```
 
 These take the same arguments as the builtin variants.
 
-### Determining if any dependencies could be updated ###
-    ./gradlew dependencyUpdates --no-parallel
+### Determining if any dependencies could be updated
+```bash
+./gradlew dependencyUpdates --no-parallel
+```
 
 ### Common build options ###
 
@@ -271,31 +332,31 @@ only safe if the Scala library version is the same at compile time and runtime.
 may depend on the kafka jar for integration tests where they may include a scala library with a different version), we don't enable it by
 default. See https://www.lightbend.com/blog/scala-inliner-optimizer for more details.
 
-### Upgrading Gradle version ###
+### Upgrading Gradle version
 
 See [gradle/wrapper/README.md](gradle/wrapper/README.md) for instructions on upgrading the Gradle version.
 
-### Running system tests ###
+### Running system tests
 
 See [tests/README.md](tests/README.md).
 
-### Using Trogdor for testing ###
+### Using Trogdor for testing
 
 We use Trogdor as a test framework for Apache Kafka. You can use it to run benchmarks and other workloads.
 
 See [trogdor/README.md](trogdor/README.md).
 
-### Running in Vagrant ###
+### Running in Vagrant
 
 See [vagrant/README.md](vagrant/README.md).
 
-### Kafka client examples ###
+### Kafka client examples
 
 See [examples/README.md](examples/README.md).
 
-### Contribution ###
+### Contribution
 
 Apache Kafka is interested in building the community; we would welcome any thoughts or [patches](https://issues.apache.org/jira/browse/KAFKA). You can reach us [on the Apache mailing lists](http://kafka.apache.org/contact.html).
 
 To contribute follow the instructions here:
- * https://kafka.apache.org/contributing.html 
+ * https://kafka.apache.org/contributing.html
