This is an automated email from the ASF dual-hosted git repository.

seanglover pushed a commit to branch seglo/artifact-id-rename
in repository 
https://gitbox.apache.org/repos/asf/incubator-pekko-connectors-kafka.git

commit 8e2fc32f656fc2674e208cfc9cbf8124207315e4
Author: Sean Glover <[email protected]>
AuthorDate: Tue Dec 27 23:18:15 2022 -0500

    rename
---
 CONTRIBUTING.md           | 32 ++++++++++---------------
 README.md                 | 59 +++++++++++++++--------------------------------
 RELEASING.md              |  4 ++--
 build.sbt                 | 26 ++++++++++-----------
 project/project-info.conf |  2 +-
 5 files changed, 46 insertions(+), 77 deletions(-)

diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 69987619..6985f17b 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -1,33 +1,27 @@
-# Welcome! Thank you for contributing to Alpakka Kafka!
+# Welcome! Thank you for contributing to the Pekko Kafka connector!
 
 We follow the standard GitHub [fork & 
pull](https://help.github.com/articles/using-pull-requests/#fork--pull) 
approach to pull requests. Just fork the official repo, develop in a branch, 
and submit a PR!
 
-You're always welcome to submit your PR straight away and start the discussion 
(without reading the rest of this wonderful doc, or the README.md). The goal of 
these notes is to make your experience contributing to Alpakka as smooth and 
pleasant as possible. We're happy to guide you through the process once you've 
submitted your PR.
+You're always welcome to submit your PR straight away and start the discussion 
(without reading the rest of this wonderful doc, or the README.md). The goal of 
these notes is to make your experience contributing to Pekko as smooth and 
pleasant as possible. We're happy to guide you through the process once you've 
submitted your PR.
 
-# The Akka Community
-
-Please check out [Get Involved](https://akka.io/get-involved/).
-
-# Contributing to Alpakka Kafka
+# Contributing to the Pekko Kafka connector
 
 ## General Workflow
 
-This is the process for committing code into master.
+This is the process for committing code into main.
 
-1. Make sure you have signed the Lightbend CLA, if not, [sign it 
online](http://www.lightbend.com/contribute/cla).
-
-1. To avoid duplicated effort, it might be good to check the [issue 
tracker](https://github.com/akka/alpakka/issues) and [existing pull 
requests](https://github.com/akka/alpakka/pulls) for existing work.
-   - If there is no ticket yet, feel free to [create 
one](https://github.com/akka/alpakka/issues/new) to discuss the problem and the 
approach you want to take to solve it.
+1. To avoid duplicated effort, it might be good to check the [issue 
tracker](https://github.com/apache/incubator-pekko-connectors-kafka/issues) and 
[existing pull 
requests](https://github.com/apache/incubator-pekko-connectors-kafka/pulls) for 
existing work.
+   - If there is no ticket yet, feel free to [create 
one](https://github.com/apache/incubator-pekko-connectors-kafka/issues/new) to 
discuss the problem and the approach you want to take to solve it.
 
 1. Perform your work according to the [pull request 
requirements](#pull-request-requirements).
 
-1. When the feature or fix is completed you should open a [Pull 
Request](https://help.github.com/articles/using-pull-requests) on 
[GitHub](https://github.com/akka/alpakka/pulls). 
+1. When the feature or fix is completed you should open a [Pull 
Request](https://help.github.com/articles/using-pull-requests) on 
[GitHub](https://github.com/apache/incubator-pekko-connectors-kafka/pulls). 
Prefix your PR title with a marker to show which module it affects (e.g. "JMS" 
or "AWS S3").
 
-1. The Pull Request should be reviewed by other maintainers (as many as 
feasible/practical). Note that the maintainers can consist of outside 
contributors, both within and outside Lightbend. Outside contributors are 
encouraged to participate in the review process, it is not a closed process.
+1. The Pull Request should be reviewed by other maintainers (as many as 
feasible/practical). Outside contributors are encouraged to participate in the 
review process; it is not a closed process.
 
-1. After the review you should fix the issues (review comments, CI failures) 
by pushing a new commit for new review, iterating until the reviewers give 
their thumbs up and CI tests pass.
+1. After the review you should fix the issues (review comments, CI failures, 
compiler warnings) by pushing a new commit for new review, iterating until the 
reviewers give their thumbs up and CI tests pass.
 
-1. When the branch conflicts with its merge target (either by way of git merge 
conflict or failing CI tests), do **not** merge the target branch into your 
feature branch. Instead, rebase your branch onto the target branch and update 
it with `git push -f`.
+1. If the branch conflicts with its merge target, rebase your branch onto the 
target branch and update the PR with `git push -f`.
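+
+   For example, a minimal rebase flow (a sketch, assuming `origin` points at 
+   this repository and `main` is the target branch) might look like:
+
+   ```bash
+   # fetch the latest target branch
+   git fetch origin
+   # replay your commits on top of it
+   git rebase origin/main
+   # force-push, because rebasing rewrites history
+   git push -f
+   ```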
 
 ## Pull Request Requirements
 
@@ -43,13 +37,11 @@ For a Pull Request to be considered at all it has to meet 
these requirements:
 
 1. Regardless if the code introduces new features or fixes bugs or 
regressions, it must have comprehensive tests.
 
-1. The code must be well documented in the Lightbend's standard documentation 
format (see the [Documentation](#documentation) section below).
-
 1. The commit messages must properly describe the changes, see [further 
below](#creating-commits-and-writing-commit-messages).
 
 1. Do not use ``@author`` tags since it does not encourage [Collective Code 
Ownership](http://www.extremeprogramming.org/rules/collective.html). 
Contributors get the credit they deserve in the release notes.
 
-If these requirements are not met then the code should **not** be merged into 
master, or even reviewed - regardless of how good or important it is. No 
exceptions.
+If these requirements are not met then the code should **not** be merged into 
main, or even reviewed - regardless of how good or important it is. No 
exceptions.
 
 
 ## Documentation
@@ -105,7 +97,7 @@ Example:
 
 ## How To Enforce These Guidelines?
 
-1. [GitHub Actions](https://github.com/akka/alpakka-kafka/actions) 
automatically merges the code, builds it, runs the tests and sets Pull Request 
status accordingly of results in GitHub.
+1. [GitHub 
Actions](https://github.com/apache/incubator-pekko-connectors-kafka/actions) 
automatically merges the code, builds it, runs the tests and sets the Pull 
Request status in GitHub according to the results.
 1. [Scalafmt](http://scalameta.org/scalafmt/) enforces some of the code style 
rules.
 1. [sbt-header plugin](https://github.com/sbt/sbt-header) manages consistent 
copyright headers in every source file.
 1. A GitHub bot checks whether you've signed the Lightbend CLA. 
diff --git a/README.md b/README.md
index dd799648..e50c153e 100644
--- a/README.md
+++ b/README.md
@@ -1,69 +1,48 @@
 Alpakka Kafka [![scaladex-badge][]][scaladex] 
[![maven-central-badge][]][maven-central] [![gh-actions-badge][]][gh-actions]
 =============
 
-[scaladex]:            
https://index.scala-lang.org/akka/alpakka-kafka/akka-stream-kafka/
-[scaladex-badge]:      
https://index.scala-lang.org/akka/alpakka-kafka/akka-stream-kafka/latest.svg?target=_2.13
-[maven-central]:       
https://maven-badges.herokuapp.com/maven-central/com.typesafe.akka/akka-stream-kafka_2.13
-[maven-central-badge]: 
https://maven-badges.herokuapp.com/maven-central/com.typesafe.akka/akka-stream-kafka_2.13/badge.svg
-[gh-actions]:          https://github.com/akka/alpakka-kafka/actions
-[gh-actions-badge]:    
https://github.com/akka/alpakka-kafka/workflows/CI/badge.svg?branch=master
+[scaladex]:            
https://index.scala-lang.org/apache/pekko-connector-kafka/
+[scaladex-badge]:      
https://index.scala-lang.org/apache/pekko-connector-kafka/latest.svg?target=_2.13
+[maven-central]:       
https://maven-badges.herokuapp.com/maven-central/org.apache.pekko/pekko-connectors-kafka_2.13
+[maven-central-badge]: 
https://maven-badges.herokuapp.com/maven-central/org.apache.pekko/pekko-connectors-kafka_2.13/badge.svg
+[gh-actions]:          
https://github.com/apache/incubator-pekko-connectors-kafka/actions
+[gh-actions-badge]:    
https://github.com/apache/incubator-pekko-connectors-kafka/workflows/CI/badge.svg?branch=main
 
 
 Systems don't come alone. In the modern world of microservices and cloud 
deployment, new components must interact with legacy systems, making 
integration an important key to success. Reactive Streams give us a 
technology-independent tool to let these heterogeneous systems communicate 
without overwhelming each other.
 
-The Alpakka project is an open source initiative to implement stream-aware, 
reactive, integration pipelines for Java and Scala. It is built on top of [Akka 
Streams](https://doc.akka.io/docs/akka/current/stream/index.html), and has been 
designed from the ground up to understand streaming natively and provide a DSL 
for reactive and stream-oriented programming, with built-in support for 
backpressure. Akka Streams is a [Reactive 
Streams](https://www.reactive-streams.org/) and JDK 9+ [java.ut [...]
+The Pekko connectors project is an open source initiative to implement 
stream-aware, reactive integration pipelines for Java and Scala. It is built 
on top of [Akka 
Streams](https://doc.akka.io/docs/akka/current/stream/index.html), and has been 
designed from the ground up to understand streaming natively and provide a DSL 
for reactive and stream-oriented programming, with built-in support for 
backpressure. Akka Streams is a [Reactive 
Streams](https://www.reactive-streams.org/) and JDK 9+ [...]
 
-This repository contains the sources for the **Alpakka Kafka connector**. 
Which lets you connect [Apache Kafka](https://kafka.apache.org/) to Akka 
Streams. It was formerly known as **Akka Streams Kafka** and even **Reactive 
Kafka**.
+This repository contains the sources for the **Pekko Kafka connector**, which 
lets you connect [Apache Kafka](https://kafka.apache.org/) to Pekko Streams.
 
-Akka Stream connectors to other technologies are listed in the [Alpakka 
repository](https://github.com/akka/alpakka).
+Pekko connectors to other technologies are listed in the [Pekko connectors 
repository](https://github.com/apache/incubator-pekko-connectors).
 
+The Pekko Kafka connector is a fork of [Alpakka 
Kafka](https://github.com/akka/alpakka-kafka) 3.0.1, prior to the Akka 
project's adoption of the Business Source License.
 
 Documentation
 -------------
 
-- [Alpakka reference](https://doc.akka.io/docs/alpakka/current/) documentation
-
-- **[Alpakka Kafka connector 
reference](https://doc.akka.io/docs/akka-stream-kafka/current/) documentation**
-
-To keep up with the latest Alpakka releases check out [Alpakka 
releases](https://github.com/akka/alpakka/releases) and [Alpakka Kafka 
releases](https://github.com/akka/alpakka-kafka/releases).
-
+**TODO add documentation links**
 
 Community
 ---------
 
-You can join these groups and chats to discuss and ask Akka and Alpakka 
related questions:
-
-- Forums: 
[discuss.lightbend.com](https://discuss.lightbend.com/c/akka/streams-and-alpakka)
-- Issue tracker: [![github: 
akka/alpakka-kafka](https://img.shields.io/badge/github%3A-issues-blue.svg?style=flat-square)](https://github.com/akka/alpakka-kafka/issues)
-
-In addition to that, you may enjoy following:
-
-- The [Akka Team Blog](https://akka.io/blog/)
-- [@akkateam](https://twitter.com/akkateam) on Twitter
-- Questions tagged [#alpakka on 
StackOverflow](https://stackoverflow.com/questions/tagged/alpakka)
-- Questions tagged [**#alpakka** on 
StackOverflow](https://stackoverflow.com/questions/tagged/alpakka)
+You can join these forums and chats to discuss and ask Pekko and Pekko 
connector related questions:
 
-The Kafka connector was originally created as **Reactive Kafka** by [<img 
src="https://files.softwaremill.com/logo/logo.svg" alt="SoftwareMill logo" 
height="25">](https://softwaremill.com).
+- [GitHub discussions](https://github.com/apache/incubator-pekko/discussions): 
for questions and general discussion.
+- [Pekko dev mailing 
list](https://lists.apache.org/[email protected]): for Pekko 
connectors development discussions.
+- [GitHub 
issues](https://github.com/apache/incubator-pekko-connectors-kafka/issues): for 
bug reports and feature requests. Please search the existing issues before 
creating new ones. If you are unsure whether you have found a bug, consider 
asking in GitHub discussions or the mailing list first.
 
+The Pekko Kafka connector was originally created as **Reactive Kafka** by 
[<img src="https://files.softwaremill.com/logo/logo.svg" alt="SoftwareMill 
logo" height="25">](https://softwaremill.com).
 
 Contributing
 ------------
 
-[Lightbend](https://www.lightbend.com/) is the steward of Akka and Alpakka.
+Contributions are very welcome. If you have an idea on how to improve Pekko, 
don't hesitate to create an issue or submit a pull request.
 
-Contributions are *very* welcome! Lightbend appreciates community 
contributions by both those new to Alpakka and those more experienced.
-
-Alpakka depends on the community to keep up with the ever-growing number of 
technologies with which to integrate. Please step up and share the successful 
Akka Stream integrations you implement with the Alpakka community.
-
-If you find an issue that you'd like to see fixed, the quickest way to make 
that happen is to implement the fix and submit a pull request.
-
-Refer to the [CONTRIBUTING.md](CONTRIBUTING.md) file for more details about 
the workflow, and general hints on how to prepare your pull request.
-
-You can also ask for clarifications or guidance in GitHub issues directly.
+See 
[CONTRIBUTING.md](https://github.com/apache/incubator-pekko/blob/main/CONTRIBUTING.md)
 for details on the development workflow and how to create your pull request.
 
 Caveat Emptor
 -------------
 
-Alpakka components are not always binary compatible between releases. API 
changes that are not backward compatible might be introduced as we refine and 
simplify based on your feedback. A module may be dropped in any release without 
prior deprecation. 
-
-Support for the Alpakka Kafka connector is available via Lightbend's [Akka 
Platform subscription](https://www.lightbend.com/akka-platform#subscription).
+Pekko connectors are not always binary compatible between releases. API 
changes that are not backward compatible might be introduced as we refine and 
simplify based on your feedback. A module may be dropped in any release without 
prior deprecation.
diff --git a/RELEASING.md b/RELEASING.md
index 58938853..795d2721 100644
--- a/RELEASING.md
+++ b/RELEASING.md
@@ -1,9 +1,9 @@
 # Releasing
 
-Create a new issue from the [Alpakka Kafka Release Train Issue 
Template](docs/release-train-issue-template.md) and follow the steps.
+Create a new issue from the [Pekko Kafka Connector Release Train Issue 
Template](docs/release-train-issue-template.md) and follow the steps.
 
 ```bash
-~/alpakka> scripts/create-release-issue.sh `version-to-be-released`
+~/pekko-connectors-kafka> scripts/create-release-issue.sh `version-to-be-released`
 ```
 
 ### Releasing only updated docs
diff --git a/build.sbt b/build.sbt
index 7481f524..3aaf5a7e 100644
--- a/build.sbt
+++ b/build.sbt
@@ -2,8 +2,6 @@ import com.typesafe.tools.mima.core.{ Problem, ProblemFilters }
 
 enablePlugins(AutomateHeaderPlugin)
 
-name := "akka-stream-kafka"
-
 val Nightly = sys.env.get("EVENT_NAME").contains("schedule")
 
 // align ignore-prefixes in scripts/link-validator.conf
@@ -115,7 +113,7 @@ val commonSettings = Def.settings(
   projectInfoVersion := (if (isSnapshot.value) "snapshot" else version.value),
   sonatypeProfileName := "com.typesafe")
 
-lazy val `alpakka-kafka` =
+lazy val `pekko-connectors-kafka` =
   project
     .in(file("."))
     .enablePlugins(ScalaUnidocPlugin)
@@ -127,11 +125,11 @@ lazy val `alpakka-kafka` =
       ScalaUnidoc / unidoc / unidocProjectFilter := inProjects(core, testkit),
       onLoadMessage :=
         """
-            |** Welcome to the Alpakka Kafka connector! **
+            |** Welcome to the Pekko Kafka connector! **
             |
             |The build has three main modules:
             |  core - the Kafka connector sources
-            |  clusterSharding - Akka Cluster External Sharding with Alpakka 
Kafka
+            |  cluster-sharding - Akka Cluster External Sharding with the 
Pekko Kafka connector
             |  tests - tests, Docker based integration tests, code for the 
documentation
             |  testkit - framework for testing the connector
             |
@@ -163,7 +161,7 @@ lazy val `alpakka-kafka` =
             |  benchmarks/IntegrationTest/testOnly *.AlpakkaKafkaPlainConsumer
             |    run a single benchmark backed by Docker containers
           """.stripMargin)
-    .aggregate(core, testkit, clusterSharding, tests, benchmarks, docs)
+    .aggregate(core, testkit, `cluster-sharding`, tests, benchmarks, docs)
 
 lazy val core = project
   .enablePlugins(AutomateHeaderPlugin)
@@ -171,7 +169,7 @@ lazy val core = project
   .settings(commonSettings)
   .settings(VersionGenerator.settings)
   .settings(
-    name := "akka-stream-kafka",
+    name := "pekko-connectors-kafka",
     AutomaticModuleName.settings("akka.stream.alpakka.kafka"),
     libraryDependencies ++= Seq(
       "com.typesafe.akka" %% "akka-stream" % akkaVersion,
@@ -186,7 +184,7 @@ lazy val testkit = project
   .disablePlugins(SitePlugin)
   .settings(commonSettings)
   .settings(
-    name := "akka-stream-kafka-testkit",
+    name := "pekko-connectors-kafka-testkit",
     AutomaticModuleName.settings("akka.stream.alpakka.kafka.testkit"),
     JupiterKeys.junitJupiterVersion := "5.8.2",
     libraryDependencies ++= Seq(
@@ -198,14 +196,14 @@ lazy val testkit = project
     mimaPreviousArtifacts := Set.empty, // temporarily disable mima checks
     mimaBinaryIssueFilters += 
ProblemFilters.exclude[Problem]("akka.kafka.testkit.internal.*"))
 
-lazy val clusterSharding = project
+lazy val `cluster-sharding` = project
   .in(file("./cluster-sharding"))
   .dependsOn(core)
   .enablePlugins(AutomateHeaderPlugin)
   .disablePlugins(SitePlugin)
   .settings(commonSettings)
   .settings(
-    name := "akka-stream-kafka-cluster-sharding",
+    name := "pekko-connectors-kafka-cluster-sharding",
     AutomaticModuleName.settings("akka.stream.alpakka.kafka.cluster.sharding"),
     libraryDependencies ++= Seq(
       "com.typesafe.akka" %% "akka-cluster-sharding-typed" % akkaVersion),
@@ -213,7 +211,7 @@ lazy val clusterSharding = project
   )
 
 lazy val tests = project
-  .dependsOn(core, testkit, clusterSharding)
+  .dependsOn(core, testkit, `cluster-sharding`)
   .enablePlugins(AutomateHeaderPlugin)
   .disablePlugins(MimaPlugin, SitePlugin)
   .configs(IntegrationTest.extend(Test))
@@ -221,7 +219,7 @@ lazy val tests = project
   .settings(Defaults.itSettings)
   .settings(headerSettings(IntegrationTest))
   .settings(
-    name := "akka-stream-kafka-tests",
+    name := "pekko-connectors-kafka-tests",
     libraryDependencies ++= Seq(
       "com.typesafe.akka" %% "akka-discovery" % akkaVersion,
       "com.google.protobuf" % "protobuf-java" % "3.19.1", // use the same 
version as in scalapb
@@ -257,7 +255,7 @@ lazy val docs = project
   .disablePlugins(MimaPlugin)
   .settings(commonSettings)
   .settings(
-    name := "Alpakka Kafka",
+    name := "Apache Pekko Kafka Connector",
     publish / skip := true,
     makeSite := makeSite.dependsOn(LocalRootProject / ScalaUnidoc / doc).value,
     previewPath := (Paradox / siteSubdirName).value,
@@ -318,7 +316,7 @@ lazy val benchmarks = project
   .settings(Defaults.itSettings)
   .settings(headerSettings(IntegrationTest))
   .settings(
-    name := "akka-stream-kafka-benchmarks",
+    name := "pekko-connectors-kafka-benchmarks",
     publish / skip := true,
     IntegrationTest / parallelExecution := false,
     libraryDependencies ++= Seq(
diff --git a/project/project-info.conf b/project/project-info.conf
index 5c78f235..439dd6e5 100644
--- a/project/project-info.conf
+++ b/project/project-info.conf
@@ -8,7 +8,7 @@ project-info {
       new-tab: false
     }
     issues: {
-      url: "https://github.com/akka/alpakka-kafka/issues"
+      url: "https://github.com/apache/incubator-pekko-connectors-kafka/issues"
       text: "Github issues"
     }
     release-notes: {

