kaldesai commented on code in PR #639:
URL: 
https://github.com/apache/incubator-kie-kogito-docs/pull/639#discussion_r1615976276


##########
serverlessworkflow/modules/ROOT/pages/cloud/operator/configuring-knative-eventing-resources.adoc:
##########
@@ -1,60 +1,425 @@
 = Knative Eventing
+:sectnums:
+
 :compat-mode!:
 // Metadata:
-:description: Configuration of knatve eventing deployed by the operator
+:description: Configuration of knative eventing deployed by the operator
 :keywords: kogito, sonataflow, workflow, serverless, operator, kubernetes, 
knative, knative-eventing, events
 
-This document describes how you can configure the workflows to let operator 
create the Knative eventing resources on Kubernetes.
+This document describes how to configure the workflows and the supporting services to use link:{knative_eventing_url}[Knative Eventing] as the preferred eventing system.
 
-{operator_name} can analyze the event definitions from the `spec.flow` and 
create `SinkBinding`/`Trigger` based on the type of the event. Then the 
workflow service can utilize them for event communications.
+In general, the following events are produced in a {product_name} installation:
 
-[NOTE]
+* Workflow outgoing and incoming business events.
+* {product_name} system events sent from the workflow to the Data Index and Job Service.
+* {product_name} system events sent from the Jobs Service to the Data Index 
Service.
+
+[IMPORTANT]
 ====
- Alternativelly, you can follow our 
xref:use-cases/advanced-developer-use-cases/event-orchestration/consume-produce-events-with-knative-eventing.adoc#ref-example-sw-event-definition-knative[advanced
 guide] that uses Java and Quarkus to introduce this feature.
+The content of this guide applies only when you work with workflows using the `preview` and `gitops` profiles.
 ====
 
+To produce a successful configuration, you must follow this procedure:
+
 == Prerequisite
+
 1. The {operator_name} installed. See 
xref:cloud/operator/install-serverless-operator.adoc[] guide.
-2. Knative is installed on the cluster and Knative Eventing is initiated with 
a `KnativeEventing` CR.
-3. A broker named `default` is created. Currently, all Triggers created by the 
{operator_name} will read events from `default`
+2. The link:{knative_eventing_url}[Knative Eventing] system is installed and properly initiated in the cluster.
+
+== Configuring the Knative Broker
+
+Create a Knative Broker to define the event mesh that collects the events, using a resource like this:
+
+[source,yaml]
+----
+apiVersion: eventing.knative.dev/v1
+kind: Broker
+metadata:
+  name: default
+  namespace: example-namespace
+----
+
+For more information on Knative Brokers, link:{knative_eventing_broker_url}[see the Knative documentation].
+
+[NOTE]
+====
+The example creates an in-memory broker for simplicity. In production 
environments you must use a production ready broker, like the 
link:{knative_eventing_kafka_broker_url}[Knative Kafka] broker.

Review Comment:
   ```suggestion
   The example creates an in-memory broker for simplicity. In production 
environments, you must use a production-ready broker, like the 
link:{knative_eventing_kafka_broker_url}[Knative Kafka] broker.
   ```
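Since the note points readers at a production-ready broker, it might be worth also showing what that looks like in practice. A minimal sketch of a Kafka-backed Broker, assuming the link:{knative_eventing_kafka_broker_url}[Knative Kafka] broker is installed in the cluster and its default `kafka-broker-config` ConfigMap exists in the `knative-eventing` namespace (the name and namespace below are illustrative, matching the earlier example):

```yaml
apiVersion: eventing.knative.dev/v1
kind: Broker
metadata:
  name: default
  namespace: example-namespace
  annotations:
    # Selects the Kafka broker class instead of the default
    # in-memory (MTChannelBasedBroker) class.
    eventing.knative.dev/broker.class: Kafka
spec:
  # Points at the ConfigMap holding the Kafka broker settings
  # (bootstrap servers, replication factor, and so on).
  config:
    apiVersion: v1
    kind: ConfigMap
    name: kafka-broker-config
    namespace: knative-eventing
```

Only the `broker.class` annotation and `spec.config` differ from the in-memory example, so workflows and Triggers referencing the `default` broker need no changes.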



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

