This is an automated email from the ASF dual-hosted git repository.
dhanak pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/incubator-kie-kogito-docs.git
The following commit(s) were added to refs/heads/main by this push:
new 22b022b97 NO-ISSUE: Fix recurring minor typos (#607)
22b022b97 is described below
commit 22b022b9738d7416c94192b2cb34f1b761058973
Author: Tomáš David <[email protected]>
AuthorDate: Tue Mar 26 09:11:03 2024 -0400
NO-ISSUE: Fix recurring minor typos (#607)
Fix recurring minor typos
---
CONTRIBUTING.md | 10 +++++-----
serverlessworkflow/modules/ROOT/pages/cloud/index.adoc | 4 ++--
.../pages/cloud/operator/build-and-deploy-workflows.adoc | 4 ++--
.../ROOT/pages/cloud/operator/building-custom-images.adoc | 2 +-
.../operator/configuring-knative-eventing-resources.adoc | 2 +-
.../ROOT/pages/cloud/operator/enabling-jobs-service.adoc | 4 ++--
.../pages/cloud/operator/install-serverless-operator.adoc | 2 +-
.../ROOT/pages/cloud/operator/supporting-services.adoc | 4 ++--
.../modules/ROOT/pages/cloud/operator/using-persistence.adoc | 2 +-
.../pages/cloud/operator/workflow-status-conditions.adoc | 2 +-
.../modules/ROOT/pages/core/custom-functions-support.adoc | 6 +++---
.../pages/core/defining-an-input-schema-for-workflows.adoc | 2 +-
.../ROOT/pages/core/understanding-jq-expressions.adoc | 2 +-
.../ROOT/pages/getting-started/java-embedded-workflows.adoc | 4 ++--
.../ROOT/pages/getting-started/preparing-environment.adoc | 6 +++---
serverlessworkflow/modules/ROOT/pages/index.adoc | 8 ++++----
.../modules/ROOT/pages/integrations/core-concepts.adoc | 4 ++--
.../modules/ROOT/pages/job-services/core-concepts.adoc | 4 ++--
.../orchestrating-third-party-services-with-oauth2.adoc | 2 +-
.../configuring-openapi-services-endpoints.adoc | 2 +-
.../quarkus-dev-ui-workflow-definition-page.adoc | 12 ++++++------
.../quarkus-dev-ui-workflow-instances-page.adoc | 4 ++--
.../serverless-logic-web-tools-github-integration.adoc | 4 ++--
.../serverless-logic-web-tools-openshift-integration.adoc | 2 +-
...ic-web-tools-redhat-application-services-integration.adoc | 6 +++---
.../swf-editor-chrome-extension.adoc | 2 +-
.../callbacks/callback-state-example.adoc | 2 +-
.../data-index/data-index-as-quarkus-dev-service.adoc | 2 +-
.../deployments/deploying-on-openshift.adoc | 4 ++--
.../build-workflow-image-with-quarkus-cli.adoc | 2 +-
.../getting-started/create-your-first-workflow-service.adoc | 2 +-
.../working-with-serverless-workflow-quarkus-examples.adoc | 2 +-
.../pages/use-cases/advanced-developer-use-cases/index.adoc | 4 ++--
.../integrations/custom-functions-knative.adoc | 2 +-
.../integrations/custom-functions-python.adoc | 2 +-
.../timeouts/timeout-showcase-example.adoc | 2 +-
36 files changed, 65 insertions(+), 65 deletions(-)
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index a323126cc..32301e5cb 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -6,15 +6,15 @@ We accept all kinds of contributions:
2. Opening [an
issue](https://github.com/apache/incubator-kie-kogito-docs/issues/new) by
describing what problem you see that we need to fix
3. Opening [a PR](https://github.com/apache/incubator-kie-kogito-docs/compare)
if you see a typo, broken link, or any other minor changes.
-> To include a new guide or documentation content, please **open an issue
first** so we can discuss in more detail what needs to be done. We use
[Issues](https://github.com/apache/incubator-kie-kogito-docs/issues) to track
our tasks. Please include a good title and thorought description.
+> To include a new guide or documentation content, please **open an issue
first** so we can discuss in more detail what needs to be done. We use
[Issues](https://github.com/apache/incubator-kie-kogito-docs/issues) to track
our tasks. Please include a good title and thorough description.
## Including a new guide
-1. Open a [an
issue](https://github.com/apache/incubator-kie-kogito-docs/issues/new) provide
a description and link any pull-requests realted to the guide.
+1. Open [an
issue](https://github.com/apache/incubator-kie-kogito-docs/issues/new) provide
a description and link any pull-requests related to the guide.
2. Write the guide.
3. Add a link to the guide in
[serverlessworkflow/modules/ROOT/nav.adoc](serverlessworkflow/modules/ROOT/nav.adoc)
4. Add a card for the guide in
[serverlessworkflow/modules/ROOT/pages/index.adoc](serverlessworkflow/modules/ROOT/pages/index.adoc)
-5. Submit a [a PR](https://github.com/apache/incubator-kie-kogito-docs/compare)
+5. Submit [a PR](https://github.com/apache/incubator-kie-kogito-docs/compare)
## Opening an Issue
@@ -84,7 +84,7 @@ Use active voice.
:x: _Passive:_ The Limits window is used to specify the minimum and maximum
values.
:white_check_mark: _Active:_ In the Limits window, specify the minimum and
maximum values.
-Use second person (you). Avoid first person (I, we, us). Be gender neutral.
Use the appropriate tone. Write for a global audience.
+Use second person (you). Avoid first person (I, we, us). Be gender-neutral.
Use the appropriate tone. Write for a global audience.
:x: We can add a model to the project that we created in the previous step.
:white_check_mark: You can add a model to the project that you created in the
previous step.
@@ -203,7 +203,7 @@ Content
====
```
-Similarly you can have other admonitions:
+Similarly, you can have other admonitions:
- `TIP`
- `IMPORTANT`
diff --git a/serverlessworkflow/modules/ROOT/pages/cloud/index.adoc
b/serverlessworkflow/modules/ROOT/pages/cloud/index.adoc
index 59b944cbe..4780864ce 100644
--- a/serverlessworkflow/modules/ROOT/pages/cloud/index.adoc
+++ b/serverlessworkflow/modules/ROOT/pages/cloud/index.adoc
@@ -13,7 +13,7 @@ The cards below list all features included in the platform to
deploy workflow ap
[NOTE]
====
-Eventually, these two options will converge, and the {operator_name} will also
be able to handle full Quarkus projects. So if you opt-in to use Quarkus now
and manually deploy your workflows, bear in mind that it's on the project's
roadmap to integrate the Quarkus experience with the Operator.
+Eventually, these two options will converge, and the {operator_name} will also
be able to handle full Quarkus projects. So if you opt in to use Quarkus now
and manually deploy your workflows, bear in mind that it's on the project's
roadmap to integrate the Quarkus experience with the Operator.
====
[.card-section]
@@ -128,7 +128,7 @@ Learn about the known issues and feature Roadmap of the
{operator_name}
[.card-section]
== Kubernetes with Quarkus
-For Java developers, you can use Quarkus and a few add-ons to help you build
and deploy the application in a Kubernetes cluster. {product_name} also
generates basic Kubernetes objects YAML files to help you getting started. The
application should be managed by a Kubernetes administrator.
+For Java developers, you can use Quarkus and a few add-ons to help you build
and deploy the application in a Kubernetes cluster. {product_name} also
generates basic Kubernetes objects YAML files to help you to get started. The
application should be managed by a Kubernetes administrator.
[.card]
--
diff --git
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/build-and-deploy-workflows.adoc
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/build-and-deploy-workflows.adoc
index 820a42ca4..077a76892 100644
---
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/build-and-deploy-workflows.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/build-and-deploy-workflows.adoc
@@ -481,7 +481,7 @@ spec:
end: true
----
-Save a file in your local file system with this contents named
`greetings-workflow.yaml` then run:
+Save a file in your local file system with this content named
`greetings-workflow.yaml` then run:
[source,bash,subs="attributes+"]
----
@@ -562,7 +562,7 @@ metadata:
After editing the resource, the operator will start a new build of the
workflow. Once this is finished, the workflow will be notified and updated
accordingly.
-If the build fails, but the workflow has a working deployment, the operator
won't rollout a new deployment.
+If the build fails, but the workflow has a working deployment, the operator
won't roll out a new deployment.
Ideally you should use this feature if there's a problem with your workflow or
the initial build revision.
diff --git
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/building-custom-images.adoc
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/building-custom-images.adoc
index 1f44fbea0..ad2cdc8ba 100644
---
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/building-custom-images.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/building-custom-images.adoc
@@ -100,7 +100,7 @@ CMD ["/home/kogito/launch/run-app-devmode.sh"] <8>
----
<1> The dev mode image as the base image
-<2> Change to super user to run privileged actions
+<2> Change to superuser to run privileged actions
<3> Install additional packages
<4> Change back to the default user without admin privileges
<5> Add a new binary path to the `PATH`
diff --git
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/configuring-knative-eventing-resources.adoc
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/configuring-knative-eventing-resources.adoc
index 696554c44..499e44da5 100644
---
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/configuring-knative-eventing-resources.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/configuring-knative-eventing-resources.adoc
@@ -10,7 +10,7 @@ This document describes how you can configure the workflows
to let operator crea
== Prerequisite
1. Knative is installed on the cluster and Knative Eventing is initiated with
a `KnativeEventing` CR.
-2. A broker named `default` is created. Currently all Triggers created by the
{operator_name} will read events from `default`
+2. A broker named `default` is created. Currently, all Triggers created by the
{operator_name} will read events from `default`
== Configuring the workflow
diff --git
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/enabling-jobs-service.adoc
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/enabling-jobs-service.adoc
index bb8da60f3..7d42dbd31 100644
---
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/enabling-jobs-service.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/enabling-jobs-service.adoc
@@ -45,7 +45,7 @@ sonataflow-platform-jobs-service-cdf85d969-sbwkj 1/1
Running 0
Keep in mind that this setup is not recommended for production environments,
especially because the data does not persist when the pod restarts.
=== Using an existing PostgreSQL service
-For robust environments it is recommened to use an dedicated database service
and configure Jobs Service to make use of it. Currently, the Jobs Service
+For robust environments it is recommended to use a dedicated database service
and configure Jobs Service to make use of it. Currently, the Jobs Service
only supports PostgreSQL database.
Configuring Jobs Service to communicate with an existing PostgreSQL instance
is supported in two ways. In both cases it requires providing the persistence
@@ -57,7 +57,7 @@ By default, the persistence specification defined in the
`SonataFlow` workflow's
==== Using the persistence field defined in the `SonataFlowPlatform` CR
Using the persistence configuration in the `SonataFlowPlatform` CR located in
the same namespace requires to have the `SonataFlow` CR persistence field
configured
to have an empty `{}` value, signaling the Operator to derive the persistence
from the active `SonataFlowPlatform`, when available. If no persistence is
defined
-the operator will fallback to the ephemeral persistence previously described.
+the operator will fall back to the ephemeral persistence previously described.
[source,yaml,subs="attributes+"]
----
diff --git
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/install-serverless-operator.adoc
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/install-serverless-operator.adoc
index 25e7e1555..b36115ff0 100644
---
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/install-serverless-operator.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/install-serverless-operator.adoc
@@ -11,7 +11,7 @@
:kubernetes_operator_uninstall_url:
https://olm.operatorframework.io/docs/tasks/uninstall-operator/
:operatorhub_url: https://operatorhub.io/
-This guide describes how to install the {operator_name} in a Kubernetes or
OpenShift cluster. The operator is in an
xref:/cloud/operator/known-issues.adoc[early development stage] (community
only) and has been tested on OpenShift {openshift_version_min}+, Kubernetes
{kubernetes_version}+, and link:{minikube_url}[Minikube].
+This guide describes how to install the {operator_name} in a Kubernetes or
OpenShift cluster. The operator is in an
xref:cloud/operator/known-issues.adoc[early development stage] (community only)
and has been tested on OpenShift {openshift_version_min}+, Kubernetes
{kubernetes_version}+, and link:{minikube_url}[Minikube].
.Prerequisites
* A Kubernetes or OpenShift cluster with admin privileges. Alternatively, you
can use Minikube or KIND.
diff --git
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/supporting-services.adoc
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/supporting-services.adoc
index de33f3862..1aacc7bb6 100644
---
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/supporting-services.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/supporting-services.adoc
@@ -6,7 +6,7 @@
// links
:kogito_serverless_operator_url:
https://github.com/apache/incubator-kie-kogito-serverless-operator/
-By default, workflows use an embedded version of
xref:../../data-index/data-index-core-concepts.adoc[Data Index]. This document
describes how to deploy supporting services, like Data Index, on a cluster
using the link:{kogito_serverless_operator_url}[{operator_name}].
+By default, workflows use an embedded version of
xref:data-index/data-index-core-concepts.adoc[Data Index]. This document
describes how to deploy supporting services, like Data Index, on a cluster
using the link:{kogito_serverless_operator_url}[{operator_name}].
[IMPORTANT]
====
@@ -125,7 +125,7 @@ These cluster-wide services can be overridden in any
namespace, by configuring t
== Additional resources
-* xref:../../data-index/data-index-service.adoc[]
+* xref:data-index/data-index-service.adoc[]
* xref:cloud/operator/enabling-jobs-service.adoc[]
* xref:cloud/operator/known-issues.adoc[]
diff --git
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/using-persistence.adoc
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/using-persistence.adoc
index 1c5323f65..a94ad32bd 100644
---
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/using-persistence.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/using-persistence.adoc
@@ -11,7 +11,7 @@ This document describes how to configure a SonataFlow
instance to use persistenc
Kubernetes's pods are stateless by definition. In some scenarios, this can be
a challenge for workloads that require maintaining the status of
the application regardless of the pod's lifecycle. In the case of
{product_name}, the context of the workflow is lost when the pod restarts.
-If your workflow requires recovery from such scenarios, you must to make these
additions to your workflow CR:
+If your workflow requires recovery from such scenarios, you have to make these
additions to your workflow CR:
Use the `persistence` field in the `SonataFlow` workflow spec to define the
database service located in the same cluster.
There are 2 ways to accomplish this:
diff --git
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/workflow-status-conditions.adoc
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/workflow-status-conditions.adoc
index 05df329ca..36f90a779 100644
---
a/serverlessworkflow/modules/ROOT/pages/cloud/operator/workflow-status-conditions.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/cloud/operator/workflow-status-conditions.adoc
@@ -80,7 +80,7 @@ The following table lists the possible Conditions.
| Running
| False
| AttemptToRedeployFailed
-| If the Workflow Deployment is not available, the operator will try to
rollout the Deployment three times before entering this stage. Check the
message in this Condition and the Workflow Pod logs for more info
+| If the Workflow Deployment is not available, the operator will try to roll
out the Deployment three times before entering this stage. Check the message in
this Condition and the Workflow Pod logs for more info
|===
diff --git
a/serverlessworkflow/modules/ROOT/pages/core/custom-functions-support.adoc
b/serverlessworkflow/modules/ROOT/pages/core/custom-functions-support.adoc
index 706ac7cc5..f299d1c80 100644
--- a/serverlessworkflow/modules/ROOT/pages/core/custom-functions-support.adoc
+++ b/serverlessworkflow/modules/ROOT/pages/core/custom-functions-support.adoc
@@ -586,7 +586,7 @@ This particular endpoint expects as body a JSON object
whose field `numbers` is
If `inputNumbers` contains `1`, `2`, and `3`, the output of the call will be
`1*3+2*3+3*3=18`.
-In case you want to specify headers in your HTTP request, you might do it by
adding arguments starting with the `HEADER_` prefix. Therefore if you add
`"HEADER_ce_id": "123"` to the previous argument set, you will be adding a
header named `ce_id` with the value `123` to your request. A similar approach
might be used to add query params to a GET request, in that case, you must add
arguments starting with the `QUERY_` prefix. Note that you can also use {}
notation for replacements of quer [...]
+In case you want to specify headers in your HTTP request, you might do it by
adding arguments starting with the `HEADER_` prefix. Therefore, if you add
`"HEADER_ce_id": "123"` to the previous argument set, you will be adding a
header named `ce_id` with the value `123` to your request. A similar approach
might be used to add query params to a GET request, in that case, you must add
arguments starting with the `QUERY_` prefix. Note that you can also use {}
notation for replacements of que [...]
For example, given the following function definition that performs a `get`
request
@@ -639,7 +639,7 @@ It must contain a Java class that inherits from
`WorkItemTypeHandler`. Its respo
+
The runtime project consists of a `WorkflowWorkItemHandler` implementation,
whose name must match the one provided to `WorkItemNodeFactory` during the
deployment phase, and a `WorkItemHandlerConfig` bean that registers that
handler with that name.
+
-When a Serverless Workflow function is called, Kogito identifies the proper
`WorkflowWorkItemHandler` instance to be used for that function type (using the
handler name associated with that type by the deployment project) and then
invokes the `internalExecute` method. The `Map` parameter contains the function
arguments defined in the workflow, and the `WorkItem` parameter contains the
metadata information added to the handler by the deployment project. Hence, the
`executeWorkItem` implem [...]
+When a Serverless Workflow function is called, Kogito identifies the proper
`WorkflowWorkItemHandler` instance to be used for that function type (using the
handler name associated with that type by the deployment project) and then
invokes the `internalExecute` method. The `Map` parameter contains the function
arguments defined in the workflow, and the `WorkItem` parameter contains the
metadata information added to the handler by the deployment project. Hence, the
`executeWorkItem` implem [...]
=== Custom function type example
@@ -666,7 +666,7 @@ The `operation` starts with `rpc`, which is the custom type
identifier, and cont
A Kogito addon that defines the `rpc` custom type must be developed for this
function definition to be identified. It consists of a
link:{kogito_sw_examples_url}/serverless-workflow-custom-type/serverless-workflow-custom-rpc-deployment[deployment
project] and a
link:{kogito_sw_examples_url}/serverless-workflow-custom-type/serverless-workflow-custom-rpc[runtime
project].
-The deployment project is responsible for extending the
link:{kogito_sw_examples_url}/serverless-workflow-custom-type/serverless-workflow-custom-rpc-deployment/src/main/java/org/kie/kogito/examples/sw/services/RPCCustomTypeHandler.java[`WorkItemTypeHandler`]
and setup the `WorkItemNodeFactory` as follows:
+The deployment project is responsible for extending the
link:{kogito_sw_examples_url}/serverless-workflow-custom-type/serverless-workflow-custom-rpc-deployment/src/main/java/org/kie/kogito/examples/sw/services/RPCCustomTypeHandler.java[`WorkItemTypeHandler`]
and setup of the `WorkItemNodeFactory` as follows:
.Example of the RPC function Java implementation
diff --git
a/serverlessworkflow/modules/ROOT/pages/core/defining-an-input-schema-for-workflows.adoc
b/serverlessworkflow/modules/ROOT/pages/core/defining-an-input-schema-for-workflows.adoc
index 1b3509ff5..119162c40 100644
---
a/serverlessworkflow/modules/ROOT/pages/core/defining-an-input-schema-for-workflows.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/core/defining-an-input-schema-for-workflows.adoc
@@ -25,7 +25,7 @@ In the previous definition, the `schema` property is a URI,
which holds the path
== Output schema
-Serverless Workflow specification does not support JSON output schema until
version 0.9. Therefore {product_name} is implementing it as a
link:{spec_doc_url}#extensions[Serverless Workflow specification extension].
Output schema is applied after workflow execution to verify that the output
model has the expected format. It is also useful for Swagger generation
purposes.
+Serverless Workflow specification does not support JSON output schema until
version 0.9. Therefore, {product_name} is implementing it as a
link:{spec_doc_url}#extensions[Serverless Workflow specification extension].
Output schema is applied after workflow execution to verify that the output
model has the expected format. It is also useful for Swagger generation
purposes.
Similar to Input schema, you must specify the URL to the JSON schema, using
`outputSchema` as follows:
diff --git
a/serverlessworkflow/modules/ROOT/pages/core/understanding-jq-expressions.adoc
b/serverlessworkflow/modules/ROOT/pages/core/understanding-jq-expressions.adoc
index d7cd35ade..53ec1ede3 100644
---
a/serverlessworkflow/modules/ROOT/pages/core/understanding-jq-expressions.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/core/understanding-jq-expressions.adoc
@@ -15,7 +15,7 @@ The workflow expressions in the
link:{spec_doc_url}#workflow-expressions[Serverl
This document describes the usage of jq expressions in functions, switch state
conditions, action function arguments, data filtering, and event publishing.
-JQ expression might be tricky to master, for non trivial cases, it is
recommended to use helper tools like link:{jq_play}[JQ Play] to validate the
expression before including it in the workflow file.
+JQ expressions might be tricky to master; for non-trivial cases, it is
recommended to use helper tools like link:{jq_play}[JQ Play] to validate the
expression before including it in the workflow file.
[[ref-example-jq-expression-function]]
== Example of jq expression in functions
diff --git
a/serverlessworkflow/modules/ROOT/pages/getting-started/java-embedded-workflows.adoc
b/serverlessworkflow/modules/ROOT/pages/getting-started/java-embedded-workflows.adoc
index c312a70e1..868f9864a 100644
---
a/serverlessworkflow/modules/ROOT/pages/getting-started/java-embedded-workflows.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/getting-started/java-embedded-workflows.adoc
@@ -1,7 +1,7 @@
= Workflow embedded execution in Java
This guide uses a standard Java virtual machine and a small set of Maven
dependencies to execute a link:{spec_doc_url}[CNCF Serverless Workflow]
definition. Therefore, it is assumed you are fluent both in Java and Maven.
-The workflow definition to be executed can be read from a `.json` or `.yaml`
file or programmatically defined using the {product_name} fluent API.
+The workflow definition to be executed can be read from a `.json` or `.yaml`
file or programmatically defined using the {product_name} fluent API.
.Prerequisites
. Install https://openjdk.org/[OpenJDK] {java_min_version}
. Install https://maven.apache.org/index.html[Apache Maven]
{maven_min_version}.
@@ -42,7 +42,7 @@ public class DefinitionFileExecutor {
<1> Reads the workflow file definition from the project root directory
<2> Creates a static workflow application object. It is done within the try
block since the instance is `Closeable`. This is the reference that allows you
to execute workflow definitions.
<3> Reads the Serverless Workflow Java SDK `Workflow` object from the file.
-<4> Execute the workflow, passing `Workflow` reference and no parameters (an
empty Map). The result of the workflow execution: process instance id and
workflow output model, can accessed using `result` variable.
+<4> Execute the workflow, passing `Workflow` reference and no parameters (an
empty Map). The result of the workflow execution: process instance id and
workflow output model, can be accessed using `result` variable.
<5> Prints the workflow model in the configured standard output.
If you compile and execute this Java class, you will see the following log in
your configured standard output:
diff --git
a/serverlessworkflow/modules/ROOT/pages/getting-started/preparing-environment.adoc
b/serverlessworkflow/modules/ROOT/pages/getting-started/preparing-environment.adoc
index 73a1feb38..def32643a 100644
---
a/serverlessworkflow/modules/ROOT/pages/getting-started/preparing-environment.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/getting-started/preparing-environment.adoc
@@ -6,7 +6,7 @@ If you are new, start with the minimal one.
[[proc-minimal-local-environment-setup]]
== Minimal local environment setup
-Recommended steps to setup your local development environment. By completing
these steps you are able to
+Recommended steps to set up your local development environment. By completing
these steps you are able to
start the development on your local machine using our guides.
.Procedure
@@ -26,7 +26,7 @@ If you have used
https://knative.dev/docs/install/quickstart-install/[Knative us
Please note, that if the knative quickstart procedure is not used, you need to
install Knative Serving and Eventing manually. See
<<proc-additional-options-for-local-environment>>.
-.To startup the selected cluster without quickstart, use the following command:
+.To start up the selected cluster without quickstart, use the following
command:
[tabs]
====
Minikube with Docker::
@@ -85,7 +85,7 @@ If you are interested in our Java and Quarkus development
path, consider complet
.Procedure
. Install https://openjdk.org/[OpenJDK] {java_min_version} and configure
`JAVA_HOME` appropriately by adding it to the `PATH`.
. Install https://maven.apache.org/index.html[Apache Maven]
{maven_min_version}.
-. Install https://quarkus.io/guides/cli-tooling[Quarkus CLI] corresponding to
the currently supported version by {product_name}. Currently it is
{quarkus_version}.
+. Install https://quarkus.io/guides/cli-tooling[Quarkus CLI] corresponding to
the currently supported version by {product_name}. Currently, it is
{quarkus_version}.
[[proc-additional-options-for-local-environment]]
== Additional options for local environment setup
diff --git a/serverlessworkflow/modules/ROOT/pages/index.adoc
b/serverlessworkflow/modules/ROOT/pages/index.adoc
index 2797c1ff4..5d72fb764 100644
--- a/serverlessworkflow/modules/ROOT/pages/index.adoc
+++ b/serverlessworkflow/modules/ROOT/pages/index.adoc
@@ -24,7 +24,7 @@ Learn about the recommended environment to start your journey
with {product_name
[.card-title]
xref:getting-started/preparing-environment.adoc[]
[.card-description]
-An all-in-one guide to prepare you environment for you to be uncover the full
potential of {product_name} guides on your local environment.
+An all-in-one guide to prepare your environment for you to uncover the full
potential of {product_name} guides on your local environment.
--
[.card]
@@ -59,7 +59,7 @@ An all-in-one starting guide. Learn how to create, run &
deploy your first {prod
[.card-title]
xref:getting-started/java-embedded-workflows.adoc[]
[.card-description]
-Learn about how to executed your workflows (existing files or define them
programatically) using Java and Maven.
+Learn about how to execute your workflows (existing files or define them
programmatically) using Java and Maven.
--
[.card-section]
@@ -177,7 +177,7 @@ Learn how to use {serverless_logic_web_tools_name} for
creating and managing wor
[.card-title]
xref:tooling/serverless-workflow-editor/swf-editor-chrome-extension.adoc[Chrome
extension for Serverless Workflow editor on GitHub]
[.card-description]
-Learn how to install and use the Chrome extension for Serverless Workflow
editor to view and edit workflows directly in Github.
+Learn how to install and use the Chrome extension for Serverless Workflow
editor to view and edit workflows directly in GitHub.
--
[.card-section]
@@ -322,7 +322,7 @@ Go deeper in details about Data Index as standalone service
deployment.
== Use Cases
Collection of guides showcasing core concepts of {product_name} or providing a
solution to a specific problem in our domain.
-In the `Advanced Developer Use Cases` section, you can find guides that use
Java and Quarkus to create {product_name} applications. These guides allow
users to vastly customize their applications depending on their use case. Good
undertsanding and knowledge of these technologies is expected.
+In the `Advanced Developer Use Cases` section, you can find guides that use
Java and Quarkus to create {product_name} applications. These guides allow
users to vastly customize their applications depending on their use case. Good
understanding and knowledge of these technologies is expected.
[.card-section]
== Advanced Developer Use Cases
diff --git
a/serverlessworkflow/modules/ROOT/pages/integrations/core-concepts.adoc
b/serverlessworkflow/modules/ROOT/pages/integrations/core-concepts.adoc
index c8e1a3e79..b8454c38f 100644
--- a/serverlessworkflow/modules/ROOT/pages/integrations/core-concepts.adoc
+++ b/serverlessworkflow/modules/ROOT/pages/integrations/core-concepts.adoc
@@ -1,7 +1,7 @@
= Introduction
-This guides describes the possibilities of workflow services integrations.
-Currently we showcase these in advanced development guides. See additional
resources.
+This guide describes the possibilities of workflow services integrations.
+Currently, we showcase these in advanced development guides. See additional
resources.
== Additional resources
diff --git
a/serverlessworkflow/modules/ROOT/pages/job-services/core-concepts.adoc
b/serverlessworkflow/modules/ROOT/pages/job-services/core-concepts.adoc
index 1cd2fdf6a..d5cce1902 100644
--- a/serverlessworkflow/modules/ROOT/pages/job-services/core-concepts.adoc
+++ b/serverlessworkflow/modules/ROOT/pages/job-services/core-concepts.adoc
@@ -537,7 +537,7 @@ Using environment variables::
|`QUARKUS_PROFILE`
|Set the quarkus profile with the value `kafka-events_support` to enable the
kafka messaging based Job Service Eventing API.
-|By default the kafka eventing api is disabled.
+|By default, the kafka eventing api is disabled.
|`KOGITO_JOBS_SERVICE_KAFKA_JOB_STATUS_CHANGE_EVENTS`
|`true` to establish if the Job Status Change events must be propagated.
@@ -566,7 +566,7 @@ Using system properties with java like names::
|quarkus.profile
|Set the quarkus profile with the value `kafka-events_support` to enable the
kafka messaging based Job Service Eventing API.
-|By default the kafka eventing api is disabled.
+|By default, the kafka eventing api is disabled.
|`kogito.jobs-service.kafka.job-status-change-events`
|`true` to establish if the Job Status Change events must be propagated.
diff --git
a/serverlessworkflow/modules/ROOT/pages/security/orchestrating-third-party-services-with-oauth2.adoc
b/serverlessworkflow/modules/ROOT/pages/security/orchestrating-third-party-services-with-oauth2.adoc
index e5af5d57d..1347fcce0 100644
---
a/serverlessworkflow/modules/ROOT/pages/security/orchestrating-third-party-services-with-oauth2.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/security/orchestrating-third-party-services-with-oauth2.adoc
@@ -39,7 +39,7 @@ When you use the Acme Financial Services, you can query the
exchange rates using
* Orchestration with services provided by Acme and currency exchange
calculations.
* Authentication requirements to access the services provided by Acme.
-* Potential vendor lock-in problems, in case you want to change the provider
in future.
+* Potential vendor lock-in problems, in case you want to change the provider
in the future.
* Domain-specific validations and optimizations.
The following sections describe how an end-to-end solution is created in the
`serverless-workflow-oauth2-orchestration-quarkus` example application. To see
the source code of `serverless-workflow-oauth2-orchestration-quarkus` example
application, you can clone the
link:{kogito_examples_repository_url}[kogito-examples] repository in GitHub and
select the
`serverless-workflow-examples/serverless-workflow-oauth2-orchestration-quarkus`
directory.
diff --git
a/serverlessworkflow/modules/ROOT/pages/service-orchestration/configuring-openapi-services-endpoints.adoc
b/serverlessworkflow/modules/ROOT/pages/service-orchestration/configuring-openapi-services-endpoints.adoc
index ab8568f2f..6e83474ee 100644
---
a/serverlessworkflow/modules/ROOT/pages/service-orchestration/configuring-openapi-services-endpoints.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/service-orchestration/configuring-openapi-services-endpoints.adoc
@@ -115,7 +115,7 @@ A Kubernetes service endpoint can be used as a service URL
if the target service
=== Using URI alias
-As an alternative to `kogito.sw.operationIdStrategy`, you can assign an alias
name to an URI by using `workflow-uri-definitions` custom
link:{spec_doc_url}#extensions[extension]. Then you can use that alias as
configuration key and in function definitions.
+As an alternative to `kogito.sw.operationIdStrategy`, you can assign an alias
name to a URI by using `workflow-uri-definitions` custom
link:{spec_doc_url}#extensions[extension]. Then you can use that alias as a
configuration key and in function definitions.
.Example workflow
[source,json]
diff --git
a/serverlessworkflow/modules/ROOT/pages/testing-and-troubleshooting/quarkus-dev-ui-extension/quarkus-dev-ui-workflow-definition-page.adoc
b/serverlessworkflow/modules/ROOT/pages/testing-and-troubleshooting/quarkus-dev-ui-extension/quarkus-dev-ui-workflow-definition-page.adoc
index 345de35b5..4ab20697c 100644
---
a/serverlessworkflow/modules/ROOT/pages/testing-and-troubleshooting/quarkus-dev-ui-extension/quarkus-dev-ui-workflow-definition-page.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/testing-and-troubleshooting/quarkus-dev-ui-extension/quarkus-dev-ui-workflow-definition-page.adoc
@@ -40,7 +40,7 @@ The {product_name} Dev UI extension allows you to use both
mechanisms.
=== Starting new Workflow instances using REST
If you want to start a new workflow instance using the workflow REST endpoint,
just click on the *Start new Workflow*
button of any of the workflows in the *Workflow Definitions* table, then you'll
be redirected to the *Start New Workflow*
-page where you could setup the data and Business Key that will be used to
start the new workflow instance.
+page where you can set up the data and Business Key that will be used to
start the new workflow instance.
=== Filling up the Workflow data
Depending on your workflow configuration, the page can provide different
mechanisms to fill the workflow data.
@@ -57,7 +57,7 @@
image::testing-and-troubleshooting/quarkus-dev-ui-extension/kogito-swf-tools-sta
[NOTE]
====
-For more information about how to setup the Input Schema Definition on your
{product_name}, please take a look at the
+For more information about how to set up the Input Schema Definition on your
{product_name}, please take a look at the
xref:core/defining-an-input-schema-for-workflows.adoc[Input Schema for
{product_name}] section.
====
@@ -67,13 +67,13 @@ If the *Business Key* field is blank, then an
auto-generated business key is def
=== Starting the new Workflow instance
Clicking the *Start* button will POST the workflow data and the Business
Key to the workflow REST endpoint. If the
-workflow instance starts successfully, a success alert appears on the top of
the screen, which contains the
+workflow instance starts successfully, a success alert appears at the top of
the screen, which contains the
*Go to workflow list* link to navigate to the
xref:testing-and-troubleshooting/quarkus-dev-ui-extension/quarkus-dev-ui-workflow-instances-page.adoc[Workflow
Instances page].
.Example of workflow successful starting notification
image::testing-and-troubleshooting/quarkus-dev-ui-extension/kogito-swf-tools-start-workflow-success-alert.png[]
-If there is an issue while starting a workflow, then a failure alert appears
on the top of the screen, containing the*View Details* and *Go to workflow
list* options. The *View Details* enables you to view the error message.
+If there is an issue while starting a workflow, then a failure alert appears
at the top of the screen, containing the *View Details* and *Go to workflow
list* options. The *View Details* enables you to view the error message.
.Example of workflow starting failure notification
image::testing-and-troubleshooting/quarkus-dev-ui-extension/kogito-swf-tools-start-workflow-fail-alert.png[]
@@ -95,13 +95,13 @@ Once there, you will have to fill out the form with the
Cloud Event information:
.Starting a workflow using a cloud event
image::testing-and-troubleshooting/quarkus-dev-ui-extension/kogito-swf-tools-trigger-cloud-events.png[]
-Click the *Trigger* button to trigger the cloud event. If the workflow
instance starts successfully, a success alert appears on the top of the screen,
which contains the
+Click the *Trigger* button to trigger the cloud event. If the workflow
instance starts successfully, a success alert appears at the top of the screen,
which contains the
*Go to workflow list* link to navigate to the
xref:testing-and-troubleshooting/quarkus-dev-ui-extension/quarkus-dev-ui-workflow-instances-page.adoc[Workflow
Instances page].
.Example of workflow successful starting notification
image::testing-and-troubleshooting/quarkus-dev-ui-extension/kogito-swf-tools-trigger-cloud-event-start-success-alert.png[]
-If there is an issue while starting a workflow, then a failure alert appears
on the top of the screen, containing *View Details* and *Go to workflow list*
options. The *View Details* enables you to view the error message.
+If there is an issue while starting a workflow, then a failure alert appears
at the top of the screen, containing *View Details* and *Go to workflow list*
options. The *View Details* enables you to view the error message.
.Example of trigger workflow failure alert
image::testing-and-troubleshooting/quarkus-dev-ui-extension/kogito-swf-tools-trigger-cloud-event-start-error-alert.png[]
diff --git
a/serverlessworkflow/modules/ROOT/pages/testing-and-troubleshooting/quarkus-dev-ui-extension/quarkus-dev-ui-workflow-instances-page.adoc
b/serverlessworkflow/modules/ROOT/pages/testing-and-troubleshooting/quarkus-dev-ui-extension/quarkus-dev-ui-workflow-instances-page.adoc
index 5716dbe40..5374a664d 100644
---
a/serverlessworkflow/modules/ROOT/pages/testing-and-troubleshooting/quarkus-dev-ui-extension/quarkus-dev-ui-workflow-instances-page.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/testing-and-troubleshooting/quarkus-dev-ui-extension/quarkus-dev-ui-workflow-instances-page.adoc
@@ -121,8 +121,8 @@ Once there, you will have to fill out the form with the
Cloud Event information:
.Sending a Cloud Event to an active workflow instance.
image::testing-and-troubleshooting/quarkus-dev-ui-extension/kogito-swf-tools-workflow-instances-cloud-event.png[]
-Additionally, you can use the *Send Cloud Event* action present available on
the instance actions kebab. By using it you
-will be lead to the *Trigger Cloud Event* page, but in this case the *Instance
Id* field will be already filled with
+Additionally, you can use the *Send Cloud Event* action available on
the instance actions kebab. By using it, you
+will be led to the *Trigger Cloud Event* page, but in this case the *Instance
Id* field will be already filled with
the selected workflow id.
.*Send Cloud Event* button in the actions kebab.
diff --git
a/serverlessworkflow/modules/ROOT/pages/tooling/serverless-logic-web-tools/serverless-logic-web-tools-github-integration.adoc
b/serverlessworkflow/modules/ROOT/pages/tooling/serverless-logic-web-tools/serverless-logic-web-tools-github-integration.adoc
index 86cb95f4d..5a0eb2107 100644
---
a/serverlessworkflow/modules/ROOT/pages/tooling/serverless-logic-web-tools/serverless-logic-web-tools-github-integration.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/tooling/serverless-logic-web-tools/serverless-logic-web-tools-github-integration.adoc
@@ -17,7 +17,7 @@ You can generate a token from your GitHub account and add the
token to the {serv
* You have an account in GitHub.
.Procedure
-. Go to
link:{serverless_logic_web_tools_url}[{serverless_logic_web_tools_name}] web
application, and click the *Cogwheel* (⚙️) on the top-right corner of the
screen.
+. Go to
link:{serverless_logic_web_tools_url}[{serverless_logic_web_tools_name}] web
application, and click the *Cogwheel* (⚙️) in the top-right corner of the
screen.
. Go to the *GitHub* tab.
. In the *GitHub* tab, click the *Add access token* button and a window will
be shown.
. Click *Create a new token* option.
@@ -43,7 +43,7 @@ For more information, see
<<proc-setting-github-token-serverless-logic-web-tools
.Procedure
. In the {serverless_logic_web_tools_name} web application, create or open a
workspace.
. Add or edit the existing files in the workspace.
-. Click *Share -> Github: Create Repository*.
+. Click *Share -> GitHub: Create Repository*.
. Name your repository and set the repository as *Public* or *Private*.
. (Optional) Select the *Use Quarkus Accelerator* to create a repository with
a base Quarkus project and move the workspace files to `src/main/resources`
folder.
+
diff --git
a/serverlessworkflow/modules/ROOT/pages/tooling/serverless-logic-web-tools/serverless-logic-web-tools-openshift-integration.adoc
b/serverlessworkflow/modules/ROOT/pages/tooling/serverless-logic-web-tools/serverless-logic-web-tools-openshift-integration.adoc
index 914f90ee9..59d039f92 100644
---
a/serverlessworkflow/modules/ROOT/pages/tooling/serverless-logic-web-tools/serverless-logic-web-tools-openshift-integration.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/tooling/serverless-logic-web-tools/serverless-logic-web-tools-openshift-integration.adoc
@@ -34,7 +34,7 @@ A new page opens containing your new API token along with `oc
cli` login command
image::tooling/serverless-logic-web-tools/serverless-logic-web-tools-openshift-info.png[]
--
-. Go to the {serverless_logic_web_tools_name} web application, click the
*Cogwheel* (⚙️) on the top-right corner and go to the *OpenShift* tab.
+. Go to the {serverless_logic_web_tools_name} web application, click the
*Cogwheel* (⚙️) in the top-right corner and go to the *OpenShift* tab.
. Click the *Add connection* button and a window will be shown.
. Enter your OpenShift project name in the *Namespace (project)* field.
. Enter the copied value of the `--server` flag in the *Host* field.
diff --git
a/serverlessworkflow/modules/ROOT/pages/tooling/serverless-logic-web-tools/serverless-logic-web-tools-redhat-application-services-integration.adoc
b/serverlessworkflow/modules/ROOT/pages/tooling/serverless-logic-web-tools/serverless-logic-web-tools-redhat-application-services-integration.adoc
index f1e13dfd4..981670ebf 100644
---
a/serverlessworkflow/modules/ROOT/pages/tooling/serverless-logic-web-tools/serverless-logic-web-tools-redhat-application-services-integration.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/tooling/serverless-logic-web-tools/serverless-logic-web-tools-redhat-application-services-integration.adoc
@@ -35,7 +35,7 @@ A modal displaying your *Client ID* and *Client Secret*
appears.
--
. If you already have a service account, find your *Client ID* and *Client
Secret*.
-. In the {serverless_logic_web_tools_name}, click the *Cogwheel* (⚙️) on the
top-right corner and go to the *Service Account* tab.
+. In the {serverless_logic_web_tools_name}, click the *Cogwheel* (⚙️) in the
top-right corner and go to the *Service Account* tab.
. Click on the *Add service account* button and a window will be shown.
. Enter your *Client ID* and *Client Secret* in the respective fields.
. Click *Apply*.
@@ -78,7 +78,7 @@ You must select the role as Manager or Administrator to have
the read and write
====
.. Click *Save*.
-.. Click on the menu on the top-right corner of the screen.
+.. Click on the menu in the top-right corner of the screen.
.. Click *Connection*.
+
A drawer opens containing the required connection and authentication
information.
@@ -87,7 +87,7 @@ A drawer opens containing the required connection and
authentication information
--
. If you already have a Service Registry, find the value of *Core Registry
API* of your Service Registry.
-. In the {serverless_logic_web_tools_name} web application, click the
*Cogwheel* (⚙️) on the top-right corner and go to the *Service Registry* tab.
+. In the {serverless_logic_web_tools_name} web application, click the
*Cogwheel* (⚙️) in the top-right corner and go to the *Service Registry* tab.
. Click on the *Add service registry* button and a window will be shown.
. Enter a name for your registry.
+
diff --git
a/serverlessworkflow/modules/ROOT/pages/tooling/serverless-workflow-editor/swf-editor-chrome-extension.adoc
b/serverlessworkflow/modules/ROOT/pages/tooling/serverless-workflow-editor/swf-editor-chrome-extension.adoc
index b8a8884f1..e2b75588b 100644
---
a/serverlessworkflow/modules/ROOT/pages/tooling/serverless-workflow-editor/swf-editor-chrome-extension.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/tooling/serverless-workflow-editor/swf-editor-chrome-extension.adoc
@@ -57,7 +57,7 @@ For more information, see
<<proc-install-chrome-extension-sw-editor, Installing
image::tooling/serverless-workflow-editor/swf-editor-in-github-readonly.png[]
--
-. To change the read-only mode to edit mode, click the pencil icon on the
top-right corner of the screen.
+. To change the read-only mode to edit mode, click the pencil icon in the
top-right corner of the screen.
+
--
.Serverless Workflow file in GitHub(edit mode)
diff --git
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/callbacks/callback-state-example.adoc
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/callbacks/callback-state-example.adoc
index cbc3a6680..7b2c0dc67 100644
---
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/callbacks/callback-state-example.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/callbacks/callback-state-example.adoc
@@ -105,7 +105,7 @@ The `serverless-workflow-callback-quarkus` example
application requires an exter
Apache Kafka uses topics to publish or consume messages. In the
`serverless-workflow-callback-quarkus` example application, two topics are
used, matching the names of the CloudEvent types that are defined in the
workflow, such as `resume` and `wait`. The `resume` and `wait` CloudEvent types
are configured in the
link:{kogito_sw_examples_url}/serverless-workflow-callback-quarkus/src/main/resources/application.properties[`application.properties`]
file.
-For more information about using Apache Kafka with events, see
link:xref:use-cases/advanced-developer-use-cases/event-orchestration/consume-producing-events-with-kafka.adoc[Consuming
and producing events using Apache Kafka].
+For more information about using Apache Kafka with events, see
xref:use-cases/advanced-developer-use-cases/event-orchestration/consume-producing-events-with-kafka.adoc[Consuming
and producing events using Apache Kafka].
--
+
diff --git
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/data-index/data-index-as-quarkus-dev-service.adoc
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/data-index/data-index-as-quarkus-dev-service.adoc
index 54e4a5a7b..f093e6d1b 100644
---
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/data-index/data-index-as-quarkus-dev-service.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/data-index/data-index-as-quarkus-dev-service.adoc
@@ -66,7 +66,7 @@ The following table serves as a quick reference for commonly
{data_index_ref} co
| Yes
|`QUARKUS_DATASOURCE_DB_KIND`
-a|The kind of database to connect: `postgresql`,..
+a|The kind of database to connect: `postgresql`, ...
|string
|
|Yes
diff --git
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/deployments/deploying-on-openshift.adoc
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/deployments/deploying-on-openshift.adoc
index a6e8cc5ae..5163c43ae 100644
---
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/deployments/deploying-on-openshift.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/deployments/deploying-on-openshift.adoc
@@ -96,7 +96,7 @@ After checking the prerequisites, prepare the project that
will be used to deplo
[TIP]
====
-You can use the
link:build-workflow-image-with-quarkus-cli.html#proc-building-serverless-workflow-application-using-native-image[native
image] for a faster startup.
+You can use the
xref:use-cases/advanced-developer-use-cases/getting-started/build-workflow-image-with-quarkus-cli.adoc#proc-building-serverless-workflow-application-using-native-image[native
image] for a faster startup.
====
@@ -160,7 +160,7 @@ You can read further the next sections which explain
different approaches to dep
[NOTE]
====
-In the next steps you will notice the value **{k8s_registry}** being used. It
is the internal OpenShift's registry address where the images of the
deployments will pulled from. Note that, the Container Image pushed in the
previous step will be queried as
`{k8s_registry}/{default_namespace}/serverless-workflow-greeting-quarkus:1.0`
+In the next steps you will notice the value **{k8s_registry}** being used. It
is the address of the internal OpenShift registry where the deployment images
will be pulled from. Note that the Container Image pushed in the
previous step will be queried as
`{k8s_registry}/{default_namespace}/serverless-workflow-greeting-quarkus:1.0`
====
* <<proc-deploy-sw-application-knative-cli,Using Knative CLI (kn)>>
diff --git
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/getting-started/build-workflow-image-with-quarkus-cli.adoc
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/getting-started/build-workflow-image-with-quarkus-cli.adoc
index e82b974b5..2fce8322b 100644
---
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/getting-started/build-workflow-image-with-quarkus-cli.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/getting-started/build-workflow-image-with-quarkus-cli.adoc
@@ -13,7 +13,7 @@ This document describes how to build a {product_name}
container image using the
.Prerequisites
include::./../../../../pages/_common-content/getting-started-requirement-quarkus.adoc[]
-* You have setup your environment according to
xref:getting-started/preparing-environment.adoc#proc-advanced-local-environment-setup[advanced
environment setup] guide and you cluster is ready.
+* You have set up your environment according to the
xref:getting-started/preparing-environment.adoc#proc-advanced-local-environment-setup[advanced
environment setup] guide and your cluster is ready.
* Optionally, GraalVM {graalvm_min_version} is installed. See
xref:getting-started/preparing-environment.adoc#proc-additional-options-for-local-environment[]
Quarkus provides a few extensions to build container images, such as `Jib`,
`docker`, `s2i`, and `buildpacks`. For more information about the Quarkus
extensions, see the link:{quarkus_container_images_url}[Quarkus documentation].
diff --git
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/getting-started/create-your-first-workflow-service.adoc
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/getting-started/create-your-first-workflow-service.adoc
index 899e3a217..ed51dc422 100644
---
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/getting-started/create-your-first-workflow-service.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/getting-started/create-your-first-workflow-service.adoc
@@ -18,7 +18,7 @@ This document describes how to create a workflow application
that serves a `hell
image::getting-started/hello-world-workflow.png[]
.Prerequisites
-* You have setup your environment according to
xref:getting-started/preparing-environment.adoc#proc-advanced-local-environment-setup[advanced
environment setup] guide and you cluster is ready.
+* You have set up your environment according to the
xref:getting-started/preparing-environment.adoc#proc-advanced-local-environment-setup[advanced
environment setup] guide and your cluster is ready.
For more information about the tooling and the required dependencies, see
xref:getting-started/getting-familiar-with-our-tooling.adoc[Getting familiar
with {product_name} tooling].
diff --git
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/getting-started/working-with-serverless-workflow-quarkus-examples.adoc
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/getting-started/working-with-serverless-workflow-quarkus-examples.adoc
index 4f48bfa1d..58ba1e59c 100644
---
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/getting-started/working-with-serverless-workflow-quarkus-examples.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/getting-started/working-with-serverless-workflow-quarkus-examples.adoc
@@ -12,7 +12,7 @@
This document describes how to work with {product_name} example applications.
.Prerequisites
-* You have setup your environment according to
xref:getting-started/preparing-environment.adoc#proc-advanced-local-environment-setup[advanced
environment setup] guide and you cluster is ready.
+* You have set up your environment according to the
xref:getting-started/preparing-environment.adoc#proc-advanced-local-environment-setup[advanced
environment setup] guide and your cluster is ready.
[[proc-using-example-application]]
== Using an example application
diff --git
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/index.adoc
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/index.adoc
index 6f80ad58a..d98a456bd 100644
---
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/index.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/index.adoc
@@ -5,7 +5,7 @@
:keywords: cloud, kubernetes, docker, image, podman, openshift, pipelines
// other
-.Prerequsites
-* You have setup your environment according to
xref:getting-started/preparing-environment.adoc#proc-advanced-local-environment-setup[advanced
environment setup] guide and you cluster is ready.
+.Prerequisites
+* You have set up your environment according to the
xref:getting-started/preparing-environment.adoc#proc-advanced-local-environment-setup[advanced
environment setup] guide and your cluster is ready.
{product_name} allows developers to implement workflow applications for
advanced use cases using Quarkus and Java.
\ No newline at end of file
diff --git
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/integrations/custom-functions-knative.adoc
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/integrations/custom-functions-knative.adoc
index 793ca2c5a..bb6843351 100644
---
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/integrations/custom-functions-knative.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/integrations/custom-functions-knative.adoc
@@ -9,7 +9,7 @@
This document describes how to call Knative services using {product_name}
custom functions. The procedure described in this document is based on the
link:{kogito_sw_examples_url}/serverless-workflow-custom-function-knative[`serverless-workflow-custom-function-knative`]
example application.
-For more details about the Knative custom function, see
xref:core/custom-functions-support.adoc#knative-custom-function[Custom
functions for your {product_name} service].
+For more details about the Knative custom function, see
xref:core/custom-functions-support.adoc#con-func-knative[Custom functions for
your {product_name} service].
.Prerequisites
diff --git
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/integrations/custom-functions-python.adoc
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/integrations/custom-functions-python.adoc
index c22ce81e8..ef7308941 100644
---
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/integrations/custom-functions-python.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/integrations/custom-functions-python.adoc
@@ -84,7 +84,7 @@ The following example defines a function that invokes a
standard Python function
}
----
-Once you have defined the function, you might call it passing the expected
arguments. In the case of factorial, a integer stored in property `x` of the
workflow model.
+Once you have defined the function, you can call it, passing the expected
arguments. In the case of factorial, an integer stored in property `x` of the
workflow model.
[source,json]
----
diff --git
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/timeouts/timeout-showcase-example.adoc
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/timeouts/timeout-showcase-example.adoc
index 9cf54752a..48a309dbf 100644
---
a/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/timeouts/timeout-showcase-example.adoc
+++
b/serverlessworkflow/modules/ROOT/pages/use-cases/advanced-developer-use-cases/timeouts/timeout-showcase-example.adoc
@@ -5,7 +5,7 @@
:description: Timeouts showcase in Serverless Workflow
:keywords: kogito, workflow, serverless, timer, timeout
-The timeouts showcase is designed to show how to configure and execute
workflows that use timeouts, according to different deployment scenarios.
+The timeouts showcase is designed to show how to configure and execute
workflows that use timeouts, according to the different deployment scenarios.
While all the scenarios contain the same set of workflows, they are provided
as independent example projects, to facilitate the execution and understanding
of each case.
The following workflows are provided:
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]