This is an automated email from the ASF dual-hosted git repository.

davsclaus pushed a commit to branch openai-article
in repository https://gitbox.apache.org/repos/asf/camel-website.git
commit 57ff112b5e08f30f4a1d419289c7a5fba5037d61
Author: Claus Ibsen <[email protected]>
AuthorDate: Tue Jan 13 14:36:37 2026 +0100

    Reformat blog
---
 content/blog/2026/01/camel-openai/index.md | 128 ++++++++++++++++++-----------
 1 file changed, 80 insertions(+), 48 deletions(-)

diff --git a/content/blog/2026/01/camel-openai/index.md b/content/blog/2026/01/camel-openai/index.md
index 0884e68f..617a7e4a 100644
--- a/content/blog/2026/01/camel-openai/index.md
+++ b/content/blog/2026/01/camel-openai/index.md
@@ -3,26 +3,34 @@ title: "Simple LLM Integration with Camel OpenAI Component"
 date: 2026-01-13
 draft: false
 authors: [ ibek ]
-categories: ["Camel", "AI"]
+categories: [ "Camel", "AI" ]
 preview: "A deep dive into the new camel-openai component for chat completion with OpenAI and OpenAI-compatible APIs"
 ---
 
-The integration of Large Language Models (LLMs) into enterprise applications has become increasingly important. Whether you're building intelligent document processing pipelines, automated customer support systems, or data privacy solutions, the ability to seamlessly connect your integration flows with AI capabilities is essential.
+The integration of Large Language Models (LLMs) into enterprise applications has become increasingly important. Whether
+you're building intelligent document processing pipelines, automated customer support systems, or data privacy
+solutions, the ability to seamlessly connect your integration flows with AI capabilities is essential.
 
-Apache Camel 4.17 introduces the new `camel-openai` component, providing native integration with OpenAI and OpenAI-compatible APIs for chat completion. In this post, we'll explore the component's features and demonstrate a practical use case: Personal Identifiable Information (PII) redaction using structured output.
+Apache Camel 4.17 introduces the new `camel-openai` component, providing native integration with OpenAI and
+OpenAI-compatible APIs for chat completion. In this post, we'll explore the component's features and demonstrate a
+practical use case: Personally Identifiable Information (PII) redaction using structured output.
 
 ## Component Overview
 
-The `camel-openai` component leverages the official [openai-java SDK](https://github.com/openai/openai-java) to provide robust integration with OpenAI's chat completion API. It offers a lightweight alternative to LangChain4j and Spring AI for straightforward use cases, eliminating the need to instantiate `ChatModel` objects. Here are the key features:
+The `camel-openai` component leverages the official [openai-java SDK](https://github.com/openai/openai-java) to provide
+robust integration with OpenAI's chat completion API. It offers a lightweight alternative to LangChain4j and Spring AI
+for straightforward use cases, eliminating the need to instantiate `ChatModel` objects. Here are the key features:
 
 - **Chat Completion**: Send prompts and receive AI-generated responses
 - **Structured Output**: Get responses in a specific format using `outputClass` or `jsonSchema`
 - **Streaming Responses**: Process responses chunk by chunk for real-time applications
 - **Conversation Memory**: Maintain context across multiple exchanges within a route
-- **Multi-modal Input**: Support for text files and images with vision-capable models
-- **OpenAI-Compatible APIs**: Works with OpenAI, Google Vertex AI, Mistral, Groq, vLLM, NVIDIA NIM, Ollama, Llama.cpp server, and other providers
+- **Multi-Modal Input**: Support for text files and images with vision-capable models
+- **OpenAI-Compatible APIs**: Works with OpenAI, Google Vertex AI, Mistral, Groq, vLLM, NVIDIA NIM, Ollama, Llama.cpp
+  server, and other providers
 
-**Note**: This component is not designed for autonomous agentic workflows. It explicitly excludes function calling (tools). For complex agent architectures, we recommend using LangChain4j or Spring AI.
+**Note**: This component is not designed for autonomous agentic workflows. It explicitly excludes function calling
+(tools). For complex agent architectures, we recommend using LangChain4j or Spring AI.
 
 ## Getting Started
 
@@ -67,7 +75,9 @@ The component requires setting the API key using, `apiKey` parameter, or `OPENAI
 
 ## PII Redaction Example: Structured Output in Action
 
-Let's explore a practical example that demonstrates the power of structured output with JSON schemas. This example uses an LLM to identify and redact Personal Identifiable Information from text, returning results in a structured JSON format.
+Let's explore a practical example that demonstrates the power of structured output with JSON schemas. This example uses
+an LLM to identify and redact Personally Identifiable Information from text, returning results in a structured JSON
+format.
 
 ### The Route Definition
 
@@ -99,10 +109,11 @@ Let's explore a practical example that demonstrates the power of structured outp
       - to: "stream:out"
 ```
 
-Note: We should set the temperature because in 4.17.0 release, temperature default is being set to 1.0 (this will be removed to respect inference server's configuration in the next release)
+NOTE: We set the temperature explicitly because the 4.17.0 release applies a default temperature of 1.0 (this default
+will be removed in the next release so that the inference server's own configuration is respected)
 
 The route uses:
-- **Low temperature (0.15)**: For consistent, deterministic responses
+- **Low Temperature (0.15)**: For consistent, deterministic responses
 - **JSON Schema**: Enforces the structure of the output
 - **System Message**: Instructs the model on its role and behavior
 
@@ -112,40 +123,58 @@ The `pii.schema.json` defines the expected output structure:
 
 ```json
 {
-    "type": "object",
-    "properties": {
-        "detectedPII": {
-            "type": "array",
-            "description": "A list of all PII entities detected and redacted.",
-            "items": {
-                "type": "object",
-                "properties": {
-                    "span": {
-                        "type": "string",
-                        "description": "The original text value that was detected."
-                    },
-                    "type": {
-                        "type": "string",
-                        "enum": ["PERSON", "EMAIL", "PHONE", "CREDIT_CARD", "NATIONAL_ID", "ADDRESS", "OTHER"],
-                        "description": "The category of the detected PII"
-                    },
-                    "action": {
-                        "type": "string",
-                        "enum": ["MASKED", "REDACTED"],
-                        "description": "The action taken on the data"
-                    }
-                },
-                "required": ["span", "type", "action"],
-                "additionalProperties": false
-            }
-        },
-        "sanitizedText": {
+  "type": "object",
+  "properties": {
+    "detectedPII": {
+      "type": "array",
+      "description": "A list of all PII entities detected and redacted.",
+      "items": {
+        "type": "object",
+        "properties": {
+          "span": {
+            "type": "string",
+            "description": "The original text value that was detected."
+          },
+          "type": {
+            "type": "string",
+            "enum": [
+              "PERSON",
+              "EMAIL",
+              "PHONE",
+              "CREDIT_CARD",
+              "NATIONAL_ID",
+              "ADDRESS",
+              "OTHER"
+            ],
+            "description": "The category of the detected PII"
+          },
+          "action": {
             "type": "string",
-            "description": "The input text with all PII replaced by placeholders"
-        }
+            "enum": [
+              "MASKED",
+              "REDACTED"
+            ],
+            "description": "The action taken on the data"
+          }
+        },
+        "required": [
+          "span",
+          "type",
+          "action"
+        ],
+        "additionalProperties": false
+      }
     },
-    "required": ["detectedPII", "sanitizedText"],
-    "additionalProperties": false
+    "sanitizedText": {
+      "type": "string",
+      "description": "The input text with all PII replaced by placeholders"
+    }
+  },
+  "required": [
+    "detectedPII",
+    "sanitizedText"
+  ],
+  "additionalProperties": false
 }
 ```
 
@@ -155,11 +184,9 @@ Configure the component in `application.properties`:
 
 ```properties
 camel.jbang.dependencies=camel-openai
-
 camel.component.openai.apiKey={{env:OPENAI_API_KEY}}
 camel.component.openai.baseUrl={{env:OPENAI_BASE_URL}}
 camel.component.openai.model={{env:OPENAI_MODEL}}
-
 camel.main.durationMaxMessages=1
 ```
 
@@ -201,17 +228,22 @@ echo 'Customer John Doe (email: [email protected]) requested a refund for ord
 
 ## Conclusion
 
-The `camel-openai` component in Apache Camel 4.17 provides a powerful and flexible way to integrate LLM capabilities into your integration flows. With support for structured output, streaming, conversation memory, and compatibility with various OpenAI-compatible providers, you can build sophisticated AI-powered integrations with minimal code.
+The `camel-openai` component in Apache Camel 4.17 provides a powerful and flexible way to integrate LLM capabilities
+into your integration flows. With support for structured output, streaming, conversation memory, and compatibility with
+various OpenAI-compatible providers, you can build sophisticated AI-powered integrations with minimal code.
 
-The PII redaction example demonstrates how structured output with JSON schemas enables reliable, parseable responses from LLMs - a critical capability for enterprise applications where consistent data formats are required.
+The PII redaction example demonstrates how structured output with JSON schemas enables reliable, parseable responses
+from LLMs - a critical capability for enterprise applications where consistent data formats are required.
 
 For more information, check out:
-- [Apache Camel OpenAI Component Documentation](https://camel.apache.org/components/next/openai-component.html)
+
+- [Apache Camel OpenAI Component Documentation](/components/next/openai-component.html)
 - [Camel OpenAI examples](https://github.com/apache/camel-jbang-examples/openai)
 - [Apache Camel 4.17 Release Notes](/releases/release-4.17.0/)
 - [Apache Camel 4.17 What's New](/blog/2026/01/camel417-whatsnew/)
 
-We'd love to hear about what you build with the OpenAI component. Share your experiences on the Apache Camel mailing list or join us on Zulip chat!
+We'd love to hear about what you build with the OpenAI component. Share your experiences on the Apache Camel mailing
+list or join us on Zulip chat!
 
 Happy integrating!
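The structured-output contract in the patched blog post is easy to sanity-check offline. Below is a minimal sketch in plain Python (standard library only; the helper name and sample data are invented for illustration and are not part of the blog post or the Camel component) that enforces the same shape `pii.schema.json` declares: exact key sets, enum membership, and string types.

```python
import json

# Allowed values, mirroring the "enum" lists in pii.schema.json.
PII_TYPES = {"PERSON", "EMAIL", "PHONE", "CREDIT_CARD", "NATIONAL_ID", "ADDRESS", "OTHER"}
PII_ACTIONS = {"MASKED", "REDACTED"}

def check_pii_response(raw: str) -> dict:
    """Parse a structured-output response and assert it matches the schema's shape.

    Hypothetical helper for illustration only; in the blog's setup the schema is
    enforced server-side by the inference provider via structured output.
    """
    doc = json.loads(raw)
    # "additionalProperties": false plus "required" means the key set must match exactly.
    assert set(doc) == {"detectedPII", "sanitizedText"}, "unexpected top-level keys"
    assert isinstance(doc["sanitizedText"], str), "sanitizedText must be a string"
    for entity in doc["detectedPII"]:
        assert set(entity) == {"span", "type", "action"}, "unexpected entity keys"
        assert isinstance(entity["span"], str), "span must be a string"
        assert entity["type"] in PII_TYPES, f"unknown PII type: {entity['type']}"
        assert entity["action"] in PII_ACTIONS, f"unknown action: {entity['action']}"
    return doc

# Invented sample shaped like the response the route would produce.
sample = json.dumps({
    "detectedPII": [
        {"span": "John Doe", "type": "PERSON", "action": "REDACTED"},
        {"span": "john.doe@example.com", "type": "EMAIL", "action": "MASKED"},
    ],
    "sanitizedText": "Customer [PERSON] (email: [EMAIL]) requested a refund.",
})
result = check_pii_response(sample)
print(len(result["detectedPII"]))  # → 2
```

For production use, a full JSON Schema validator loaded with the actual `pii.schema.json` file would replace these hand-written assertions; the sketch only illustrates what the schema's `required`, `enum`, and `additionalProperties` keywords guarantee about the response.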
