This is an automated email from the ASF dual-hosted git repository.

kxiao pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris-website.git


The following commit(s) were added to refs/heads/master by this push:
     new 99c52fe7f89 [doc](observability) add Langfuse document (#3239)
99c52fe7f89 is described below

commit 99c52fe7f890a0b5d7390060d5426623432d35c7
Author: bingquanzhao <[email protected]>
AuthorDate: Tue Jan 20 19:02:44 2026 +0800

    [doc](observability) add Langfuse document (#3239)
---
 docs/ecosystem/observability/langfuse.md           | 331 ++++++++++++++++++++
 .../current/ecosystem/observability/langfuse.md    | 333 +++++++++++++++++++++
 .../ecosystem/observability/langfuse.md            | 333 +++++++++++++++++++++
 .../ecosystem/observability/langfuse.md            | 333 +++++++++++++++++++++
 .../ecosystem/observability/langfuse.md            | 333 +++++++++++++++++++++
 sidebars.ts                                        |   2 +
 static/images/ecomsystem/langfuse/langfuse_1.png   | Bin 0 -> 433896 bytes
 static/images/ecomsystem/langfuse/langfuse_2.png   | Bin 0 -> 382872 bytes
 static/images/ecomsystem/langfuse/langfuse_3.png   | Bin 0 -> 430704 bytes
 .../ecosystem/observability/langfuse.md            | 331 ++++++++++++++++++++
 .../ecosystem/observability/langfuse.md            | 331 ++++++++++++++++++++
 .../ecosystem/observability/langfuse.md            | 331 ++++++++++++++++++++
 versioned_sidebars/version-2.1-sidebars.json       |   2 +
 versioned_sidebars/version-3.x-sidebars.json       |   2 +
 versioned_sidebars/version-4.x-sidebars.json       |   2 +
 15 files changed, 2664 insertions(+)

diff --git a/docs/ecosystem/observability/langfuse.md b/docs/ecosystem/observability/langfuse.md
new file mode 100644
index 00000000000..3f010c0f067
--- /dev/null
+++ b/docs/ecosystem/observability/langfuse.md
@@ -0,0 +1,331 @@
+---
+{
+    "title": "Langfuse on Doris",
+    "language": "en"
+}
+---
+
+# Langfuse on Doris
+
+## About Langfuse
+
+Langfuse is an open-source LLM engineering platform that provides comprehensive observability solutions for large language model applications. It offers the following core features:
+
+- **Tracing**: Complete recording of LLM application call chains and execution flows
+- **Evaluation**: Multi-dimensional model performance evaluation and quality analysis
+- **Prompt Management**: Centralized management and version control of prompt templates
+- **Metrics Monitoring**: Real-time monitoring of application performance, cost, and quality metrics
+
+This document provides detailed instructions on how to deploy a Langfuse solution using Apache Doris as the analytics backend, fully leveraging Doris's powerful OLAP analytics capabilities to process large-scale LLM application data.
+
+
+## System Architecture
+
+The Langfuse on Doris solution uses a microservices architecture with the following core components:
+
+| Component       | Port(s)    | Description |
+|-----------------|------------|-------------|
+| Langfuse Web    | 3000       | Web interface and API service for user interaction and data ingestion |
+| Langfuse Worker | 3030       | Asynchronous task processing for data processing and analytics tasks |
+| PostgreSQL      | 5432       | Transactional data storage for user configuration and metadata |
+| Redis           | 6379       | Cache layer and message queue for improved system response performance |
+| MinIO           | 9090       | Object storage service for raw events and multi-modal attachments |
+| Doris FE        | 9030, 8030 | Doris Frontend, part of the Doris architecture, responsible for receiving user requests, query parsing and planning, metadata management, and node management |
+| Doris BE        | 8040, 8050 | Doris Backend, part of the Doris architecture, responsible for data storage and query plan execution. Data is split into shards and stored with multiple replicas on BE nodes |
+
+:::note
+
+When deploying Apache Doris, you can choose between the integrated compute-storage architecture and the disaggregated compute-storage architecture based on your hardware environment and business requirements.
+For Langfuse, running Doris in Docker is not recommended in production; the FE and BE containers included in the sample compose file are intended only for quickly trying out the Langfuse on Doris capabilities.
+
+:::
+
+```mermaid
+flowchart TB
+    User["UI, API, SDKs"]
+    subgraph vpc["VPC"]
+        Web["Web Server<br/>(langfuse/langfuse)"]
+        Worker["Async Worker<br/>(langfuse/worker)"]
+        Postgres["Postgres - OLTP<br/>(Transactional Data)"]
+        Cache["Redis/Valkey<br/>(Cache, Queue)"]
+        Doris["Doris - OLAP<br/>(Observability Data)"]
+        S3["S3 / Blob Storage<br/>(Raw events, multi-modal attachments)"]
+    end
+    LLM["LLM API/Gateway<br/>(optional)"]
+
+    User --> Web
+    Web --> S3
+    Web --> Postgres
+    Web --> Cache
+    Web --> Doris
+    Web -.->|"optional for playground"| LLM
+
+    Cache --> Worker
+    Worker --> Doris
+    Worker --> Postgres
+    Worker --> S3
+    Worker -.->|"optional for evals"| LLM
+```
+
+## Deployment Requirements
+
+### Software Environment
+
+| Component | Version | Description |
+|------|----------|------|
+| Docker | 20.0+ | Container runtime environment |
+| Docker Compose | 2.0+ | Container orchestration tool |
+| Apache Doris | 2.1.10+ | Analytics database, requires separate deployment |
+
+### Hardware Resources
+
+| Resource Type | Minimum | Recommended | Description |
+|----------|----------|----------|------|
+| Memory | 8GB | 16GB+ | Supports multi-service concurrent operation |
+| Disk | 50GB | 100GB+ | Storage for container data and logs |
+| Network | 1Gbps | 10Gbps | Ensures data transfer performance |
+
+### Prerequisites
+
+1. **Doris Cluster Preparation**
+    - Ensure the Doris cluster is running properly with stable performance
+    - Verify that the FE HTTP port (default 8030) and query port (default 9030) are network accessible
+    - Langfuse will automatically create the required database and table structures in Doris after startup
+
+2. **Network Connectivity**
+    - The deployment environment can access Docker Hub to pull images
+    - Langfuse services can access the relevant ports of the Doris cluster
+    - Clients can access the Langfuse Web service port
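The prerequisite port checks above can be scripted before starting the stack. The sketch below is an illustrative helper, not part of Langfuse or Doris tooling; the host `127.0.0.1` and the default ports 8030/9030 are assumptions to replace with your own FE address.

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical host; replace with your Doris FE address.
for port in (8030, 9030):  # FE HTTP port and query port
    status = "reachable" if port_open("127.0.0.1", port) else "NOT reachable"
    print(f"FE port {port}: {status}")
```

With the stack from this guide running, both ports should report reachable once the FE container is healthy.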
+
+:::tip Deployment Recommendation
+Use Docker to deploy the Langfuse service components (Web, Worker, Redis, PostgreSQL), but deploy Doris separately for better performance and stability. Refer to the official documentation for a detailed Doris deployment guide.
+:::
+
+## Configuration Parameters
+
+Langfuse services require multiple environment variables to support the proper operation of each component:
+
+### Doris Analytics Backend Configuration
+
+| Parameter | Example Value | Description |
+|---------|--------|------|
+| `LANGFUSE_ANALYTICS_BACKEND` | `doris` | Specifies Doris as the analytics backend |
+| `DORIS_FE_HTTP_URL` | `http://localhost:8030` | Doris FE HTTP service address |
+| `DORIS_FE_QUERY_PORT` | `9030` | Doris FE query port |
+| `DORIS_DB` | `langfuse` | Doris database name |
+| `DORIS_USER` | `root` | Doris username |
+| `DORIS_PASSWORD` | `123456` | Doris password |
+| `DORIS_MAX_OPEN_CONNECTIONS` | `100` | Maximum number of database connections |
+| `DORIS_REQUEST_TIMEOUT_MS` | `300000` | Request timeout in milliseconds |
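To illustrate how these variables fit together, the sketch below reads them the way a deployment script might, falling back to the example values from the table. The derived MySQL-style DSN relies on the FE query port speaking the MySQL protocol; `doris_cfg` and `dsn` are illustrative names, not Langfuse configuration.

```python
import os
from urllib.parse import urlparse

# Read the Doris backend settings, falling back to the example values above.
doris_cfg = {
    "backend": os.environ.get("LANGFUSE_ANALYTICS_BACKEND", "doris"),
    "fe_http_url": os.environ.get("DORIS_FE_HTTP_URL", "http://localhost:8030"),
    "fe_query_port": int(os.environ.get("DORIS_FE_QUERY_PORT", "9030")),
    "database": os.environ.get("DORIS_DB", "langfuse"),
    "user": os.environ.get("DORIS_USER", "root"),
    "timeout_ms": int(os.environ.get("DORIS_REQUEST_TIMEOUT_MS", "300000")),
}

# The FE query port speaks the MySQL protocol, so a MySQL-style DSN can be
# derived from the same settings (host taken from the FE HTTP URL).
host = urlparse(doris_cfg["fe_http_url"]).hostname
dsn = f"mysql://{doris_cfg['user']}@{host}:{doris_cfg['fe_query_port']}/{doris_cfg['database']}"
print(dsn)
```

With the example values this yields `mysql://root@localhost:9030/langfuse`, which any MySQL-compatible client can use to inspect the database Langfuse creates.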
+
+### Basic Service Configuration
+
+| Parameter | Example Value | Description |
+|---------|--------|------|
+| `DATABASE_URL` | `postgresql://postgres:postgres@langfuse-postgres:5432/postgres` | PostgreSQL database connection URL |
+| `NEXTAUTH_SECRET` | `your-debug-secret-key-here-must-be-long-enough` | NextAuth authentication key for session encryption |
+| `SALT` | `your-super-secret-salt-with-at-least-32-characters-for-encryption` | Data encryption salt (at least 32 characters) |
+| `ENCRYPTION_KEY` | `0000000000000000000000000000000000000000000000000000000000000000` | Data encryption key (64 characters) |
+| `NEXTAUTH_URL` | `http://localhost:3000` | Langfuse Web service address |
+| `TZ` | `UTC` | System timezone setting |
+
+### Redis Cache Configuration
+
+| Parameter | Example Value              | Description |
+|---------|------------------|------|
+| `REDIS_HOST` | `langfuse-redis` | Redis service host address |
+| `REDIS_PORT` | `6379`           | Redis service port |
+| `REDIS_AUTH` | `myredissecret`  | Redis authentication password |
+| `REDIS_TLS_ENABLED` | `false`          | Whether to enable TLS encryption |
+| `REDIS_TLS_CA` | `-`              | TLS CA certificate path |
+| `REDIS_TLS_CERT` | `-`              | TLS client certificate path |
+| `REDIS_TLS_KEY` | `-`              | TLS private key path |
+
+### Data Migration Configuration
+
+| Parameter | Example Value | Description |
+|---------|--------|------|
+| `LANGFUSE_ENABLE_BACKGROUND_MIGRATIONS` | `false` | Disables background migrations (must be `false` when using Doris) |
+| `LANGFUSE_AUTO_DORIS_MIGRATION_DISABLED` | `false` | Leaving this `false` keeps Doris auto-migration enabled |
+
+
+## Docker Compose Deployment
+
+### Pre-deployment Preparation
+
+Here we provide a compose example that can be started directly. Modify the configuration according to your requirements.
+
+### Download the Docker Compose package
+
+```shell
+wget https://apache-doris-releases.oss-cn-beijing.aliyuncs.com/extension/docker-langfuse-doris.tar.gz
+tar -xzf docker-langfuse-doris.tar.gz
+cd docker-langfuse-doris
+```
+
+After extraction, the directory structure of the compose file and configuration file is as follows:
+
+```text
+docker-langfuse-doris
+├── docker-compose.yml
+└── doris-config
+    └── fe_custom.conf
+```
+
+### Deployment Steps
+
+#### 1. Start the compose stack
+
+```Bash
+docker compose up -d
+```
+
+```Bash
+# Check
+$ docker compose up -d
+[+] Running 9/9
+ ✔ Network docker-langfuse-doris_doris_internal  Created    0.1s
+ ✔ Network docker-langfuse-doris_default         Created    0.1s
+ ✔ Container doris_fe                            Healthy   13.8s
+ ✔ Container langfuse-postgres                   Healthy   13.8s
+ ✔ Container langfuse-redis                      Healthy   13.8s
+ ✔ Container langfuse-minio                      Healthy   13.8s
+ ✔ Container doris_be                            Healthy   54.3s
+ ✔ Container langfuse-worker                     Started   54.8s
+ ✔ Container langfuse-web                        Started
+```
+
+#### 2. Verify Deployment
+
+Check service status:
+
+When all service statuses show as Healthy, the compose stack has started successfully.
+
+```Bash
+$ docker compose ps
+NAME                IMAGE                             COMMAND                  SERVICE           CREATED         STATUS                        PORTS
+doris_be            apache/doris:be-2.1.11            "bash entry_point.sh"    doris_be          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8040->8040/tcp, :::8040->8040/tcp, 0.0.0.0:8060->8060/tcp, :::8060->8060/tcp, 0.0.0.0:9050->9050/tcp, :::9050->9050/tcp, 0.0.0.0:9060->9060/tcp, :::9060->9060/tcp
+doris_fe            apache/doris:fe-2.1.11            "bash init_fe.sh"        doris_fe          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8030->8030/tcp, :::8030->8030/tcp, 0.0.0.0:9010->9010/tcp, :::9010->9010/tcp, 0.0.0.0:9030->9030/tcp, :::9030->9030/tcp
+langfuse-minio      minio/minio                       "sh -c 'mkdir -p /da…"   minio             2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:19090->9000/tcp, :::19090->9000/tcp, 127.0.0.1:19091->9001/tcp
+langfuse-postgres   postgres:latest                   "docker-entrypoint.s…"   postgres          2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:5432->5432/tcp
+langfuse-redis      redis:7                           "docker-entrypoint.s…"   redis             2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:16379->6379/tcp
+langfuse-web        selectdb/langfuse-web:latest      "dumb-init -- ./web/…"   langfuse-web      2 minutes ago   Up About a minute (healthy)   0.0.0.0:13000->3000/tcp, :::13000->3000/tcp
+langfuse-worker     selectdb/langfuse-worker:latest   "dumb-init -- ./work…"   langfuse-worker   2 minutes ago   Up About a minute (healthy)   0.0.0.0:3030->3030/tcp, :::3030->3030/tcp
+```
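In addition to `docker compose ps`, you can poll the web service over HTTP. The sketch below assumes the health endpoint path `/api/public/health` exposed by the Langfuse web server and the host port 13000 from the sample output above; adjust both if your port mapping differs.

```python
import urllib.request
import urllib.error

def check_health(base_url: str, timeout: float = 5.0) -> bool:
    """Return True if the Langfuse health endpoint responds with HTTP 200."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/public/health", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

# Host port 13000 maps to the web container's port 3000 in the sample compose file.
if check_health("http://localhost:13000"):
    print("Langfuse web is healthy")
else:
    print("Langfuse web is not reachable yet")
```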
+
+
+#### 3. Service Initialization
+
+After deployment is complete, access and initialize the service as follows:
+
+**Access Langfuse Web Interface**:
+- URL: http://localhost:3000 (with the sample compose file, the web service is published on host port 13000, mapped to container port 3000)
+
+**Initialization Steps**:
+1. Open your browser and navigate to http://localhost:3000
+2. Create an administrator account and log in
+3. Create a new organization and project
+4. Obtain the project's API Keys (Public Key and Secret Key)
+5. Configure the authentication information required for SDK integration
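Once you have the API keys, requests to the Langfuse public API are authenticated with HTTP Basic auth, using the Public Key as the username and the Secret Key as the password (the SDKs handle this automatically). A minimal sketch with placeholder keys:

```python
import base64

def build_auth_header(public_key: str, secret_key: str) -> str:
    """Build the HTTP Basic Authorization header used by the Langfuse public API."""
    token = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
    return f"Basic {token}"

# Placeholder keys; use the values from your project settings page.
header = build_auth_header("pk-lf-******", "sk-lf-******")
print(header)
```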
+
+
+## Examples
+
+### Using the Langfuse SDK
+
+```Python
+import os
+# Instead of: import openai
+from langfuse.openai import OpenAI
+# from langfuse import observe
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+# Use the OpenAI client (reads OPENAI_API_KEY from the environment)
+client = OpenAI()
+
+# Ask a question
+question = "What are the key features of the Doris observability solution? Please answer concisely."
+print(f"question: {question}")
+
+completion = client.chat.completions.create(
+    model="gpt-4o",
+    messages=[
+        {"role": "user", "content": question}
+    ]
+)
+response = completion.choices[0].message.content
+print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+### Using the LangChain SDK
+
+```Python
+import os
+from langfuse.langchain import CallbackHandler
+from langchain_openai import ChatOpenAI
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+# Create your LangChain components (using the OpenAI API)
+llm = ChatOpenAI(
+    model="gpt-4o"
+)
+
+# Ask a question
+question = "What are the key features of the Doris observability solution? Please answer concisely."
+print(f"question: {question} \n")
+
+# Run your chain with Langfuse tracing
+try:
+    # Initialize the Langfuse handler
+    langfuse_handler = CallbackHandler()
+    response = llm.invoke(question, config={"callbacks": [langfuse_handler]})
+    print(f"response: {response.content}")
+except Exception as e:
+    print(f"Error during chain execution: {e}")
+```
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+### Using the LlamaIndex SDK
+
+```Python
+import os
+from langfuse import get_client
+from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
+from llama_index.llms.openai import OpenAI
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+langfuse = get_client()
+
+# Initialize LlamaIndex instrumentation
+LlamaIndexInstrumentor().instrument()
+
+# Set up the OpenAI class with the required model
+llm = OpenAI(model="gpt-4o")
+
+# Ask a question
+question = "What are the key features of the Doris observability solution? Please answer concisely."
+print(f"question: {question} \n")
+
+with langfuse.start_as_current_span(name="llama-index-trace"):
+    response = llm.complete(question)
+    print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_3.png)
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/observability/langfuse.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/observability/langfuse.md
new file mode 100644
index 00000000000..92f445036c3
--- /dev/null
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/observability/langfuse.md
@@ -0,0 +1,333 @@
+---
+{
+    "title": "Langfuse on Doris",
+    "language": "zh-CN"
+}
+---
+
+# Langfuse on Doris
+
+## 关于 Langfuse
+
+Langfuse 是一个开源的 LLM 工程平台,专门为大语言模型应用提供全面的可观测性解决方案。它主要提供以下核心功能:
+
+- **链路追踪**:完整记录 LLM 应用的调用链路和执行流程
+- **性能评估**:提供多维度的模型性能评估和质量分析
+- **提示管理**:集中管理和版本控制提示词模板
+- **指标监控**:实时监控应用性能、成本和质量指标
+
+本文档将详细介绍如何部署基于 Apache Doris 作为分析后端的 Langfuse 解决方案,充分利用 Doris 强大的 OLAP 分析能力来处理大规模的 LLM 应用数据。
+
+
+## 系统架构
+
+Langfuse on Doris 解决方案采用微服务架构,包含以下核心组件:
+
+| 组件            | 端口       | 功能说明 |
+|-----------------|-----------|---------|
+| Langfuse Web    | 3000      | Web 界面和 API 服务,提供用户交互和数据接入 |
+| Langfuse Worker | 3030      | 异步任务处理,负责数据处理和分析任务 |
+| PostgreSQL      | 5432      | 事务性数据存储,保存用户配置和元数据 |
+| Redis           | 6379      | 缓存层和消息队列,提升系统响应性能 |
+| MinIO           | 9090      | 对象存储服务,存储原始事件和多模态附件 |
+| Doris FE        | 9030、8030 | Doris Frontend,Doris 架构的一部分,主要负责接收用户请求、查询解析和规划、元数据管理以及节点管理 |
+| Doris BE        | 8040、8050 | Doris Backend,Doris 架构的一部分,主要负责数据存储和查询计划的执行。数据会被切分成数据分片(Shard),在 BE 中以多副本方式存储 |
+
+:::note
+
+在部署 Apache Doris 时,可以根据硬件环境与业务需求选择存算一体架构或存算分离架构。
+在 Langfuse 部署中,生产环境不建议使用 Docker Doris;Docker 中自带的 FE、BE 组件仅为方便用户快速体验 Langfuse on Doris 的能力。
+
+:::
+
+```mermaid
+flowchart TB
+    User["UI, API, SDKs"]
+    subgraph vpc["VPC"]
+        Web["Web Server<br/>(langfuse/langfuse)"]
+        Worker["Async Worker<br/>(langfuse/worker)"]
+        Postgres["Postgres - OLTP<br/>(Transactional Data)"]
+        Cache["Redis/Valkey<br/>(Cache, Queue)"]
+        Doris["Doris - OLAP<br/>(Observability Data)"]
+        S3["S3 / Blob Storage<br/>(Raw events, multi-modal attachments)"]
+    end
+    LLM["LLM API/Gateway<br/>(optional)"]
+
+    User --> Web
+    Web --> S3
+    Web --> Postgres
+    Web --> Cache
+    Web --> Doris
+    Web -.->|"optional for playground"| LLM
+
+    Cache --> Worker
+    Worker --> Doris
+    Worker --> Postgres
+    Worker --> S3
+    Worker -.->|"optional for evals"| LLM
+```
+
+## 部署要求
+
+### 软件环境
+
+| 组件 | 版本要求 | 说明 |
+|------|----------|------|
+| Docker | 20.0+ | 容器运行环境 |
+| Docker Compose | 2.0+ | 容器编排工具 |
+| Apache Doris | 2.1.10+ | 分析数据库,需独立部署 |
+
+### 硬件资源
+
+| 资源类型 | 最低要求 | 推荐配置 | 说明 |
+|----------|----------|----------|------|
+| 内存 | 8GB | 16GB+ | 支持多服务并发运行 |
+| 磁盘 | 50GB | 100GB+ | 存储容器数据和日志 |
+| 网络 | 1Gbps | 10Gbps | 确保数据传输性能 |
+
+### 前置条件
+
+1. **Doris 集群准备**
+    - 确保 Doris 集群正常运行且性能稳定
+    - 验证 FE HTTP 端口(默认 8030)和查询端口(默认 9030)网络可达
+    - Langfuse 启动后将自动在 Doris 中创建所需的数据库和表结构
+
+2. **网络连通性**
+    - 部署环境能够访问 Docker Hub 拉取镜像
+    - Langfuse 服务能够访问 Doris 集群的相关端口
+    - 客户端能够访问 Langfuse Web 服务端口
+
+:::tip 部署建议
+推荐使用 Docker 部署 Langfuse 服务组件(Web、Worker、Redis、PostgreSQL),但 Doris 建议独立部署以获得更好的性能和稳定性。详细的 Doris 部署指南请参考官方文档。
+:::
+
+## 配置参数
+
+Langfuse 服务需要配置多个环境变量来支持各个组件的正常运行:
+
+### Doris 分析后端配置
+
+| 参数名称 | 示例值 | 说明 |
+|---------|--------|------|
+| `LANGFUSE_ANALYTICS_BACKEND` | `doris` | 指定使用 Doris 作为分析后端 |
+| `DORIS_FE_HTTP_URL` | `http://localhost:8030` | Doris FE HTTP 服务地址 |
+| `DORIS_FE_QUERY_PORT` | `9030` | Doris FE 查询端口 |
+| `DORIS_DB` | `langfuse` | Doris 数据库名称 |
+| `DORIS_USER` | `root` | Doris 用户名 |
+| `DORIS_PASSWORD` | `123456` | Doris 密码 |
+| `DORIS_MAX_OPEN_CONNECTIONS` | `100` | 最大数据库连接数 |
+| `DORIS_REQUEST_TIMEOUT_MS` | `300000` | 请求超时时间(毫秒) |
+
+### 基础服务配置
+
+| 参数名称 | 示例值 | 说明 |
+|---------|--------|------|
+| `DATABASE_URL` | `postgresql://postgres:postgres@langfuse-postgres:5432/postgres` | PostgreSQL 数据库连接地址 |
+| `NEXTAUTH_SECRET` | `your-debug-secret-key-here-must-be-long-enough` | NextAuth 认证密钥,用于会话加密 |
+| `SALT` | `your-super-secret-salt-with-at-least-32-characters-for-encryption` | 数据加密盐值(至少 32 字符) |
+| `ENCRYPTION_KEY` | `0000000000000000000000000000000000000000000000000000000000000000` | 数据加密密钥(64 字符) |
+| `NEXTAUTH_URL` | `http://localhost:3000` | Langfuse Web 服务地址 |
+| `TZ` | `UTC` | 系统时区设置 |
+
+### Redis 缓存配置
+
+| 参数名称 | 示例值              | 说明 |
+|---------|------------------|------|
+| `REDIS_HOST` | `langfuse-redis` | Redis 服务主机地址 |
+| `REDIS_PORT` | `6379`           | Redis 服务端口 |
+| `REDIS_AUTH` | `myredissecret`  | Redis 认证密码 |
+| `REDIS_TLS_ENABLED` | `false`          | 是否启用 TLS 加密 |
+| `REDIS_TLS_CA` | `-`              | TLS CA 证书路径 |
+| `REDIS_TLS_CERT` | `-`              | TLS 客户端证书路径 |
+| `REDIS_TLS_KEY` | `-`              | TLS 私钥路径 |
+
+### 数据迁移配置
+
+| 参数名称 | 示例值 | 说明 |
+|---------|--------|------|
+| `LANGFUSE_ENABLE_BACKGROUND_MIGRATIONS` | `false` | 禁用后台迁移(使用 Doris 时必须关闭) |
+| `LANGFUSE_AUTO_DORIS_MIGRATION_DISABLED` | `false` | 保持为 `false` 以启用 Doris 自动迁移 |
+
+
+## Docker Compose 部署
+
+### 启动前准备
+
+这里我们提供一个可以直接启动的 compose 示例,请根据需求修改配置。
+
+### 下载 Docker Compose 部署包
+
+```shell
+wget https://apache-doris-releases.oss-cn-beijing.aliyuncs.com/extension/docker-langfuse-doris.tar.gz
+tar -xzf docker-langfuse-doris.tar.gz
+cd docker-langfuse-doris
+```
+
+解压后,compose 文件与配置文件的目录结构如下:
+
+```text
+docker-langfuse-doris
+├── docker-compose.yml
+└── doris-config
+    └── fe_custom.conf
+```
+
+### 部署步骤
+
+#### 1. 启动 compose
+
+```Bash
+docker compose up -d
+```
+
+```Bash
+# 检查
+$ docker compose up -d
+[+] Running 9/9
+ ✔ Network docker-langfuse-doris_doris_internal  Created    0.1s
+ ✔ Network docker-langfuse-doris_default         Created    0.1s
+ ✔ Container doris_fe                            Healthy   13.8s
+ ✔ Container langfuse-postgres                   Healthy   13.8s
+ ✔ Container langfuse-redis                      Healthy   13.8s
+ ✔ Container langfuse-minio                      Healthy   13.8s
+ ✔ Container doris_be                            Healthy   54.3s
+ ✔ Container langfuse-worker                     Started   54.8s
+ ✔ Container langfuse-web                        Started
+```
+
+#### 2. 验证部署
+
+检查服务状态:
+
+当所有服务状态都为 Healthy 时,说明 compose 启动成功。
+
+```Bash
+$ docker compose ps
+NAME                IMAGE                             COMMAND                  SERVICE           CREATED         STATUS                        PORTS
+doris_be            apache/doris:be-2.1.11            "bash entry_point.sh"    doris_be          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8040->8040/tcp, :::8040->8040/tcp, 0.0.0.0:8060->8060/tcp, :::8060->8060/tcp, 0.0.0.0:9050->9050/tcp, :::9050->9050/tcp, 0.0.0.0:9060->9060/tcp, :::9060->9060/tcp
+doris_fe            apache/doris:fe-2.1.11            "bash init_fe.sh"        doris_fe          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8030->8030/tcp, :::8030->8030/tcp, 0.0.0.0:9010->9010/tcp, :::9010->9010/tcp, 0.0.0.0:9030->9030/tcp, :::9030->9030/tcp
+langfuse-minio      minio/minio                       "sh -c 'mkdir -p /da…"   minio             2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:19090->9000/tcp, :::19090->9000/tcp, 127.0.0.1:19091->9001/tcp
+langfuse-postgres   postgres:latest                   "docker-entrypoint.s…"   postgres          2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:5432->5432/tcp
+langfuse-redis      redis:7                           "docker-entrypoint.s…"   redis             2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:16379->6379/tcp
+langfuse-web        selectdb/langfuse-web:latest      "dumb-init -- ./web/…"   langfuse-web      2 minutes ago   Up About a minute (healthy)   0.0.0.0:13000->3000/tcp, :::13000->3000/tcp
+langfuse-worker     selectdb/langfuse-worker:latest   "dumb-init -- ./work…"   langfuse-worker   2 minutes ago   Up About a minute (healthy)   0.0.0.0:3030->3030/tcp, :::3030->3030/tcp
+```
+
+
+#### 3. 服务初始化
+
+部署完成后,通过以下方式访问和初始化服务:
+
+**访问 Langfuse Web 界面**:
+- 地址:http://localhost:3000(若使用示例 compose 文件,Web 服务映射到宿主机端口 13000)
+
+**初始化步骤**:
+1. 打开浏览器访问 http://localhost:3000
+2. 创建管理员账户并登录
+3. 创建新组织与新项目
+4. 获取项目的 API Keys(Public Key 和 Secret Key)
+5. 配置 SDK 集成所需的认证信息
+
+
+## Examples
+
+### Using the Langfuse SDK
+
+```Python
+import os
+# Instead of: import openai
+from langfuse.openai import OpenAI
+# from langfuse import observe
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+# use OpenAI client to access DeepSeek API
+client = OpenAI(
+    base_url="https://api.deepseek.com"
+)
+
+# ask a question
+question = "Doris 可观测性解决方案的特点是什么?回答简洁清晰"
+print(f"question: {question}")
+
+completion = client.chat.completions.create(
+    model="deepseek-chat",
+    messages=[
+        {"role": "user", "content": question}
+    ]
+)
+response = completion.choices[0].message.content
+print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+### Using the LangChain SDK
+
+```Python
+import os
+from langfuse.langchain import CallbackHandler
+from langchain_openai import ChatOpenAI
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+# Create your LangChain components (using DeepSeek API)
+llm = ChatOpenAI(
+    model="deepseek-chat",
+    openai_api_base="https://api.deepseek.com"
+)
+
+# ask a question
+question = "Doris 可观测性解决方案的特点是什么?回答简洁清晰"
+print(f"question: {question} \n")
+
+# Run your chain with Langfuse tracing
+try:
+    # Initialize the Langfuse handler
+    langfuse_handler = CallbackHandler()
+    response = llm.invoke(question, config={"callbacks": [langfuse_handler]})
+    print(f"response: {response.content}")
+except Exception as e:
+    print(f"Error during chain execution: {e}")
+```
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+### Using the LlamaIndex SDK
+
+```Python
+import os
+from langfuse import get_client
+from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
+from llama_index.llms.deepseek import DeepSeek
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+langfuse = get_client()
+
+# Initialize LlamaIndex instrumentation
+LlamaIndexInstrumentor().instrument()
+
+# Set up the DeepSeek class with the required model and API key
+llm = DeepSeek(model="deepseek-chat")
+
+# ask a question
+question = "Doris 可观测性解决方案的特点是什么?回答简洁清晰"
+print(f"question: {question} \n")
+
+with langfuse.start_as_current_span(name="llama-index-trace"):
+    response = llm.complete(question)
+    print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_3.png)
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/observability/langfuse.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/observability/langfuse.md
new file mode 100644
index 00000000000..92f445036c3
--- /dev/null
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/observability/langfuse.md
@@ -0,0 +1,333 @@
+---
+{
+    "title": "Langfuse on Doris",
+    "language": "zh-CN"
+}
+---
+
+# Langfuse on Doris
+
+## 关于 Langfuse
+
+Langfuse 是一个开源的 LLM 工程平台,专门为大语言模型应用提供全面的可观测性解决方案。它主要提供以下核心功能:
+
+- **链路追踪**:完整记录 LLM 应用的调用链路和执行流程
+- **性能评估**:提供多维度的模型性能评估和质量分析
+- **提示管理**:集中管理和版本控制提示词模板
+- **指标监控**:实时监控应用性能、成本和质量指标
+
+本文档将详细介绍如何部署基于 Apache Doris 作为分析后端的 Langfuse 解决方案,充分利用 Doris 强大的 OLAP 分析能力来处理大规模的 LLM 应用数据。
+
+
+## 系统架构
+
+Langfuse on Doris 解决方案采用微服务架构,包含以下核心组件:
+
+| 组件            | 端口       | 功能说明 |
+|-----------------|-----------|---------|
+| Langfuse Web    | 3000      | Web 界面和 API 服务,提供用户交互和数据接入 |
+| Langfuse Worker | 3030      | 异步任务处理,负责数据处理和分析任务 |
+| PostgreSQL      | 5432      | 事务性数据存储,保存用户配置和元数据 |
+| Redis           | 6379      | 缓存层和消息队列,提升系统响应性能 |
+| MinIO           | 9090      | 对象存储服务,存储原始事件和多模态附件 |
+| Doris FE        | 9030、8030 | Doris Frontend,Doris 架构的一部分,主要负责接收用户请求、查询解析和规划、元数据管理以及节点管理 |
+| Doris BE        | 8040、8050 | Doris Backend,Doris 架构的一部分,主要负责数据存储和查询计划的执行。数据会被切分成数据分片(Shard),在 BE 中以多副本方式存储 |
+
+::: note
+
+在部署 Apache Doris 时,可以根据硬件环境与业务需求选择存算一体架构或存算分离架构。
+在 Langfuse 部署中,生产环境不建议使用 Docker Doris,Docker 中带有的 Fe,Be 部分为了方便用户快速体验Langfuse 
on Doris 的能力
+
+:::
+
+```mermaid
+flowchart TB
+    User["UI, API, SDKs"]
+    subgraph vpc["VPC"]
+        Web["Web Server<br/>(langfuse/langfuse)"]
+        Worker["Async Worker<br/>(langfuse/worker)"]
+        Postgres["Postgres - OLTP<br/>(Transactional Data)"]
+        Cache["Redis/Valkey<br/>(Cache, Queue)"]
+        Doris["Doris - OLAP<br/>(Observability Data)"]
+        S3["S3 / Blob Storage<br/>(Raw events, multi-modal attachments)"]
+    end
+    LLM["LLM API/Gateway<br/>(optional)"]
+
+    User --> Web
+    Web --> S3
+    Web --> Postgres
+    Web --> Cache
+    Web --> Doris
+    Web -.->|"optional for playground"| LLM
+
+    Cache --> Worker
+    Worker --> Doris
+    Worker --> Postgres
+    Worker --> S3
+    Worker -.->|"optional for evals"| LLM
+```
+
+## Deployment Requirements
+
+### Software
+
+| Component | Version | Notes |
+|------|----------|------|
+| Docker | 20.0+ | Container runtime |
+| Docker Compose | 2.0+ | Container orchestration |
+| Apache Doris | 2.1.10+ | Analytics database; deployed separately |
+
+### Hardware
+
+| Resource | Minimum | Recommended | Notes |
+|----------|----------|----------|------|
+| Memory | 8GB | 16GB+ | Runs multiple services concurrently |
+| Disk | 50GB | 100GB+ | Container data and logs |
+| Network | 1Gbps | 10Gbps | Data transfer performance |
+
+### Prerequisites
+
+1. **Doris cluster**
+    - Make sure the Doris cluster is running and stable
+    - Verify that the FE HTTP port (default 8030) and query port (default 9030) are reachable over the network
+    - On startup, Langfuse automatically creates the required database and tables in Doris
+
+2. **Network connectivity**
+    - The deployment host can reach Docker Hub to pull images
+    - The Langfuse services can reach the relevant Doris cluster ports
+    - Clients can reach the Langfuse Web service port
+
+:::tip Deployment advice
+Deploying the Langfuse service components (Web, Worker, Redis, PostgreSQL) with Docker is recommended, but Doris should be deployed separately for better performance and stability. See the official documentation for a detailed Doris deployment guide.
+:::
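
The port checks in the prerequisites can be sketched as a small Python helper. The hostnames and ports below are the documented defaults, not values confirmed for your environment; substitute your own FE address:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Check the two Doris FE ports Langfuse depends on (default ports assumed):
for name, port in [("FE HTTP", 8030), ("FE query", 9030)]:
    state = "reachable" if port_open("localhost", port) else "unreachable"
    print(f"{name} ({port}): {state}")
```

Running this from the Langfuse host before starting the stack catches firewall and addressing problems early.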
+
+## Configuration
+
+The Langfuse services are configured through a number of environment variables:
+
+### Doris analytics backend
+
+| Variable | Example | Description |
+|---------|--------|------|
+| `LANGFUSE_ANALYTICS_BACKEND` | `doris` | Use Doris as the analytics backend |
+| `DORIS_FE_HTTP_URL` | `http://localhost:8030` | Doris FE HTTP address |
+| `DORIS_FE_QUERY_PORT` | `9030` | Doris FE query port |
+| `DORIS_DB` | `langfuse` | Doris database name |
+| `DORIS_USER` | `root` | Doris username |
+| `DORIS_PASSWORD` | `123456` | Doris password |
+| `DORIS_MAX_OPEN_CONNECTIONS` | `100` | Maximum number of database connections |
+| `DORIS_REQUEST_TIMEOUT_MS` | `300000` | Request timeout in milliseconds |
+
+### Core services
+
+| Variable | Example | Description |
+|---------|--------|------|
+| `DATABASE_URL` | `postgresql://postgres:postgres@langfuse-postgres:5432/postgres` | PostgreSQL connection string |
+| `NEXTAUTH_SECRET` | `your-debug-secret-key-here-must-be-long-enough` | NextAuth secret used for session encryption |
+| `SALT` | `your-super-secret-salt-with-at-least-32-characters-for-encryption` | Encryption salt (at least 32 characters) |
+| `ENCRYPTION_KEY` | `0000000000000000000000000000000000000000000000000000000000000000` | Encryption key (64 characters) |
+| `NEXTAUTH_URL` | `http://localhost:3000` | Langfuse Web address |
+| `TZ` | `UTC` | System time zone |
+
+### Redis cache
+
+| Variable | Example | Description |
+|---------|------------------|------|
+| `REDIS_HOST` | `langfuse-redis` | Redis host |
+| `REDIS_PORT` | `6379` | Redis port |
+| `REDIS_AUTH` | `myredissecret` | Redis password |
+| `REDIS_TLS_ENABLED` | `false` | Whether to enable TLS |
+| `REDIS_TLS_CA` | `-` | TLS CA certificate path |
+| `REDIS_TLS_CERT` | `-` | TLS client certificate path |
+| `REDIS_TLS_KEY` | `-` | TLS private key path |
+
+### Data migration
+
+| Variable | Example | Description |
+|---------|--------|------|
+| `LANGFUSE_ENABLE_BACKGROUND_MIGRATIONS` | `false` | Disable background migrations (must be off when using Doris) |
+| `LANGFUSE_AUTO_DORIS_MIGRATION_DISABLED` | `false` | Keep automatic Doris migrations enabled (`false` means they run) |
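
As an illustrative sanity check (this helper is not part of Langfuse), the sketch below resolves the endpoints implied by the Doris variables above, falling back to the example defaults:

```python
import os

def doris_endpoints(env=None):
    """Resolve the Doris FE endpoints from the environment variables above."""
    env = os.environ if env is None else env
    http_url = env.get("DORIS_FE_HTTP_URL", "http://localhost:8030")
    # Host part of the HTTP URL, e.g. "localhost" from "http://localhost:8030"
    host = http_url.split("//", 1)[-1].split(":", 1)[0]
    query_port = int(env.get("DORIS_FE_QUERY_PORT", "9030"))
    db = env.get("DORIS_DB", "langfuse")
    return {
        "http": http_url,
        "mysql": f"mysql -h {host} -P {query_port} -D {db}",
    }

print(doris_endpoints({}))
```

The FE HTTP address and the MySQL-protocol query port are separate endpoints, which is why both variables must be set.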
+
+
+## Docker Compose Deployment
+
+### Before you start
+
+Here we provide a Compose example that can be started as-is; adjust the configuration to your needs.
+
+### Download the Compose bundle
+
+```shell
+wget 
https://apache-doris-releases.oss-cn-beijing.aliyuncs.com/extension/docker-langfuse-doris.tar.gz
+```
+
+The Compose file and configuration files are laid out as follows:
+
+```text
+docker-langfuse-doris
+├── docker-compose.yml
+└── doris-config
+    └── fe_custom.conf
+```
+
+### Deployment steps
+
+### 1. Start Compose
+
+```Bash
+docker compose up -d
+```
+
+```Bash
+# check
+$ docker compose up -d
+[+] Running 9/9
+ ✔ Network docker-langfuse-doris_doris_internal  Created    0.1s
+ ✔ Network docker-langfuse-doris_default         Created    0.1s
+ ✔ Container doris_fe                            Healthy   13.8s
+ ✔ Container langfuse-postgres                   Healthy   13.8s
+ ✔ Container langfuse-redis                      Healthy   13.8s
+ ✔ Container langfuse-minio                      Healthy   13.8s
+ ✔ Container doris_be                            Healthy   54.3s
+ ✔ Container langfuse-worker                     Started   54.8s
+ ✔ Container langfuse-web                        Started
+```
+
+### 2. Verify the deployment
+
+Check the service status:
+
+The Compose stack has started successfully once every service reports Healthy.
+
+```Bash
+$ docker compose ps
+NAME                IMAGE                             COMMAND                  SERVICE           CREATED         STATUS                        PORTS
+doris_be            apache/doris:be-2.1.11            "bash entry_point.sh"    doris_be          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8040->8040/tcp, :::8040->8040/tcp, 0.0.0.0:8060->8060/tcp, :::8060->8060/tcp, 0.0.0.0:9050->9050/tcp, :::9050->9050/tcp, 0.0.0.0:9060->9060/tcp, :::9060->9060/tcp
+doris_fe            apache/doris:fe-2.1.11            "bash init_fe.sh"        doris_fe          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8030->8030/tcp, :::8030->8030/tcp, 0.0.0.0:9010->9010/tcp, :::9010->9010/tcp, 0.0.0.0:9030->9030/tcp, :::9030->9030/tcp
+langfuse-minio      minio/minio                       "sh -c 'mkdir -p /da…"   minio             2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:19090->9000/tcp, :::19090->9000/tcp, 127.0.0.1:19091->9001/tcp
+langfuse-postgres   postgres:latest                   "docker-entrypoint.s…"   postgres          2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:5432->5432/tcp
+langfuse-redis      redis:7                           "docker-entrypoint.s…"   redis             2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:16379->6379/tcp
+langfuse-web        selectdb/langfuse-web:latest      "dumb-init -- ./web/…"   langfuse-web      2 minutes ago   Up About a minute (healthy)   0.0.0.0:13000->3000/tcp, :::13000->3000/tcp
+langfuse-worker     selectdb/langfuse-worker:latest   "dumb-init -- ./work…"   langfuse-worker   2 minutes ago   Up About a minute (healthy)   0.0.0.0:3030->3030/tcp, :::3030->3030/tcp
+```
+
+
+### 3. Initialize the service
+
+After deployment completes, access and initialize the service as follows:
+
+**Open the Langfuse Web UI**:
+- URL: http://localhost:3000
+
+**Initialization steps**:
+1. Open http://localhost:3000 in a browser
+2. Create an admin account and sign in
+3. Create a new organization and a new project
+4. Obtain the project's API keys (Public Key and Secret Key)
+5. Configure the credentials required for SDK integration
+
+
+# Examples
+
+## Using Langfuse SDK
+
+```Python
+import os
+# Instead of: import openai
+from langfuse.openai import OpenAI
+# from langfuse import observe
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+
+# Use the OpenAI client to access the DeepSeek API
+# (the API key is expected in the OPENAI_API_KEY environment variable)
+client = OpenAI(
+    base_url="https://api.deepseek.com"
+)
+
+
+# ask a question
+question = "What are the key features of the Doris observability solution? Answer concisely."
+print(f"question: {question}")
+
+completion = client.chat.completions.create(
+    model="deepseek-chat",
+    messages=[
+        {"role": "user", "content": question}
+    ]
+)
+response = completion.choices[0].message.content
+print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+## Using LangChain SDK
+
+```Python
+import os
+from langfuse.langchain import CallbackHandler
+from langchain_openai import ChatOpenAI
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+# Create your LangChain components (using the DeepSeek API;
+# the API key is expected in the OPENAI_API_KEY environment variable)
+llm = ChatOpenAI(
+    model="deepseek-chat",
+    openai_api_base="https://api.deepseek.com"
+)
+
+# ask a question
+question = "What are the key features of the Doris observability solution? Answer concisely."
+print(f"question: {question}\n")
+
+# Run your chain with Langfuse tracing
+try:
+    # Initialize the Langfuse handler
+    langfuse_handler = CallbackHandler()
+    response = llm.invoke(question, config={"callbacks": [langfuse_handler]})
+    print(f"response: {response.content}")
+except Exception as e:
+    print(f"Error during chain execution: {e}")
+```
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+## Using LlamaIndex SDK
+
+```Python
+import os
+
+from langfuse import get_client
+from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
+from llama_index.llms.deepseek import DeepSeek
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+langfuse = get_client()
+
+
+# Initialize LlamaIndex instrumentation
+LlamaIndexInstrumentor().instrument()
+
+
+# Set up the DeepSeek class with the required model and API key
+llm = DeepSeek(model="deepseek-chat")
+
+
+# ask a question
+question = "What are the key features of the Doris observability solution? Answer concisely."
+print(f"question: {question}\n")
+
+with langfuse.start_as_current_span(name="llama-index-trace"):
+    response = llm.complete(question)
+    print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_3.png)
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.x/ecosystem/observability/langfuse.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.x/ecosystem/observability/langfuse.md
new file mode 100644
index 00000000000..92f445036c3
--- /dev/null
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.x/ecosystem/observability/langfuse.md
@@ -0,0 +1,333 @@
+---
+{
+    "title": "Langfuse on Doris",
+    "language": "zh-CN"
+}
+---
+
+# Langfuse on Doris
+
+## About Langfuse
+
+Langfuse is an open-source LLM engineering platform that provides comprehensive observability for large language model applications. Its core capabilities are:
+
+- **Tracing**: records the full call chain and execution flow of an LLM application
+- **Evaluation**: multi-dimensional model performance evaluation and quality analysis
+- **Prompt management**: centralized management and version control of prompt templates
+- **Metrics monitoring**: real-time monitoring of application performance, cost, and quality metrics
+
+This document describes how to deploy Langfuse with Apache Doris as the analytics backend, using Doris's OLAP capabilities to process LLM application data at scale.
+
+
+## System Architecture
+
+The Langfuse on Doris solution uses a microservice architecture built from the following core components:
+
+| Component       | Port(s)    | Description                                                                      |
+|-----------------|------------|----------------------------------------------------------------------------------|
+| Langfuse Web    | 3000       | Web UI and API service; user interaction and data ingestion                      |
+| Langfuse Worker | 3030       | Asynchronous task processing; data processing and analysis jobs                  |
+| PostgreSQL      | 5432       | Transactional storage; user configuration and metadata                           |
+| Redis           | 6379       | Cache layer and message queue; improves responsiveness                           |
+| MinIO           | 9090       | Object storage; raw events and multi-modal attachments                           |
+| Doris FE        | 9030, 8030 | Doris frontend; receives user requests, parses and plans queries, manages metadata and nodes |
+| Doris BE        | 8040, 8050 | Doris backend; stores data and executes query plans. Data is split into shards and stored on BEs with multiple replicas. |
+
+:::note
+
+When deploying Apache Doris, you can choose either the integrated or the decoupled compute-storage architecture, depending on your hardware and workload.
+For Langfuse, running Doris in Docker is not recommended in production; the FE/BE containers bundled here exist only to let you quickly try out Langfuse on Doris.
+
+:::
+
+```mermaid
+flowchart TB
+    User["UI, API, SDKs"]
+    subgraph vpc["VPC"]
+        Web["Web Server<br/>(langfuse/langfuse)"]
+        Worker["Async Worker<br/>(langfuse/worker)"]
+        Postgres["Postgres - OLTP<br/>(Transactional Data)"]
+        Cache["Redis/Valkey<br/>(Cache, Queue)"]
+        Doris["Doris - OLAP<br/>(Observability Data)"]
+        S3["S3 / Blob Storage<br/>(Raw events, multi-modal attachments)"]
+    end
+    LLM["LLM API/Gateway<br/>(optional)"]
+
+    User --> Web
+    Web --> S3
+    Web --> Postgres
+    Web --> Cache
+    Web --> Doris
+    Web -.->|"optional for playground"| LLM
+
+    Cache --> Worker
+    Worker --> Doris
+    Worker --> Postgres
+    Worker --> S3
+    Worker -.->|"optional for evals"| LLM
+```
+
+## Deployment Requirements
+
+### Software
+
+| Component | Version | Notes |
+|------|----------|------|
+| Docker | 20.0+ | Container runtime |
+| Docker Compose | 2.0+ | Container orchestration |
+| Apache Doris | 2.1.10+ | Analytics database; deployed separately |
+
+### Hardware
+
+| Resource | Minimum | Recommended | Notes |
+|----------|----------|----------|------|
+| Memory | 8GB | 16GB+ | Runs multiple services concurrently |
+| Disk | 50GB | 100GB+ | Container data and logs |
+| Network | 1Gbps | 10Gbps | Data transfer performance |
+
+### Prerequisites
+
+1. **Doris cluster**
+    - Make sure the Doris cluster is running and stable
+    - Verify that the FE HTTP port (default 8030) and query port (default 9030) are reachable over the network
+    - On startup, Langfuse automatically creates the required database and tables in Doris
+
+2. **Network connectivity**
+    - The deployment host can reach Docker Hub to pull images
+    - The Langfuse services can reach the relevant Doris cluster ports
+    - Clients can reach the Langfuse Web service port
+
+:::tip Deployment advice
+Deploying the Langfuse service components (Web, Worker, Redis, PostgreSQL) with Docker is recommended, but Doris should be deployed separately for better performance and stability. See the official documentation for a detailed Doris deployment guide.
+:::
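
The port checks in the prerequisites can be sketched as a small Python helper. The hostnames and ports below are the documented defaults, not values confirmed for your environment; substitute your own FE address:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Check the two Doris FE ports Langfuse depends on (default ports assumed):
for name, port in [("FE HTTP", 8030), ("FE query", 9030)]:
    state = "reachable" if port_open("localhost", port) else "unreachable"
    print(f"{name} ({port}): {state}")
```

Running this from the Langfuse host before starting the stack catches firewall and addressing problems early.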
+
+## Configuration
+
+The Langfuse services are configured through a number of environment variables:
+
+### Doris analytics backend
+
+| Variable | Example | Description |
+|---------|--------|------|
+| `LANGFUSE_ANALYTICS_BACKEND` | `doris` | Use Doris as the analytics backend |
+| `DORIS_FE_HTTP_URL` | `http://localhost:8030` | Doris FE HTTP address |
+| `DORIS_FE_QUERY_PORT` | `9030` | Doris FE query port |
+| `DORIS_DB` | `langfuse` | Doris database name |
+| `DORIS_USER` | `root` | Doris username |
+| `DORIS_PASSWORD` | `123456` | Doris password |
+| `DORIS_MAX_OPEN_CONNECTIONS` | `100` | Maximum number of database connections |
+| `DORIS_REQUEST_TIMEOUT_MS` | `300000` | Request timeout in milliseconds |
+
+### Core services
+
+| Variable | Example | Description |
+|---------|--------|------|
+| `DATABASE_URL` | `postgresql://postgres:postgres@langfuse-postgres:5432/postgres` | PostgreSQL connection string |
+| `NEXTAUTH_SECRET` | `your-debug-secret-key-here-must-be-long-enough` | NextAuth secret used for session encryption |
+| `SALT` | `your-super-secret-salt-with-at-least-32-characters-for-encryption` | Encryption salt (at least 32 characters) |
+| `ENCRYPTION_KEY` | `0000000000000000000000000000000000000000000000000000000000000000` | Encryption key (64 characters) |
+| `NEXTAUTH_URL` | `http://localhost:3000` | Langfuse Web address |
+| `TZ` | `UTC` | System time zone |
+
+### Redis cache
+
+| Variable | Example | Description |
+|---------|------------------|------|
+| `REDIS_HOST` | `langfuse-redis` | Redis host |
+| `REDIS_PORT` | `6379` | Redis port |
+| `REDIS_AUTH` | `myredissecret` | Redis password |
+| `REDIS_TLS_ENABLED` | `false` | Whether to enable TLS |
+| `REDIS_TLS_CA` | `-` | TLS CA certificate path |
+| `REDIS_TLS_CERT` | `-` | TLS client certificate path |
+| `REDIS_TLS_KEY` | `-` | TLS private key path |
+
+### Data migration
+
+| Variable | Example | Description |
+|---------|--------|------|
+| `LANGFUSE_ENABLE_BACKGROUND_MIGRATIONS` | `false` | Disable background migrations (must be off when using Doris) |
+| `LANGFUSE_AUTO_DORIS_MIGRATION_DISABLED` | `false` | Keep automatic Doris migrations enabled (`false` means they run) |
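
As an illustrative sanity check (this helper is not part of Langfuse), the sketch below resolves the endpoints implied by the Doris variables above, falling back to the example defaults:

```python
import os

def doris_endpoints(env=None):
    """Resolve the Doris FE endpoints from the environment variables above."""
    env = os.environ if env is None else env
    http_url = env.get("DORIS_FE_HTTP_URL", "http://localhost:8030")
    # Host part of the HTTP URL, e.g. "localhost" from "http://localhost:8030"
    host = http_url.split("//", 1)[-1].split(":", 1)[0]
    query_port = int(env.get("DORIS_FE_QUERY_PORT", "9030"))
    db = env.get("DORIS_DB", "langfuse")
    return {
        "http": http_url,
        "mysql": f"mysql -h {host} -P {query_port} -D {db}",
    }

print(doris_endpoints({}))
```

The FE HTTP address and the MySQL-protocol query port are separate endpoints, which is why both variables must be set.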
+
+
+## Docker Compose Deployment
+
+### Before you start
+
+Here we provide a Compose example that can be started as-is; adjust the configuration to your needs.
+
+### Download the Compose bundle
+
+```shell
+wget 
https://apache-doris-releases.oss-cn-beijing.aliyuncs.com/extension/docker-langfuse-doris.tar.gz
+```
+
+The Compose file and configuration files are laid out as follows:
+
+```text
+docker-langfuse-doris
+├── docker-compose.yml
+└── doris-config
+    └── fe_custom.conf
+```
+
+### Deployment steps
+
+### 1. Start Compose
+
+```Bash
+docker compose up -d
+```
+
+```Bash
+# check
+$ docker compose up -d
+[+] Running 9/9
+ ✔ Network docker-langfuse-doris_doris_internal  Created    0.1s
+ ✔ Network docker-langfuse-doris_default         Created    0.1s
+ ✔ Container doris_fe                            Healthy   13.8s
+ ✔ Container langfuse-postgres                   Healthy   13.8s
+ ✔ Container langfuse-redis                      Healthy   13.8s
+ ✔ Container langfuse-minio                      Healthy   13.8s
+ ✔ Container doris_be                            Healthy   54.3s
+ ✔ Container langfuse-worker                     Started   54.8s
+ ✔ Container langfuse-web                        Started
+```
+
+### 2. Verify the deployment
+
+Check the service status:
+
+The Compose stack has started successfully once every service reports Healthy.
+
+```Bash
+$ docker compose ps
+NAME                IMAGE                             COMMAND                  SERVICE           CREATED         STATUS                        PORTS
+doris_be            apache/doris:be-2.1.11            "bash entry_point.sh"    doris_be          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8040->8040/tcp, :::8040->8040/tcp, 0.0.0.0:8060->8060/tcp, :::8060->8060/tcp, 0.0.0.0:9050->9050/tcp, :::9050->9050/tcp, 0.0.0.0:9060->9060/tcp, :::9060->9060/tcp
+doris_fe            apache/doris:fe-2.1.11            "bash init_fe.sh"        doris_fe          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8030->8030/tcp, :::8030->8030/tcp, 0.0.0.0:9010->9010/tcp, :::9010->9010/tcp, 0.0.0.0:9030->9030/tcp, :::9030->9030/tcp
+langfuse-minio      minio/minio                       "sh -c 'mkdir -p /da…"   minio             2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:19090->9000/tcp, :::19090->9000/tcp, 127.0.0.1:19091->9001/tcp
+langfuse-postgres   postgres:latest                   "docker-entrypoint.s…"   postgres          2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:5432->5432/tcp
+langfuse-redis      redis:7                           "docker-entrypoint.s…"   redis             2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:16379->6379/tcp
+langfuse-web        selectdb/langfuse-web:latest      "dumb-init -- ./web/…"   langfuse-web      2 minutes ago   Up About a minute (healthy)   0.0.0.0:13000->3000/tcp, :::13000->3000/tcp
+langfuse-worker     selectdb/langfuse-worker:latest   "dumb-init -- ./work…"   langfuse-worker   2 minutes ago   Up About a minute (healthy)   0.0.0.0:3030->3030/tcp, :::3030->3030/tcp
+```
+
+
+### 3. Initialize the service
+
+After deployment completes, access and initialize the service as follows:
+
+**Open the Langfuse Web UI**:
+- URL: http://localhost:3000
+
+**Initialization steps**:
+1. Open http://localhost:3000 in a browser
+2. Create an admin account and sign in
+3. Create a new organization and a new project
+4. Obtain the project's API keys (Public Key and Secret Key)
+5. Configure the credentials required for SDK integration
+
+
+# Examples
+
+## Using Langfuse SDK
+
+```Python
+import os
+# Instead of: import openai
+from langfuse.openai import OpenAI
+# from langfuse import observe
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+
+# Use the OpenAI client to access the DeepSeek API
+# (the API key is expected in the OPENAI_API_KEY environment variable)
+client = OpenAI(
+    base_url="https://api.deepseek.com"
+)
+
+
+# ask a question
+question = "What are the key features of the Doris observability solution? Answer concisely."
+print(f"question: {question}")
+
+completion = client.chat.completions.create(
+    model="deepseek-chat",
+    messages=[
+        {"role": "user", "content": question}
+    ]
+)
+response = completion.choices[0].message.content
+print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+## Using LangChain SDK
+
+```Python
+import os
+from langfuse.langchain import CallbackHandler
+from langchain_openai import ChatOpenAI
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+# Create your LangChain components (using the DeepSeek API;
+# the API key is expected in the OPENAI_API_KEY environment variable)
+llm = ChatOpenAI(
+    model="deepseek-chat",
+    openai_api_base="https://api.deepseek.com"
+)
+
+# ask a question
+question = "What are the key features of the Doris observability solution? Answer concisely."
+print(f"question: {question}\n")
+
+# Run your chain with Langfuse tracing
+try:
+    # Initialize the Langfuse handler
+    langfuse_handler = CallbackHandler()
+    response = llm.invoke(question, config={"callbacks": [langfuse_handler]})
+    print(f"response: {response.content}")
+except Exception as e:
+    print(f"Error during chain execution: {e}")
+```
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+## Using LlamaIndex SDK
+
+```Python
+import os
+
+from langfuse import get_client
+from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
+from llama_index.llms.deepseek import DeepSeek
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+langfuse = get_client()
+
+
+# Initialize LlamaIndex instrumentation
+LlamaIndexInstrumentor().instrument()
+
+
+# Set up the DeepSeek class with the required model and API key
+llm = DeepSeek(model="deepseek-chat")
+
+
+# ask a question
+question = "What are the key features of the Doris observability solution? Answer concisely."
+print(f"question: {question}\n")
+
+with langfuse.start_as_current_span(name="llama-index-trace"):
+    response = llm.complete(question)
+    print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_3.png)
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/ecosystem/observability/langfuse.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/ecosystem/observability/langfuse.md
new file mode 100644
index 00000000000..92f445036c3
--- /dev/null
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/ecosystem/observability/langfuse.md
@@ -0,0 +1,333 @@
+---
+{
+    "title": "Langfuse on Doris",
+    "language": "zh-CN"
+}
+---
+
+# Langfuse on Doris
+
+## About Langfuse
+
+Langfuse is an open-source LLM engineering platform that provides comprehensive observability for large language model applications. Its core capabilities are:
+
+- **Tracing**: records the full call chain and execution flow of an LLM application
+- **Evaluation**: multi-dimensional model performance evaluation and quality analysis
+- **Prompt management**: centralized management and version control of prompt templates
+- **Metrics monitoring**: real-time monitoring of application performance, cost, and quality metrics
+
+This document describes how to deploy Langfuse with Apache Doris as the analytics backend, using Doris's OLAP capabilities to process LLM application data at scale.
+
+
+## System Architecture
+
+The Langfuse on Doris solution uses a microservice architecture built from the following core components:
+
+| Component       | Port(s)    | Description                                                                      |
+|-----------------|------------|----------------------------------------------------------------------------------|
+| Langfuse Web    | 3000       | Web UI and API service; user interaction and data ingestion                      |
+| Langfuse Worker | 3030       | Asynchronous task processing; data processing and analysis jobs                  |
+| PostgreSQL      | 5432       | Transactional storage; user configuration and metadata                           |
+| Redis           | 6379       | Cache layer and message queue; improves responsiveness                           |
+| MinIO           | 9090       | Object storage; raw events and multi-modal attachments                           |
+| Doris FE        | 9030, 8030 | Doris frontend; receives user requests, parses and plans queries, manages metadata and nodes |
+| Doris BE        | 8040, 8050 | Doris backend; stores data and executes query plans. Data is split into shards and stored on BEs with multiple replicas. |
+
+:::note
+
+When deploying Apache Doris, you can choose either the integrated or the decoupled compute-storage architecture, depending on your hardware and workload.
+For Langfuse, running Doris in Docker is not recommended in production; the FE/BE containers bundled here exist only to let you quickly try out Langfuse on Doris.
+
+:::
+
+```mermaid
+flowchart TB
+    User["UI, API, SDKs"]
+    subgraph vpc["VPC"]
+        Web["Web Server<br/>(langfuse/langfuse)"]
+        Worker["Async Worker<br/>(langfuse/worker)"]
+        Postgres["Postgres - OLTP<br/>(Transactional Data)"]
+        Cache["Redis/Valkey<br/>(Cache, Queue)"]
+        Doris["Doris - OLAP<br/>(Observability Data)"]
+        S3["S3 / Blob Storage<br/>(Raw events, multi-modal attachments)"]
+    end
+    LLM["LLM API/Gateway<br/>(optional)"]
+
+    User --> Web
+    Web --> S3
+    Web --> Postgres
+    Web --> Cache
+    Web --> Doris
+    Web -.->|"optional for playground"| LLM
+
+    Cache --> Worker
+    Worker --> Doris
+    Worker --> Postgres
+    Worker --> S3
+    Worker -.->|"optional for evals"| LLM
+```
+
+## Deployment Requirements
+
+### Software
+
+| Component | Version | Notes |
+|------|----------|------|
+| Docker | 20.0+ | Container runtime |
+| Docker Compose | 2.0+ | Container orchestration |
+| Apache Doris | 2.1.10+ | Analytics database; deployed separately |
+
+### Hardware
+
+| Resource | Minimum | Recommended | Notes |
+|----------|----------|----------|------|
+| Memory | 8GB | 16GB+ | Runs multiple services concurrently |
+| Disk | 50GB | 100GB+ | Container data and logs |
+| Network | 1Gbps | 10Gbps | Data transfer performance |
+
+### Prerequisites
+
+1. **Doris cluster**
+    - Make sure the Doris cluster is running and stable
+    - Verify that the FE HTTP port (default 8030) and query port (default 9030) are reachable over the network
+    - On startup, Langfuse automatically creates the required database and tables in Doris
+
+2. **Network connectivity**
+    - The deployment host can reach Docker Hub to pull images
+    - The Langfuse services can reach the relevant Doris cluster ports
+    - Clients can reach the Langfuse Web service port
+
+:::tip Deployment advice
+Deploying the Langfuse service components (Web, Worker, Redis, PostgreSQL) with Docker is recommended, but Doris should be deployed separately for better performance and stability. See the official documentation for a detailed Doris deployment guide.
+:::
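
The port checks in the prerequisites can be sketched as a small Python helper. The hostnames and ports below are the documented defaults, not values confirmed for your environment; substitute your own FE address:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Check the two Doris FE ports Langfuse depends on (default ports assumed):
for name, port in [("FE HTTP", 8030), ("FE query", 9030)]:
    state = "reachable" if port_open("localhost", port) else "unreachable"
    print(f"{name} ({port}): {state}")
```

Running this from the Langfuse host before starting the stack catches firewall and addressing problems early.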
+
+## Configuration
+
+The Langfuse services are configured through a number of environment variables:
+
+### Doris analytics backend
+
+| Variable | Example | Description |
+|---------|--------|------|
+| `LANGFUSE_ANALYTICS_BACKEND` | `doris` | Use Doris as the analytics backend |
+| `DORIS_FE_HTTP_URL` | `http://localhost:8030` | Doris FE HTTP address |
+| `DORIS_FE_QUERY_PORT` | `9030` | Doris FE query port |
+| `DORIS_DB` | `langfuse` | Doris database name |
+| `DORIS_USER` | `root` | Doris username |
+| `DORIS_PASSWORD` | `123456` | Doris password |
+| `DORIS_MAX_OPEN_CONNECTIONS` | `100` | Maximum number of database connections |
+| `DORIS_REQUEST_TIMEOUT_MS` | `300000` | Request timeout in milliseconds |
+
+### Core services
+
+| Variable | Example | Description |
+|---------|--------|------|
+| `DATABASE_URL` | `postgresql://postgres:postgres@langfuse-postgres:5432/postgres` | PostgreSQL connection string |
+| `NEXTAUTH_SECRET` | `your-debug-secret-key-here-must-be-long-enough` | NextAuth secret used for session encryption |
+| `SALT` | `your-super-secret-salt-with-at-least-32-characters-for-encryption` | Encryption salt (at least 32 characters) |
+| `ENCRYPTION_KEY` | `0000000000000000000000000000000000000000000000000000000000000000` | Encryption key (64 characters) |
+| `NEXTAUTH_URL` | `http://localhost:3000` | Langfuse Web address |
+| `TZ` | `UTC` | System time zone |
+
+### Redis cache
+
+| Variable | Example | Description |
+|---------|------------------|------|
+| `REDIS_HOST` | `langfuse-redis` | Redis host |
+| `REDIS_PORT` | `6379` | Redis port |
+| `REDIS_AUTH` | `myredissecret` | Redis password |
+| `REDIS_TLS_ENABLED` | `false` | Whether to enable TLS |
+| `REDIS_TLS_CA` | `-` | TLS CA certificate path |
+| `REDIS_TLS_CERT` | `-` | TLS client certificate path |
+| `REDIS_TLS_KEY` | `-` | TLS private key path |
+
+### Data migration
+
+| Variable | Example | Description |
+|---------|--------|------|
+| `LANGFUSE_ENABLE_BACKGROUND_MIGRATIONS` | `false` | Disable background migrations (must be off when using Doris) |
+| `LANGFUSE_AUTO_DORIS_MIGRATION_DISABLED` | `false` | Keep automatic Doris migrations enabled (`false` means they run) |
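
As an illustrative sanity check (this helper is not part of Langfuse), the sketch below resolves the endpoints implied by the Doris variables above, falling back to the example defaults:

```python
import os

def doris_endpoints(env=None):
    """Resolve the Doris FE endpoints from the environment variables above."""
    env = os.environ if env is None else env
    http_url = env.get("DORIS_FE_HTTP_URL", "http://localhost:8030")
    # Host part of the HTTP URL, e.g. "localhost" from "http://localhost:8030"
    host = http_url.split("//", 1)[-1].split(":", 1)[0]
    query_port = int(env.get("DORIS_FE_QUERY_PORT", "9030"))
    db = env.get("DORIS_DB", "langfuse")
    return {
        "http": http_url,
        "mysql": f"mysql -h {host} -P {query_port} -D {db}",
    }

print(doris_endpoints({}))
```

The FE HTTP address and the MySQL-protocol query port are separate endpoints, which is why both variables must be set.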
+
+
+## Docker Compose Deployment
+
+### Before you start
+
+Here we provide a Compose example that can be started as-is; adjust the configuration to your needs.
+
+### Download the Compose bundle
+
+```shell
+wget 
https://apache-doris-releases.oss-cn-beijing.aliyuncs.com/extension/docker-langfuse-doris.tar.gz
+```
+
+The Compose file and configuration files are laid out as follows:
+
+```text
+docker-langfuse-doris
+├── docker-compose.yml
+└── doris-config
+    └── fe_custom.conf
+```
+
+### Deployment steps
+
+### 1. Start Compose
+
+```Bash
+docker compose up -d
+```
+
+```Bash
+# check
+$ docker compose up -d
+[+] Running 9/9
+ ✔ Network docker-langfuse-doris_doris_internal  Created    0.1s
+ ✔ Network docker-langfuse-doris_default         Created    0.1s
+ ✔ Container doris_fe                            Healthy   13.8s
+ ✔ Container langfuse-postgres                   Healthy   13.8s
+ ✔ Container langfuse-redis                      Healthy   13.8s
+ ✔ Container langfuse-minio                      Healthy   13.8s
+ ✔ Container doris_be                            Healthy   54.3s
+ ✔ Container langfuse-worker                     Started   54.8s
+ ✔ Container langfuse-web                        Started
+```
+
+### 2. Verify the deployment
+
+Check the service status:
+
+The Compose stack has started successfully once every service reports Healthy.
+
+```Bash
+$ docker compose ps
+NAME                IMAGE                             COMMAND                  SERVICE           CREATED         STATUS                        PORTS
+doris_be            apache/doris:be-2.1.11            "bash entry_point.sh"    doris_be          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8040->8040/tcp, :::8040->8040/tcp, 0.0.0.0:8060->8060/tcp, :::8060->8060/tcp, 0.0.0.0:9050->9050/tcp, :::9050->9050/tcp, 0.0.0.0:9060->9060/tcp, :::9060->9060/tcp
+doris_fe            apache/doris:fe-2.1.11            "bash init_fe.sh"        doris_fe          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8030->8030/tcp, :::8030->8030/tcp, 0.0.0.0:9010->9010/tcp, :::9010->9010/tcp, 0.0.0.0:9030->9030/tcp, :::9030->9030/tcp
+langfuse-minio      minio/minio                       "sh -c 'mkdir -p /da…"   minio             2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:19090->9000/tcp, :::19090->9000/tcp, 127.0.0.1:19091->9001/tcp
+langfuse-postgres   postgres:latest                   "docker-entrypoint.s…"   postgres          2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:5432->5432/tcp
+langfuse-redis      redis:7                           "docker-entrypoint.s…"   redis             2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:16379->6379/tcp
+langfuse-web        selectdb/langfuse-web:latest      "dumb-init -- ./web/…"   langfuse-web      2 minutes ago   Up About a minute (healthy)   0.0.0.0:13000->3000/tcp, :::13000->3000/tcp
+langfuse-worker     selectdb/langfuse-worker:latest   "dumb-init -- ./work…"   langfuse-worker   2 minutes ago   Up About a minute (healthy)   0.0.0.0:3030->3030/tcp, :::3030->3030/tcp
+```
+
+
+### 3. 服务初始化
+
+部署完成后,通过以下方式访问和初始化服务:
+
+**访问 Langfuse Web 界面**:
+- 地址:http://localhost:3000
+
+**初始化步骤**:
+1. 打开浏览器访问 http://localhost:3000
+2. 创建管理员账户并登录
+3. 创建新组织与新项目
+4. 获取项目的 API Keys(Public Key 和 Secret Key)
+5. 配置 SDK 集成所需的认证信息
+
+
+## Examples
+
+### Using Langfuse SDK
+
+```Python
+import os
+# Instead of: import openai
+from langfuse.openai import OpenAI
+# from langfuse import observe
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******" 
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+
+# Use the OpenAI client to access the DeepSeek API.
+# The API key is read from the OPENAI_API_KEY environment variable.
+client = OpenAI(
+    base_url="https://api.deepseek.com"
+)
+
+
+# ask a question
+question = "Doris 可观测性解决方案的特点是什么?回答简洁清晰"
+print(f"question: {question}")
+
+completion = client.chat.completions.create(
+    model="deepseek-chat",
+    messages=[
+        {"role": "user", "content": question}
+    ]
+)
+response = completion.choices[0].message.content
+print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+### Using LangChain SDK
+
+```Python
+import os
+from langfuse.langchain import CallbackHandler
+from langchain_openai import ChatOpenAI
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******" 
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+# Create your LangChain components (using DeepSeek API)
+llm = ChatOpenAI(
+    model="deepseek-chat",
+    openai_api_base="https://api.deepseek.com"
+)
+
+# ask a question
+question = "Doris 可观测性解决方案的特点是什么?回答简洁清晰"
+print(f"question: {question} \n")
+
+# Run your chain with Langfuse tracing
+try:
+    # Initialize the Langfuse handler
+    langfuse_handler = CallbackHandler()
+    response = llm.invoke(question, config={"callbacks": [langfuse_handler]})
+    print(f"response: {response.content}")
+except Exception as e:
+    print(f"Error during chain execution: {e}")
+```
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+### Using LlamaIndex SDK
+
+```Python
+import os
+from langfuse import get_client
+from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
+from llama_index.llms.deepseek import DeepSeek
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******" 
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+langfuse = get_client()
+
+
+# Initialize LlamaIndex instrumentation
+LlamaIndexInstrumentor().instrument()
+
+
+# Set up the DeepSeek class with the required model and API key
+llm = DeepSeek(model="deepseek-chat")
+
+
+# ask a question
+question = "Doris 可观测性解决方案的特点是什么?回答简洁清晰"
+print(f"question: {question} \n")
+ 
+with langfuse.start_as_current_span(name="llama-index-trace"):
+    response = llm.complete(question)
+    print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_3.png)
diff --git a/sidebars.ts b/sidebars.ts
index 8ca38d52ad6..d8ec0c98d14 100644
--- a/sidebars.ts
+++ b/sidebars.ts
@@ -549,6 +549,7 @@ const sidebars: SidebarsConfig = {
                                 'ecosystem/observability/opentelemetry',
                                 'ecosystem/observability/fluentbit',
                                 'ecosystem/observability/loongcollector',
+                                'ecosystem/observability/langfuse',
                                 'ecosystem/observability/vector',
                             ],
                         },
@@ -993,6 +994,7 @@ const sidebars: SidebarsConfig = {
                         'ecosystem/observability/opentelemetry',
                         'ecosystem/observability/fluentbit',
                         'ecosystem/observability/loongcollector',
+                        'ecosystem/observability/langfuse',
                         'ecosystem/observability/vector',
                     ],
                 },
diff --git a/static/images/ecomsystem/langfuse/langfuse_1.png 
b/static/images/ecomsystem/langfuse/langfuse_1.png
new file mode 100644
index 00000000000..12af119d0cf
Binary files /dev/null and b/static/images/ecomsystem/langfuse/langfuse_1.png 
differ
diff --git a/static/images/ecomsystem/langfuse/langfuse_2.png 
b/static/images/ecomsystem/langfuse/langfuse_2.png
new file mode 100644
index 00000000000..83c5fcd0a36
Binary files /dev/null and b/static/images/ecomsystem/langfuse/langfuse_2.png 
differ
diff --git a/static/images/ecomsystem/langfuse/langfuse_3.png 
b/static/images/ecomsystem/langfuse/langfuse_3.png
new file mode 100644
index 00000000000..0115f47027b
Binary files /dev/null and b/static/images/ecomsystem/langfuse/langfuse_3.png 
differ
diff --git a/versioned_docs/version-2.1/ecosystem/observability/langfuse.md 
b/versioned_docs/version-2.1/ecosystem/observability/langfuse.md
new file mode 100644
index 00000000000..3f010c0f067
--- /dev/null
+++ b/versioned_docs/version-2.1/ecosystem/observability/langfuse.md
@@ -0,0 +1,331 @@
+---
+{
+    "title": "Langfuse on Doris",
+    "language": "en"
+}
+---
+
+# Langfuse on Doris
+
+## About Langfuse
+
+Langfuse is an open-source LLM engineering platform that provides 
comprehensive observability solutions for large language model applications. It 
offers the following core features:
+
+- **Tracing**: Complete recording of LLM application call chains and execution 
flows
+- **Evaluation**: Multi-dimensional model performance evaluation and quality 
analysis
+- **Prompt Management**: Centralized management and version control of prompt 
templates
+- **Metrics Monitoring**: Real-time monitoring of application performance, 
cost, and quality metrics
+
+This document provides detailed instructions on how to deploy a Langfuse 
solution using Apache Doris as the analytics backend, fully leveraging Doris's 
powerful OLAP analytics capabilities to process large-scale LLM application 
data.
+
+
+## System Architecture
+
+The Langfuse on Doris solution uses a microservices architecture with the 
following core components:
+
+| Component       | Ports      | Description |
+|-----------------|------------|-------------|
+| Langfuse Web    | 3000       | Web interface and API service for user interaction and data ingestion |
+| Langfuse Worker | 3030       | Async task processing for data processing and analytics tasks |
+| PostgreSQL      | 5432       | Transactional data storage for user configuration and metadata |
+| Redis           | 6379       | Cache layer and message queue for improved system response performance |
+| MinIO           | 9090       | Object storage service for raw events and multi-modal attachments |
+| Doris FE        | 9030, 8030 | Doris Frontend, part of the Doris architecture, responsible for receiving user requests, query parsing and planning, metadata management, and node management |
+| Doris BE        | 8040, 9060 | Doris Backend, part of the Doris architecture, responsible for data storage and query plan execution. Data is split into shards and stored with multiple replicas on BE nodes. |
+
+:::note
+
+When deploying Apache Doris, you can choose between the integrated compute-storage architecture and the disaggregated compute-storage architecture based on your hardware environment and business requirements.
+For Langfuse, running Doris in Docker is not recommended in production; the FE and BE containers in this guide's Compose file are intended only for quickly trying out Langfuse on Doris.
+
+:::
+
+```mermaid
+flowchart TB
+    User["UI, API, SDKs"]
+    subgraph vpc["VPC"]
+        Web["Web Server<br/>(langfuse/langfuse)"]
+        Worker["Async Worker<br/>(langfuse/worker)"]
+        Postgres["Postgres - OLTP<br/>(Transactional Data)"]
+        Cache["Redis/Valkey<br/>(Cache, Queue)"]
+        Doris["Doris - OLAP<br/>(Observability Data)"]
+        S3["S3 / Blob Storage<br/>(Raw events, multi-modal attachments)"]
+    end
+    LLM["LLM API/Gateway<br/>(optional)"]
+
+    User --> Web
+    Web --> S3
+    Web --> Postgres
+    Web --> Cache
+    Web --> Doris
+    Web -.->|"optional for playground"| LLM
+
+    Cache --> Worker
+    Worker --> Doris
+    Worker --> Postgres
+    Worker --> S3
+    Worker -.->|"optional for evals"| LLM
+```
+
+## Deployment Requirements
+
+### Software Environment
+
+| Component | Version | Description |
+|------|----------|------|
+| Docker | 20.0+ | Container runtime environment |
+| Docker Compose | 2.0+ | Container orchestration tool |
+| Apache Doris | 2.1.10+ | Analytics database, requires separate deployment |
+
+### Hardware Resources
+
+| Resource Type | Minimum | Recommended | Description |
+|----------|----------|----------|------|
+| Memory | 8GB | 16GB+ | Supports multi-service concurrent operation |
+| Disk | 50GB | 100GB+ | Storage for container data and logs |
+| Network | 1Gbps | 10Gbps | Ensures data transfer performance |
+
+### Prerequisites
+
+1. **Doris Cluster Preparation**
+    - Ensure the Doris cluster is running properly with stable performance
+    - Verify that FE HTTP port (default 8030) and query port (default 9030) are network accessible
+    - Langfuse will automatically create the required database and table structures in Doris after startup
+
+2. **Network Connectivity**
+    - Deployment environment can access Docker Hub to pull images
+    - Langfuse services can access the relevant ports of the Doris cluster
+    - Clients can access the Langfuse Web service port
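
Before starting the stack, the network prerequisites above can be checked from the deployment host. The snippet below is a minimal sketch using only the Python standard library; the hostname `localhost` and the default ports 8030/9030 are assumptions taken from this guide and may differ in your deployment.

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Default Doris FE ports used in this guide; adjust to your environment.
    for name, port in [("FE HTTP", 8030), ("FE query", 9030)]:
        state = "reachable" if port_open("localhost", port) else "unreachable"
        print(f"{name} port {port}: {state}")
```

The same check can be pointed at the Langfuse Web port once the stack is up.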
+
+:::tip Deployment Recommendation
+We recommend using Docker to deploy the Langfuse service components (Web, Worker, Redis, PostgreSQL), while deploying Doris separately for better performance and stability. Refer to the official Doris documentation for a detailed deployment guide.
+:::
+
+## Configuration Parameters
+
+Langfuse services require multiple environment variables to support the proper 
operation of each component:
+
+### Doris Analytics Backend Configuration
+
+| Parameter | Example Value | Description |
+|---------|--------|------|
+| `LANGFUSE_ANALYTICS_BACKEND` | `doris` | Specify Doris as the analytics backend |
+| `DORIS_FE_HTTP_URL` | `http://localhost:8030` | Doris FE HTTP service address |
+| `DORIS_FE_QUERY_PORT` | `9030` | Doris FE query port |
+| `DORIS_DB` | `langfuse` | Doris database name |
+| `DORIS_USER` | `root` | Doris username |
+| `DORIS_PASSWORD` | `123456` | Doris password |
+| `DORIS_MAX_OPEN_CONNECTIONS` | `100` | Maximum database connections |
+| `DORIS_REQUEST_TIMEOUT_MS` | `300000` | Request timeout in milliseconds |
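
To catch obvious mistakes before running `docker compose up`, the values from this table can be sanity-checked in a few lines. This is an illustrative sketch, not part of Langfuse; the example values mirror the table above, and the checks are only the basics.

```python
from urllib.parse import urlparse

# Example values from the table above; replace with your deployment's own.
doris_env = {
    "LANGFUSE_ANALYTICS_BACKEND": "doris",
    "DORIS_FE_HTTP_URL": "http://localhost:8030",
    "DORIS_FE_QUERY_PORT": "9030",
    "DORIS_DB": "langfuse",
    "DORIS_USER": "root",
    "DORIS_PASSWORD": "123456",
    "DORIS_MAX_OPEN_CONNECTIONS": "100",
    "DORIS_REQUEST_TIMEOUT_MS": "300000",
}

def validate_doris_env(env: dict) -> list:
    """Return a list of human-readable problems; an empty list means the basics look OK."""
    problems = []
    url = urlparse(env.get("DORIS_FE_HTTP_URL", ""))
    if url.scheme not in ("http", "https") or not url.hostname:
        problems.append("DORIS_FE_HTTP_URL must be an http(s) URL with a host")
    for key in ("DORIS_FE_QUERY_PORT", "DORIS_MAX_OPEN_CONNECTIONS", "DORIS_REQUEST_TIMEOUT_MS"):
        if not env.get(key, "").isdigit():
            problems.append(f"{key} must be an integer string")
    if env.get("LANGFUSE_ANALYTICS_BACKEND") != "doris":
        problems.append("LANGFUSE_ANALYTICS_BACKEND must be set to 'doris'")
    return problems

print(validate_doris_env(doris_env))  # → []
```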
+
+### Basic Service Configuration
+
+| Parameter | Example Value | Description |
+|---------|--------|------|
+| `DATABASE_URL` | `postgresql://postgres:postgres@langfuse-postgres:5432/postgres` | PostgreSQL database connection URL |
+| `NEXTAUTH_SECRET` | `your-debug-secret-key-here-must-be-long-enough` | NextAuth authentication key for session encryption |
+| `SALT` | `your-super-secret-salt-with-at-least-32-characters-for-encryption` | Data encryption salt (at least 32 characters) |
+| `ENCRYPTION_KEY` | `0000000000000000000000000000000000000000000000000000000000000000` | Data encryption key (64 hex characters) |
+| `NEXTAUTH_URL` | `http://localhost:3000` | Langfuse Web service address |
+| `TZ` | `UTC` | System timezone setting |
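
Rather than shipping the placeholder values above, generate real secrets. The sketch below uses Python's standard `secrets` module (a shell equivalent for the encryption key would be `openssl rand -hex 32`); the variable names simply mirror the table and are otherwise illustrative.

```python
import secrets

# NEXTAUTH_SECRET: any sufficiently long random string.
nextauth_secret = secrets.token_urlsafe(32)

# SALT: must be at least 32 characters; token_urlsafe(32) yields 43.
salt = secrets.token_urlsafe(32)

# ENCRYPTION_KEY: 64 hex characters, i.e. 32 random bytes hex-encoded.
encryption_key = secrets.token_hex(32)

print(len(encryption_key))  # → 64
```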
+
+### Redis Cache Configuration
+
+| Parameter | Example Value              | Description |
+|---------|------------------|------|
+| `REDIS_HOST` | `langfuse-redis` | Redis service host address |
+| `REDIS_PORT` | `6379`           | Redis service port |
+| `REDIS_AUTH` | `myredissecret`  | Redis authentication password |
+| `REDIS_TLS_ENABLED` | `false`          | Whether to enable TLS encryption |
+| `REDIS_TLS_CA` | `-`              | TLS CA certificate path |
+| `REDIS_TLS_CERT` | `-`              | TLS client certificate path |
+| `REDIS_TLS_KEY` | `-`              | TLS private key path |
+
+### Data Migration Configuration
+
+| Parameter | Example Value | Description |
+|---------|--------|------|
+| `LANGFUSE_ENABLE_BACKGROUND_MIGRATIONS` | `false` | Disable background migrations (must be disabled when using Doris) |
+| `LANGFUSE_AUTO_DORIS_MIGRATION_DISABLED` | `false` | Enable Doris auto migration |
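
The parameter tables above can be collected into a single `.env` file consumed by Docker Compose. The helper below is an illustrative sketch; only a subset of keys is shown, and every value is an example from this guide rather than a working secret.

```python
# Sketch: render configuration values into .env format for Docker Compose.
# Replace the example values with your own before deploying.
env_settings = {
    "LANGFUSE_ANALYTICS_BACKEND": "doris",
    "DORIS_FE_HTTP_URL": "http://localhost:8030",
    "DORIS_FE_QUERY_PORT": "9030",
    "DORIS_DB": "langfuse",
    "LANGFUSE_ENABLE_BACKGROUND_MIGRATIONS": "false",
    "LANGFUSE_AUTO_DORIS_MIGRATION_DISABLED": "false",
}

def render_dotenv(settings: dict) -> str:
    """Render a settings dict as KEY=VALUE lines, one per setting."""
    return "\n".join(f"{key}={value}" for key, value in settings.items()) + "\n"

print(render_dotenv(env_settings))
```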
+
+
+## Docker Compose Deployment
+
+### Pre-deployment Preparation
+
+Here we provide a compose example that can be started directly. Modify the 
configuration according to your requirements.
+
+### Download docker compose
+
+```shell
+wget https://apache-doris-releases.oss-cn-beijing.aliyuncs.com/extension/docker-langfuse-doris.tar.gz
+tar -xzf docker-langfuse-doris.tar.gz
+```
+
+After extracting the archive, the directory structure of the Compose file and configuration file is as follows:
+```text
+docker-langfuse-doris
+├── docker-compose.yml
+└── doris-config
+    └── fe_custom.conf
+```
+
+### Deployment Steps
+
+### 1. Start compose
+
+```Bash
+docker compose up -d
+```
+
+```Bash
+# Check
+$ docker compose up -d
+[+] Running 9/9
+ ✔ Network docker-langfuse-doris_doris_internal  Created     0.1s
+ ✔ Network docker-langfuse-doris_default         Created     0.1s
+ ✔ Container doris_fe                            Healthy    13.8s
+ ✔ Container langfuse-postgres                   Healthy    13.8s
+ ✔ Container langfuse-redis                      Healthy    13.8s
+ ✔ Container langfuse-minio                      Healthy    13.8s
+ ✔ Container doris_be                            Healthy    54.3s
+ ✔ Container langfuse-worker                     Started    54.8s
+ ✔ Container langfuse-web                        Started
+```
+
+### 2. Verify Deployment
+
+Check service status:
+
+When all services report a Healthy status, the Compose stack has started successfully.
+
+```Bash
+$ docker compose ps
+NAME                IMAGE                             COMMAND                  SERVICE           CREATED         STATUS                        PORTS
+doris_be            apache/doris:be-2.1.11            "bash entry_point.sh"    doris_be          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8040->8040/tcp, :::8040->8040/tcp, 0.0.0.0:8060->8060/tcp, :::8060->8060/tcp, 0.0.0.0:9050->9050/tcp, :::9050->9050/tcp, 0.0.0.0:9060->9060/tcp, :::9060->9060/tcp
+doris_fe            apache/doris:fe-2.1.11            "bash init_fe.sh"        doris_fe          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8030->8030/tcp, :::8030->8030/tcp, 0.0.0.0:9010->9010/tcp, :::9010->9010/tcp, 0.0.0.0:9030->9030/tcp, :::9030->9030/tcp
+langfuse-minio      minio/minio                       "sh -c 'mkdir -p /da…"   minio             2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:19090->9000/tcp, :::19090->9000/tcp, 127.0.0.1:19091->9001/tcp
+langfuse-postgres   postgres:latest                   "docker-entrypoint.s…"   postgres          2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:5432->5432/tcp
+langfuse-redis      redis:7                           "docker-entrypoint.s…"   redis             2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:16379->6379/tcp
+langfuse-web        selectdb/langfuse-web:latest      "dumb-init -- ./web/…"   langfuse-web      2 minutes ago   Up About a minute (healthy)   0.0.0.0:13000->3000/tcp, :::13000->3000/tcp
+langfuse-worker     selectdb/langfuse-worker:latest   "dumb-init -- ./work…"   langfuse-worker   2 minutes ago   Up About a minute (healthy)   0.0.0.0:3030->3030/tcp, :::3030->3030/tcp
+```
+
+
+### 3. Service Initialization
+
+After deployment is complete, access and initialize the service as follows:
+
+**Access Langfuse Web Interface**:
+- URL: http://localhost:3000
+
+**Initialization Steps**:
+1. Open your browser and navigate to http://localhost:3000
+2. Create an administrator account and log in
+3. Create a new organization and project
+4. Obtain the project's API Keys (Public Key and Secret Key)
+5. Configure the authentication information required for SDK integration
+
+
+## Examples
+
+### Using Langfuse SDK
+
+```Python
+import os
+# Instead of: import openai
+from langfuse.openai import OpenAI
+# from langfuse import observe
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******" 
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+
+# use OpenAI client
+client = OpenAI()
+
+
+# ask a question
+question = "What are the key features of the Doris observability solution? Please answer concisely."
+print(f"question: {question}")
+
+completion = client.chat.completions.create(
+    model="gpt-4o",
+    messages=[
+        {"role": "user", "content": question}
+    ]
+)
+response = completion.choices[0].message.content
+print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+### Using LangChain SDK
+
+```Python
+import os
+from langfuse.langchain import CallbackHandler
+from langchain_openai import ChatOpenAI
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******" 
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+# Create your LangChain components (using OpenAI API)
+llm = ChatOpenAI(
+    model="gpt-4o"
+)
+
+# ask a question
+question = "What are the key features of the Doris observability solution? Please answer concisely."
+print(f"question: {question} \n")
+
+# Run your chain with Langfuse tracing
+try:
+    # Initialize the Langfuse handler
+    langfuse_handler = CallbackHandler()
+    response = llm.invoke(question, config={"callbacks": [langfuse_handler]})
+    print(f"response: {response.content}")
+except Exception as e:
+    print(f"Error during chain execution: {e}")
+```
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+### Using LlamaIndex SDK
+
+```Python
+import os
+from langfuse import get_client
+from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
+from llama_index.llms.openai import OpenAI
+
+# Langfuse config
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******" 
+os.environ["LANGFUSE_HOST"] = "http://localhost:3000"
+
+langfuse = get_client()
+
+
+# Initialize LlamaIndex instrumentation
+LlamaIndexInstrumentor().instrument()
+
+
+# Set up the OpenAI class with the required model
+llm = OpenAI(model="gpt-4o")
+
+
+# ask a question
+question = "What are the key features of the Doris observability solution? Please answer concisely."
+print(f"question: {question} \n")
+ 
+with langfuse.start_as_current_span(name="llama-index-trace"):
+    response = llm.complete(question)
+    print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_3.png)
diff --git a/versioned_docs/version-3.x/ecosystem/observability/langfuse.md 
b/versioned_docs/version-3.x/ecosystem/observability/langfuse.md
new file mode 100644
index 00000000000..3f010c0f067
--- /dev/null
+++ b/versioned_docs/version-3.x/ecosystem/observability/langfuse.md
@@ -0,0 +1,331 @@
+---
+{
+    "title": "Langfuse on Doris",
+    "language": "en"
+}
+---
+
+# Langfuse on Doris
+
+## About Langfuse
+
+Langfuse is an open-source LLM engineering platform that provides 
comprehensive observability solutions for large language model applications. It 
offers the following core features:
+
+- **Tracing**: Complete recording of LLM application call chains and execution 
flows
+- **Evaluation**: Multi-dimensional model performance evaluation and quality 
analysis
+- **Prompt Management**: Centralized management and version control of prompt 
templates
+- **Metrics Monitoring**: Real-time monitoring of application performance, 
cost, and quality metrics
+
+This document provides detailed instructions on how to deploy a Langfuse 
solution using Apache Doris as the analytics backend, fully leveraging Doris's 
powerful OLAP analytics capabilities to process large-scale LLM application 
data.
+
+
+## System Architecture
+
+The Langfuse on Doris solution uses a microservices architecture with the 
following core components:
+
+| Component       | Ports      | Description |
+|-----------------|------------|-------------|
+| Langfuse Web    | 3000       | Web interface and API service for user interaction and data ingestion |
+| Langfuse Worker | 3030       | Async task processing for data processing and analytics tasks |
+| PostgreSQL      | 5432       | Transactional data storage for user configuration and metadata |
+| Redis           | 6379       | Cache layer and message queue for improved system response performance |
+| MinIO           | 9090       | Object storage service for raw events and multi-modal attachments |
+| Doris FE        | 9030, 8030 | Doris Frontend, part of the Doris architecture, responsible for receiving user requests, query parsing and planning, metadata management, and node management |
+| Doris BE        | 8040, 9060 | Doris Backend, part of the Doris architecture, responsible for data storage and query plan execution. Data is split into shards and stored with multiple replicas on BE nodes. |
+
+:::note
+
+When deploying Apache Doris, you can choose between the integrated compute-storage architecture and the disaggregated compute-storage architecture based on your hardware environment and business requirements.
+For Langfuse, running Doris in Docker is not recommended in production; the FE and BE containers in this guide's Compose file are intended only for quickly trying out Langfuse on Doris.
+
+:::
+
+```mermaid
+flowchart TB
+    User["UI, API, SDKs"]
+    subgraph vpc["VPC"]
+        Web["Web Server<br/>(langfuse/langfuse)"]
+        Worker["Async Worker<br/>(langfuse/worker)"]
+        Postgres["Postgres - OLTP<br/>(Transactional Data)"]
+        Cache["Redis/Valkey<br/>(Cache, Queue)"]
+        Doris["Doris - OLAP<br/>(Observability Data)"]
+        S3["S3 / Blob Storage<br/>(Raw events, multi-modal attachments)"]
+    end
+    LLM["LLM API/Gateway<br/>(optional)"]
+
+    User --> Web
+    Web --> S3
+    Web --> Postgres
+    Web --> Cache
+    Web --> Doris
+    Web -.->|"optional for playground"| LLM
+
+    Cache --> Worker
+    Worker --> Doris
+    Worker --> Postgres
+    Worker --> S3
+    Worker -.->|"optional for evals"| LLM
+```
+
+## Deployment Requirements
+
+### Software Environment
+
+| Component | Version | Description |
+|------|----------|------|
+| Docker | 20.0+ | Container runtime environment |
+| Docker Compose | 2.0+ | Container orchestration tool |
+| Apache Doris | 2.1.10+ | Analytics database, requires separate deployment |
+
+### Hardware Resources
+
+| Resource Type | Minimum | Recommended | Description |
+|----------|----------|----------|------|
+| Memory | 8GB | 16GB+ | Supports multi-service concurrent operation |
+| Disk | 50GB | 100GB+ | Storage for container data and logs |
+| Network | 1Gbps | 10Gbps | Ensures data transfer performance |
+
+### Prerequisites
+
+1. **Doris Cluster Preparation**
+    - Ensure the Doris cluster is running properly with stable performance
+    - Verify that FE HTTP port (default 8030) and query port (default 9030) are network accessible
+    - Langfuse will automatically create the required database and table structures in Doris after startup
+
+2. **Network Connectivity**
+    - Deployment environment can access Docker Hub to pull images
+    - Langfuse services can access the relevant ports of the Doris cluster
+    - Clients can access the Langfuse Web service port
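
Before starting the stack, the network prerequisites above can be checked from the deployment host. The snippet below is a minimal sketch using only the Python standard library; the hostname `localhost` and the default ports 8030/9030 are assumptions taken from this guide and may differ in your deployment.

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Default Doris FE ports used in this guide; adjust to your environment.
    for name, port in [("FE HTTP", 8030), ("FE query", 9030)]:
        state = "reachable" if port_open("localhost", port) else "unreachable"
        print(f"{name} port {port}: {state}")
```

The same check can be pointed at the Langfuse Web port once the stack is up.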
+
+:::tip Deployment Recommendation
+We recommend using Docker to deploy the Langfuse service components (Web, Worker, Redis, PostgreSQL), while deploying Doris separately for better performance and stability. Refer to the official Doris documentation for a detailed deployment guide.
+:::
+
+## Configuration Parameters
+
+Langfuse services require multiple environment variables to support the proper 
operation of each component:
+
+### Doris Analytics Backend Configuration
+
+| Parameter | Example Value | Description |
+|---------|--------|------|
+| `LANGFUSE_ANALYTICS_BACKEND` | `doris` | Specify Doris as the analytics backend |
+| `DORIS_FE_HTTP_URL` | `http://localhost:8030` | Doris FE HTTP service address |
+| `DORIS_FE_QUERY_PORT` | `9030` | Doris FE query port |
+| `DORIS_DB` | `langfuse` | Doris database name |
+| `DORIS_USER` | `root` | Doris username |
+| `DORIS_PASSWORD` | `123456` | Doris password |
+| `DORIS_MAX_OPEN_CONNECTIONS` | `100` | Maximum database connections |
+| `DORIS_REQUEST_TIMEOUT_MS` | `300000` | Request timeout in milliseconds |
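
To catch obvious mistakes before running `docker compose up`, the values from this table can be sanity-checked in a few lines. This is an illustrative sketch, not part of Langfuse; the example values mirror the table above, and the checks are only the basics.

```python
from urllib.parse import urlparse

# Example values from the table above; replace with your deployment's own.
doris_env = {
    "LANGFUSE_ANALYTICS_BACKEND": "doris",
    "DORIS_FE_HTTP_URL": "http://localhost:8030",
    "DORIS_FE_QUERY_PORT": "9030",
    "DORIS_DB": "langfuse",
    "DORIS_USER": "root",
    "DORIS_PASSWORD": "123456",
    "DORIS_MAX_OPEN_CONNECTIONS": "100",
    "DORIS_REQUEST_TIMEOUT_MS": "300000",
}

def validate_doris_env(env: dict) -> list:
    """Return a list of human-readable problems; an empty list means the basics look OK."""
    problems = []
    url = urlparse(env.get("DORIS_FE_HTTP_URL", ""))
    if url.scheme not in ("http", "https") or not url.hostname:
        problems.append("DORIS_FE_HTTP_URL must be an http(s) URL with a host")
    for key in ("DORIS_FE_QUERY_PORT", "DORIS_MAX_OPEN_CONNECTIONS", "DORIS_REQUEST_TIMEOUT_MS"):
        if not env.get(key, "").isdigit():
            problems.append(f"{key} must be an integer string")
    if env.get("LANGFUSE_ANALYTICS_BACKEND") != "doris":
        problems.append("LANGFUSE_ANALYTICS_BACKEND must be set to 'doris'")
    return problems

print(validate_doris_env(doris_env))  # → []
```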
+
+### Basic Service Configuration
+
+| Parameter | Example Value | Description |
+|---------|--------|------|
+| `DATABASE_URL` | `postgresql://postgres:postgres@langfuse-postgres:5432/postgres` | PostgreSQL database connection URL |
+| `NEXTAUTH_SECRET` | `your-debug-secret-key-here-must-be-long-enough` | NextAuth authentication key for session encryption |
+| `SALT` | `your-super-secret-salt-with-at-least-32-characters-for-encryption` | Data encryption salt (at least 32 characters) |
+| `ENCRYPTION_KEY` | `0000000000000000000000000000000000000000000000000000000000000000` | Data encryption key (64 hex characters) |
+| `NEXTAUTH_URL` | `http://localhost:3000` | Langfuse Web service address |
+| `TZ` | `UTC` | System timezone setting |
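
Rather than shipping the placeholder values above, generate real secrets. The sketch below uses Python's standard `secrets` module (a shell equivalent for the encryption key would be `openssl rand -hex 32`); the variable names simply mirror the table and are otherwise illustrative.

```python
import secrets

# NEXTAUTH_SECRET: any sufficiently long random string.
nextauth_secret = secrets.token_urlsafe(32)

# SALT: must be at least 32 characters; token_urlsafe(32) yields 43.
salt = secrets.token_urlsafe(32)

# ENCRYPTION_KEY: 64 hex characters, i.e. 32 random bytes hex-encoded.
encryption_key = secrets.token_hex(32)

print(len(encryption_key))  # → 64
```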
+
+### Redis Cache Configuration
+
+| Parameter | Example Value              | Description |
+|---------|------------------|------|
+| `REDIS_HOST` | `langfuse-redis` | Redis service host address |
+| `REDIS_PORT` | `6379`           | Redis service port |
+| `REDIS_AUTH` | `myredissecret`  | Redis authentication password |
+| `REDIS_TLS_ENABLED` | `false`          | Whether to enable TLS encryption |
+| `REDIS_TLS_CA` | `-`              | TLS CA certificate path |
+| `REDIS_TLS_CERT` | `-`              | TLS client certificate path |
+| `REDIS_TLS_KEY` | `-`              | TLS private key path |
+
+### Data Migration Configuration
+
+| Parameter | Example Value | Description |
+|---------|--------|------|
+| `LANGFUSE_ENABLE_BACKGROUND_MIGRATIONS` | `false` | Disable background migrations (must be disabled when using Doris) |
+| `LANGFUSE_AUTO_DORIS_MIGRATION_DISABLED` | `false` | Enable Doris auto migration |
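
The parameter tables above can be collected into a single `.env` file consumed by Docker Compose. The helper below is an illustrative sketch; only a subset of keys is shown, and every value is an example from this guide rather than a working secret.

```python
# Sketch: render configuration values into .env format for Docker Compose.
# Replace the example values with your own before deploying.
env_settings = {
    "LANGFUSE_ANALYTICS_BACKEND": "doris",
    "DORIS_FE_HTTP_URL": "http://localhost:8030",
    "DORIS_FE_QUERY_PORT": "9030",
    "DORIS_DB": "langfuse",
    "LANGFUSE_ENABLE_BACKGROUND_MIGRATIONS": "false",
    "LANGFUSE_AUTO_DORIS_MIGRATION_DISABLED": "false",
}

def render_dotenv(settings: dict) -> str:
    """Render a settings dict as KEY=VALUE lines, one per setting."""
    return "\n".join(f"{key}={value}" for key, value in settings.items()) + "\n"

print(render_dotenv(env_settings))
```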
+
+
+## Docker Compose Deployment
+
+### Pre-deployment Preparation
+
+Here we provide a compose example that can be started directly. Modify the 
configuration according to your requirements.
+
+### Download docker compose
+
+```shell
+wget https://apache-doris-releases.oss-cn-beijing.aliyuncs.com/extension/docker-langfuse-doris.tar.gz
+tar -xzf docker-langfuse-doris.tar.gz
+```
+
+After extracting the archive, the directory structure of the Compose file and configuration file is as follows:
+```text
+docker-langfuse-doris
+├── docker-compose.yml
+└── doris-config
+    └── fe_custom.conf
+```
+
+### Deployment Steps
+
+### 1. Start compose
+
+```Bash
+docker compose up -d
+```
+
+```Bash
+# Check
+$ docker compose up -d
+[+] Running 9/9
+ ✔ Network docker-langfuse-doris_doris_internal  Created     0.1s
+ ✔ Network docker-langfuse-doris_default         Created     0.1s
+ ✔ Container doris_fe                            Healthy    13.8s
+ ✔ Container langfuse-postgres                   Healthy    13.8s
+ ✔ Container langfuse-redis                      Healthy    13.8s
+ ✔ Container langfuse-minio                      Healthy    13.8s
+ ✔ Container doris_be                            Healthy    54.3s
+ ✔ Container langfuse-worker                     Started    54.8s
+ ✔ Container langfuse-web                        Started
+```
+
+#### 2. Verify Deployment
+
+Check the service status:
+
+When every service reports `Healthy`, the stack has started successfully.
+
+```shell
+$ docker compose ps
+NAME                IMAGE                             COMMAND                  SERVICE           CREATED         STATUS                        PORTS
+doris_be            apache/doris:be-2.1.11            "bash entry_point.sh"    doris_be          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8040->8040/tcp, :::8040->8040/tcp, 0.0.0.0:8060->8060/tcp, :::8060->8060/tcp, 0.0.0.0:9050->9050/tcp, :::9050->9050/tcp, 0.0.0.0:9060->9060/tcp, :::9060->9060/tcp
+doris_fe            apache/doris:fe-2.1.11            "bash init_fe.sh"        doris_fe          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8030->8030/tcp, :::8030->8030/tcp, 0.0.0.0:9010->9010/tcp, :::9010->9010/tcp, 0.0.0.0:9030->9030/tcp, :::9030->9030/tcp
+langfuse-minio      minio/minio                       "sh -c 'mkdir -p /da…"   minio             2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:19090->9000/tcp, :::19090->9000/tcp, 127.0.0.1:19091->9001/tcp
+langfuse-postgres   postgres:latest                   "docker-entrypoint.s…"   postgres          2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:5432->5432/tcp
+langfuse-redis      redis:7                           "docker-entrypoint.s…"   redis             2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:16379->6379/tcp
+langfuse-web        selectdb/langfuse-web:latest      "dumb-init -- ./web/…"   langfuse-web      2 minutes ago   Up About a minute (healthy)   0.0.0.0:13000->3000/tcp, :::13000->3000/tcp
+langfuse-worker     selectdb/langfuse-worker:latest   "dumb-init -- ./work…"   langfuse-worker   2 minutes ago   Up About a minute (healthy)   0.0.0.0:3030->3030/tcp, :::3030->3030/tcp
+```
+
+
+#### 3. Service Initialization
+
+After deployment completes, access and initialize the service as follows:
+
+**Access Langfuse Web Interface**:
+- URL: http://localhost:13000 (the provided compose file maps host port 13000 to the container's port 3000)
+
+**Initialization Steps**:
+1. Open your browser and navigate to http://localhost:13000
+2. Create an administrator account and log in
+3. Create a new organization and project
+4. Obtain the project's API Keys (Public Key and Secret Key)
+5. Configure the authentication information required for SDK integration
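
Langfuse's public API uses HTTP Basic authentication, with the Public Key as the username and the Secret Key as the password. A minimal stdlib sketch for building the header (the key values below are placeholders, not real keys):

```python
import base64

def langfuse_auth_header(public_key: str, secret_key: str) -> dict:
    """Build the HTTP Basic auth header expected by the Langfuse public API."""
    token = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

# Placeholder keys; substitute the keys obtained in step 4 above
headers = langfuse_auth_header("pk-lf-xxxx", "sk-lf-xxxx")
print(headers["Authorization"])
```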
+
+
+## Examples
+
+### Using the Langfuse SDK
+
+```python
+import os
+
+# Instead of: import openai
+from langfuse.openai import OpenAI
+# from langfuse import observe
+
+# Langfuse config (use your project's API keys; host port 13000 comes from the compose mapping)
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:13000"
+
+# Use the drop-in OpenAI client; calls are traced automatically
+client = OpenAI()
+
+# Ask a question
+question = "What are the key features of the Doris observability solution? Please answer concisely."
+print(f"question: {question}")
+
+completion = client.chat.completions.create(
+    model="gpt-4o",
+    messages=[
+        {"role": "user", "content": question}
+    ]
+)
+response = completion.choices[0].message.content
+print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+### Using the LangChain SDK
+
+```python
+import os
+
+from langfuse.langchain import CallbackHandler
+from langchain_openai import ChatOpenAI
+
+# Langfuse config (use your project's API keys; host port 13000 comes from the compose mapping)
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:13000"
+
+# Create your LangChain components (using the OpenAI API)
+llm = ChatOpenAI(model="gpt-4o")
+
+# Ask a question
+question = "What are the key features of the Doris observability solution? Please answer concisely."
+print(f"question: {question}\n")
+
+# Run the chain with Langfuse tracing
+try:
+    # Initialize the Langfuse callback handler
+    langfuse_handler = CallbackHandler()
+    response = llm.invoke(question, config={"callbacks": [langfuse_handler]})
+    print(f"response: {response.content}")
+except Exception as e:
+    print(f"Error during chain execution: {e}")
+```
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+### Using the LlamaIndex SDK
+
+```python
+import os
+
+from langfuse import get_client
+from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
+from llama_index.llms.openai import OpenAI
+
+# Langfuse config (use your project's API keys; host port 13000 comes from the compose mapping)
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:13000"
+
+langfuse = get_client()
+
+# Initialize LlamaIndex instrumentation
+LlamaIndexInstrumentor().instrument()
+
+# Set up the OpenAI LLM with the required model
+llm = OpenAI(model="gpt-4o")
+
+# Ask a question
+question = "What are the key features of the Doris observability solution? Please answer concisely."
+print(f"question: {question}\n")
+
+with langfuse.start_as_current_span(name="llama-index-trace"):
+    response = llm.complete(question)
+    print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_3.png)
diff --git a/versioned_docs/version-4.x/ecosystem/observability/langfuse.md 
b/versioned_docs/version-4.x/ecosystem/observability/langfuse.md
new file mode 100644
index 00000000000..3f010c0f067
--- /dev/null
+++ b/versioned_docs/version-4.x/ecosystem/observability/langfuse.md
@@ -0,0 +1,331 @@
+---
+{
+    "title": "Langfuse on Doris",
+    "language": "en"
+}
+---
+
+# Langfuse on Doris
+
+## About Langfuse
+
+Langfuse is an open-source LLM engineering platform that provides comprehensive observability solutions for large language model applications. It offers the following core features:
+
+- **Tracing**: Complete recording of LLM application call chains and execution flows
+- **Evaluation**: Multi-dimensional model performance evaluation and quality analysis
+- **Prompt Management**: Centralized management and version control of prompt templates
+- **Metrics Monitoring**: Real-time monitoring of application performance, cost, and quality metrics
+
+This document explains how to deploy Langfuse with Apache Doris as the analytics backend, leveraging Doris's powerful OLAP capabilities to process large-scale LLM application data.
+
+
+## System Architecture
+
+The Langfuse on Doris solution uses a microservices architecture with the following core components:
+
+| Component       | Ports      | Description |
+|-----------------|------------|-------------|
+| Langfuse Web    | 3000       | Web interface and API service for user interaction and data ingestion |
+| Langfuse Worker | 3030       | Async task processing for data processing and analytics tasks |
+| PostgreSQL      | 5432       | Transactional data storage for user configuration and metadata |
+| Redis           | 6379       | Cache layer and message queue for improved system response performance |
+| MinIO           | 9000       | Object storage service for raw events and multi-modal attachments |
+| Doris FE        | 9030, 8030 | Doris Frontend: receives user requests, parses and plans queries, and manages metadata and nodes |
+| Doris BE        | 8040, 9050 | Doris Backend: stores data and executes query plans. Data is split into shards and stored with multiple replicas on BE nodes |
+
+:::note
+
+When deploying Apache Doris, choose between the integrated compute-storage architecture and the disaggregated compute-storage architecture based on your hardware environment and business requirements.
+The Dockerized Doris in this guide is not recommended for production; the bundled FE and BE containers are intended for quickly trying out the Langfuse on Doris capabilities.
+
+:::
+
+```mermaid
+flowchart TB
+    User["UI, API, SDKs"]
+    subgraph vpc["VPC"]
+        Web["Web Server<br/>(langfuse/langfuse)"]
+        Worker["Async Worker<br/>(langfuse/worker)"]
+        Postgres["Postgres - OLTP<br/>(Transactional Data)"]
+        Cache["Redis/Valkey<br/>(Cache, Queue)"]
+        Doris["Doris - OLAP<br/>(Observability Data)"]
+        S3["S3 / Blob Storage<br/>(Raw events, multi-modal attachments)"]
+    end
+    LLM["LLM API/Gateway<br/>(optional)"]
+
+    User --> Web
+    Web --> S3
+    Web --> Postgres
+    Web --> Cache
+    Web --> Doris
+    Web -.->|"optional for playground"| LLM
+
+    Cache --> Worker
+    Worker --> Doris
+    Worker --> Postgres
+    Worker --> S3
+    Worker -.->|"optional for evals"| LLM
+```
+
+## Deployment Requirements
+
+### Software Environment
+
+| Component | Version | Description |
+|------|----------|------|
+| Docker | 20.0+ | Container runtime environment |
+| Docker Compose | 2.0+ | Container orchestration tool |
+| Apache Doris | 2.1.10+ | Analytics database, requires separate deployment |
+
+### Hardware Resources
+
+| Resource Type | Minimum | Recommended | Description |
+|----------|----------|----------|------|
+| Memory | 8GB | 16GB+ | Supports multi-service concurrent operation |
+| Disk | 50GB | 100GB+ | Storage for container data and logs |
+| Network | 1Gbps | 10Gbps | Ensures data transfer performance |
+
+### Prerequisites
+
+1. **Doris Cluster Preparation**
+    - Ensure the Doris cluster is running properly with stable performance
+    - Verify that the FE HTTP port (default 8030) and query port (default 9030) are reachable over the network
+    - Langfuse automatically creates the required database and tables in Doris on startup
+
+2. **Network Connectivity**
+    - The deployment environment can access Docker Hub to pull images
+    - Langfuse services can reach the relevant ports of the Doris cluster
+    - Clients can reach the Langfuse Web service port
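
The connectivity checks above can be scripted before deployment. A minimal sketch (the host and ports are the defaults from this guide; adjust to your Doris cluster):

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Defaults from this guide; replace 127.0.0.1 with your Doris FE host
for name, port in [("FE HTTP", 8030), ("FE query", 9030)]:
    ok = port_reachable("127.0.0.1", port)
    print(f"{name} port {port}: {'reachable' if ok else 'NOT reachable'}")
```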
+
+:::tip Deployment Recommendation
+It is recommended to deploy the Langfuse service components (Web, Worker, Redis, PostgreSQL) with Docker, but to deploy Doris separately for better performance and stability. See the official Doris documentation for a detailed deployment guide.
+:::
+
+## Configuration Parameters
+
+Langfuse requires several environment variables for each component to operate properly:
+
+### Doris Analytics Backend Configuration
+
+| Parameter | Example Value | Description |
+|---------|--------|------|
+| `LANGFUSE_ANALYTICS_BACKEND` | `doris` | Specifies Doris as the analytics backend |
+| `DORIS_FE_HTTP_URL` | `http://localhost:8030` | Doris FE HTTP service address |
+| `DORIS_FE_QUERY_PORT` | `9030` | Doris FE query port |
+| `DORIS_DB` | `langfuse` | Doris database name |
+| `DORIS_USER` | `root` | Doris username |
+| `DORIS_PASSWORD` | `123456` | Doris password |
+| `DORIS_MAX_OPEN_CONNECTIONS` | `100` | Maximum database connections |
+| `DORIS_REQUEST_TIMEOUT_MS` | `300000` | Request timeout in milliseconds |
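
Taken together, the Doris settings above map onto the `environment` of the web and worker services in `docker-compose.yml` roughly like this. The values are the example values from the table, and the `doris_fe` hostname is an assumption based on the container name in the provided compose file; this is a sketch, not the full compose file:

```yaml
# Excerpt: Doris analytics backend settings for langfuse-web / langfuse-worker
environment:
  LANGFUSE_ANALYTICS_BACKEND: "doris"
  DORIS_FE_HTTP_URL: "http://doris_fe:8030"   # container-network address of the FE
  DORIS_FE_QUERY_PORT: "9030"
  DORIS_DB: "langfuse"
  DORIS_USER: "root"
  DORIS_PASSWORD: "123456"
  DORIS_MAX_OPEN_CONNECTIONS: "100"
  DORIS_REQUEST_TIMEOUT_MS: "300000"
```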
+
+### Basic Service Configuration
+
+| Parameter | Example Value | Description |
+|---------|--------|------|
+| `DATABASE_URL` | `postgresql://postgres:postgres@langfuse-postgres:5432/postgres` | PostgreSQL database connection URL |
+| `NEXTAUTH_SECRET` | `your-debug-secret-key-here-must-be-long-enough` | NextAuth authentication key for session encryption |
+| `SALT` | `your-super-secret-salt-with-at-least-32-characters-for-encryption` | Data encryption salt (at least 32 characters) |
+| `ENCRYPTION_KEY` | `0000000000000000000000000000000000000000000000000000000000000000` | Data encryption key (64 hex characters) |
+| `NEXTAUTH_URL` | `http://localhost:3000` | Langfuse Web service address |
+| `TZ` | `UTC` | System timezone setting |
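
The length requirements for `SALT` and `ENCRYPTION_KEY` are easy to get wrong. A small stdlib sketch that generates and sanity-checks values of the documented shape (a convenience helper, not part of Langfuse; `openssl rand -hex 32` is an equivalent way to produce the key):

```python
import secrets
import string

# Generate candidate secrets of the documented shape
salt = secrets.token_urlsafe(32)        # well over the 32-character minimum
encryption_key = secrets.token_hex(32)  # exactly 64 hex characters

def validate(salt: str, key: str) -> None:
    assert len(salt) >= 32, "SALT must be at least 32 characters"
    assert len(key) == 64 and all(c in string.hexdigits for c in key), \
        "ENCRYPTION_KEY must be 64 hex characters"

validate(salt, encryption_key)
print(f"SALT={salt}")
print(f"ENCRYPTION_KEY={encryption_key}")
```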
+
+### Redis Cache Configuration
+
+| Parameter | Example Value              | Description |
+|---------|------------------|------|
+| `REDIS_HOST` | `langfuse-redis` | Redis service host address |
+| `REDIS_PORT` | `6379`           | Redis service port |
+| `REDIS_AUTH` | `myredissecret`  | Redis authentication password |
+| `REDIS_TLS_ENABLED` | `false`          | Whether to enable TLS encryption |
+| `REDIS_TLS_CA` | `-`              | TLS CA certificate path |
+| `REDIS_TLS_CERT` | `-`              | TLS client certificate path |
+| `REDIS_TLS_KEY` | `-`              | TLS private key path |
+
+### Data Migration Configuration
+
+| Parameter | Example Value | Description |
+|---------|--------|------|
+| `LANGFUSE_ENABLE_BACKGROUND_MIGRATIONS` | `false` | Enables background migrations; must be set to `false` when using Doris |
+| `LANGFUSE_AUTO_DORIS_MIGRATION_DISABLED` | `false` | When `false`, automatic Doris schema migration runs at startup; set to `true` to disable it |
+
+
+## Docker Compose Deployment
+
+### Pre-deployment Preparation
+
+The Compose bundle below can be started as-is; adjust the configuration to match your environment.
+
+#### Download and Extract the Compose Bundle
+
+```shell
+wget https://apache-doris-releases.oss-cn-beijing.aliyuncs.com/extension/docker-langfuse-doris.tar.gz
+tar -xzf docker-langfuse-doris.tar.gz
+cd docker-langfuse-doris
+```
+
+The extracted directory contains the Compose file and the Doris configuration file:
+```text
+docker-langfuse-doris
+├── docker-compose.yml
+└── doris-config
+    └── fe_custom.conf
+```
+
+### Deployment Steps
+
+#### 1. Start Compose
+
+```shell
+docker compose up -d
+```
+
+Expected output:
+
+```text
+[+] Running 9/9
+ ✔ Network docker-langfuse-doris_doris_internal  Created    0.1s
+ ✔ Network docker-langfuse-doris_default         Created    0.1s
+ ✔ Container doris_fe                            Healthy   13.8s
+ ✔ Container langfuse-postgres                   Healthy   13.8s
+ ✔ Container langfuse-redis                      Healthy   13.8s
+ ✔ Container langfuse-minio                      Healthy   13.8s
+ ✔ Container doris_be                            Healthy   54.3s
+ ✔ Container langfuse-worker                     Started   54.8s
+ ✔ Container langfuse-web                        Started
+```
+
+#### 2. Verify Deployment
+
+Check the service status:
+
+When every service reports `Healthy`, the stack has started successfully.
+
+```shell
+$ docker compose ps
+NAME                IMAGE                             COMMAND                  SERVICE           CREATED         STATUS                        PORTS
+doris_be            apache/doris:be-2.1.11            "bash entry_point.sh"    doris_be          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8040->8040/tcp, :::8040->8040/tcp, 0.0.0.0:8060->8060/tcp, :::8060->8060/tcp, 0.0.0.0:9050->9050/tcp, :::9050->9050/tcp, 0.0.0.0:9060->9060/tcp, :::9060->9060/tcp
+doris_fe            apache/doris:fe-2.1.11            "bash init_fe.sh"        doris_fe          2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:8030->8030/tcp, :::8030->8030/tcp, 0.0.0.0:9010->9010/tcp, :::9010->9010/tcp, 0.0.0.0:9030->9030/tcp, :::9030->9030/tcp
+langfuse-minio      minio/minio                       "sh -c 'mkdir -p /da…"   minio             2 minutes ago   Up 2 minutes (healthy)        0.0.0.0:19090->9000/tcp, :::19090->9000/tcp, 127.0.0.1:19091->9001/tcp
+langfuse-postgres   postgres:latest                   "docker-entrypoint.s…"   postgres          2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:5432->5432/tcp
+langfuse-redis      redis:7                           "docker-entrypoint.s…"   redis             2 minutes ago   Up 2 minutes (healthy)        127.0.0.1:16379->6379/tcp
+langfuse-web        selectdb/langfuse-web:latest      "dumb-init -- ./web/…"   langfuse-web      2 minutes ago   Up About a minute (healthy)   0.0.0.0:13000->3000/tcp, :::13000->3000/tcp
+langfuse-worker     selectdb/langfuse-worker:latest   "dumb-init -- ./work…"   langfuse-worker   2 minutes ago   Up About a minute (healthy)   0.0.0.0:3030->3030/tcp, :::3030->3030/tcp
+```
+
+
+#### 3. Service Initialization
+
+After deployment completes, access and initialize the service as follows:
+
+**Access Langfuse Web Interface**:
+- URL: http://localhost:13000 (the provided compose file maps host port 13000 to the container's port 3000)
+
+**Initialization Steps**:
+1. Open your browser and navigate to http://localhost:13000
+2. Create an administrator account and log in
+3. Create a new organization and project
+4. Obtain the project's API Keys (Public Key and Secret Key)
+5. Configure the authentication information required for SDK integration
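
Langfuse's public API uses HTTP Basic authentication, with the Public Key as the username and the Secret Key as the password. A minimal stdlib sketch for building the header (the key values below are placeholders, not real keys):

```python
import base64

def langfuse_auth_header(public_key: str, secret_key: str) -> dict:
    """Build the HTTP Basic auth header expected by the Langfuse public API."""
    token = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

# Placeholder keys; substitute the keys obtained in step 4 above
headers = langfuse_auth_header("pk-lf-xxxx", "sk-lf-xxxx")
print(headers["Authorization"])
```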
+
+
+## Examples
+
+### Using the Langfuse SDK
+
+```python
+import os
+
+# Instead of: import openai
+from langfuse.openai import OpenAI
+# from langfuse import observe
+
+# Langfuse config (use your project's API keys; host port 13000 comes from the compose mapping)
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:13000"
+
+# Use the drop-in OpenAI client; calls are traced automatically
+client = OpenAI()
+
+# Ask a question
+question = "What are the key features of the Doris observability solution? Please answer concisely."
+print(f"question: {question}")
+
+completion = client.chat.completions.create(
+    model="gpt-4o",
+    messages=[
+        {"role": "user", "content": question}
+    ]
+)
+response = completion.choices[0].message.content
+print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+### Using the LangChain SDK
+
+```python
+import os
+
+from langfuse.langchain import CallbackHandler
+from langchain_openai import ChatOpenAI
+
+# Langfuse config (use your project's API keys; host port 13000 comes from the compose mapping)
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:13000"
+
+# Create your LangChain components (using the OpenAI API)
+llm = ChatOpenAI(model="gpt-4o")
+
+# Ask a question
+question = "What are the key features of the Doris observability solution? Please answer concisely."
+print(f"question: {question}\n")
+
+# Run the chain with Langfuse tracing
+try:
+    # Initialize the Langfuse callback handler
+    langfuse_handler = CallbackHandler()
+    response = llm.invoke(question, config={"callbacks": [langfuse_handler]})
+    print(f"response: {response.content}")
+except Exception as e:
+    print(f"Error during chain execution: {e}")
+```
+![](/images/ecomsystem/langfuse/langfuse_2.png)
+
+### Using the LlamaIndex SDK
+
+```python
+import os
+
+from langfuse import get_client
+from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
+from llama_index.llms.openai import OpenAI
+
+# Langfuse config (use your project's API keys; host port 13000 comes from the compose mapping)
+os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-******-******"
+os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-******-******"
+os.environ["LANGFUSE_HOST"] = "http://localhost:13000"
+
+langfuse = get_client()
+
+# Initialize LlamaIndex instrumentation
+LlamaIndexInstrumentor().instrument()
+
+# Set up the OpenAI LLM with the required model
+llm = OpenAI(model="gpt-4o")
+
+# Ask a question
+question = "What are the key features of the Doris observability solution? Please answer concisely."
+print(f"question: {question}\n")
+
+with langfuse.start_as_current_span(name="llama-index-trace"):
+    response = llm.complete(question)
+    print(f"response: {response}")
+```
+
+![](/images/ecomsystem/langfuse/langfuse_3.png)
diff --git a/versioned_sidebars/version-2.1-sidebars.json 
b/versioned_sidebars/version-2.1-sidebars.json
index f096907aab9..18f69becac7 100644
--- a/versioned_sidebars/version-2.1-sidebars.json
+++ b/versioned_sidebars/version-2.1-sidebars.json
@@ -470,6 +470,7 @@
                                 "ecosystem/observability/opentelemetry",
                                 "ecosystem/observability/fluentbit",
                                 "ecosystem/observability/loongcollector",
+                                "ecosystem/observability/langfuse",
                                 "ecosystem/observability/vector"
                             ]
                         }
@@ -884,6 +885,7 @@
                         "ecosystem/observability/opentelemetry",
                         "ecosystem/observability/fluentbit",
                         "ecosystem/observability/loongcollector",
+                        "ecosystem/observability/langfuse",
                         "ecosystem/observability/vector"
                     ]
                 },
diff --git a/versioned_sidebars/version-3.x-sidebars.json 
b/versioned_sidebars/version-3.x-sidebars.json
index 281d39d207f..03e7baaea4c 100644
--- a/versioned_sidebars/version-3.x-sidebars.json
+++ b/versioned_sidebars/version-3.x-sidebars.json
@@ -510,6 +510,7 @@
                                 "ecosystem/observability/opentelemetry",
                                 "ecosystem/observability/fluentbit",
                                 "ecosystem/observability/loongcollector",
+                                "ecosystem/observability/langfuse",
                                 "ecosystem/observability/vector"
                             ]
                         }
@@ -956,6 +957,7 @@
                         "ecosystem/observability/opentelemetry",
                         "ecosystem/observability/fluentbit",
                         "ecosystem/observability/loongcollector",
+                        "ecosystem/observability/langfuse",
                         "ecosystem/observability/vector"
                     ]
                 },
diff --git a/versioned_sidebars/version-4.x-sidebars.json 
b/versioned_sidebars/version-4.x-sidebars.json
index 75079f02983..621bb5154f9 100644
--- a/versioned_sidebars/version-4.x-sidebars.json
+++ b/versioned_sidebars/version-4.x-sidebars.json
@@ -555,6 +555,7 @@
                                 "ecosystem/observability/opentelemetry",
                                 "ecosystem/observability/fluentbit",
                                 "ecosystem/observability/loongcollector",
+                                "ecosystem/observability/langfuse",
                                 "ecosystem/observability/vector"
                             ]
                         }
@@ -1013,6 +1014,7 @@
                         "ecosystem/observability/opentelemetry",
                         "ecosystem/observability/fluentbit",
                         "ecosystem/observability/loongcollector",
+                        "ecosystem/observability/langfuse",
                         "ecosystem/observability/vector"
                     ]
                 },


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
