featzhang commented on code in PR #27490:
URL: https://github.com/apache/flink/pull/27490#discussion_r2744377376
##########
docs/content.zh/docs/connectors/models/triton.md:
##########
@@ -0,0 +1,482 @@
+---
+title: "Triton"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements. See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership. The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License. You may obtain a copy of the License at
+  http://www.apache.org/licenses/LICENSE-2.0
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied. See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Triton
+
+The Triton Model Function allows Flink SQL to call [NVIDIA Triton Inference Server](https://github.com/triton-inference-server/server) for real-time model inference tasks.
+
+## Overview
+
+The function supports calling a remote Triton Inference Server via Flink SQL for prediction/inference tasks. Triton Inference Server is a high-performance inference serving solution that supports multiple machine learning frameworks, including TensorFlow, PyTorch, ONNX, and more.
+
+Key features:
+* **High Performance**: Optimized for low-latency and high-throughput inference
+* **Multi-Framework Support**: Works with models from various ML frameworks
+* **Asynchronous Processing**: Non-blocking inference requests for better resource utilization
+* **Flexible Configuration**: Comprehensive configuration options for different use cases
+* **Resource Management**: Efficient HTTP client pooling and automatic resource cleanup
+
+## Usage Examples
+
+The following example creates a Triton model for text classification and uses it to analyze sentiment in movie reviews.
+
+First, create the Triton model with the following SQL statement:
+
+```sql
+CREATE MODEL triton_sentiment_classifier
+INPUT (`input` STRING)
+OUTPUT (`output` STRING)
+WITH (
+    'provider' = 'triton',
+    'endpoint' = 'http://localhost:8000/v2/models',
+    'model-name' = 'text-classification',
+    'model-version' = '1',
+    'timeout' = '10000',
+    'max-retries' = '3'
+);
+```
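+
+The `endpoint`, `model-name`, and `model-version` options above identify a model served through Triton's standard KServe-v2 HTTP API. As a rough illustration of what that API looks like (this is not the connector's code; the tensor name, shape, and datatype below are assumptions that depend on how the model is actually configured on the server), an equivalent manual request could be issued like this:
+
+```python
+import requests
+
+# Hypothetical direct call to the Triton HTTP inference endpoint that the
+# model definition above points at. The connector issues a comparable request
+# internally; the exact payload it builds is not shown in this excerpt.
+url = "http://localhost:8000/v2/models/text-classification/versions/1/infer"
+payload = {
+    "inputs": [
+        {
+            "name": "input",      # assumed tensor name (mirrors the SQL input column)
+            "shape": [1, 1],      # assumed shape for a single string sample
+            "datatype": "BYTES",  # Triton's datatype for string data
+            "data": ["This movie was absolutely fantastic!"],
+        }
+    ]
+}
+response = requests.post(url, json=payload, timeout=10)
+response.raise_for_status()
+print(response.json()["outputs"])
+```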
+
+Suppose the following data is stored in a table named `movie_reviews`, and the prediction result is to be stored in a table named `classified_reviews`:
+
+```sql
+CREATE TEMPORARY VIEW movie_reviews(id, movie_name, user_review, actual_sentiment)
+AS VALUES
+    (1, 'Great Movie', 'This movie was absolutely fantastic! Great acting and storyline.', 'positive'),
+    (2, 'Boring Film', 'I fell asleep halfway through. Very disappointing.', 'negative'),
+    (3, 'Average Show', 'It was okay, nothing special but not terrible either.', 'neutral');
+
+CREATE TEMPORARY TABLE classified_reviews(
+    id BIGINT,
+    movie_name VARCHAR,
+    predicted_sentiment VARCHAR,
+    actual_sentiment VARCHAR
+) WITH (
+    'connector' = 'print'
+);
+```
+
+Then the following SQL statement can be used to classify sentiment for movie reviews:
+
+```sql
+INSERT INTO classified_reviews
+SELECT id, movie_name, output AS predicted_sentiment, actual_sentiment
+FROM ML_PREDICT(
+    TABLE movie_reviews,
+    MODEL triton_sentiment_classifier,
+    DESCRIPTOR(user_review)
+);
+```
+
+### Advanced Configuration Example
+
+For production environments with authentication and custom headers:
+
+```sql
+CREATE MODEL triton_advanced_model
+INPUT (`input` STRING)
+OUTPUT (`output` STRING)
+WITH (
+    'provider' = 'triton',
+    'endpoint' = 'https://triton.example.com/v2/models',
Review Comment:
   already supports authentication via headers (see the auth-token example below)
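
At the HTTP level, the header-based authentication referred to here amounts to attaching credentials as request headers on the calls made to the Triton endpoint. A minimal sketch of that idea, independent of the connector (the model name, header name, and token value are placeholders, shown with the `requests` library):

```python
import requests

# Placeholder bearer token: in a real deployment this would be whatever the
# gateway in front of Triton expects, supplied via the model's header/auth options.
headers = {"Authorization": "Bearer <token>"}

# Readiness check against the secured endpoint, authenticated purely via headers.
response = requests.get(
    "https://triton.example.com/v2/models/text-classification/ready",
    headers=headers,
    timeout=10,
)
print(response.status_code)  # 200 once the model is loaded and the token is accepted
```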
