Faakhir30 opened a new issue, #41702:
URL: https://github.com/apache/airflow/issues/41702
### Description
ONNX (Open Neural Network Exchange) provides cross-platform compatibility for
machine learning models in a standardized format. An operator that runs
inference with ONNX models would give us direct model invocation from a DAG
and make deploying such models more straightforward.
This can of course be done today with a PythonOperator, since onnxruntime can
be called from any Python callable, but it could also be built into Airflow to
minimize boilerplate. A simple PythonOperator-based version looks like this:
```python
import numpy as np
import onnxruntime as ort
from airflow import DAG
from airflow.operators.python import PythonOperator
from datetime import datetime


def run_onnx_inference():
    # Load the ONNX model
    model_path = '/path/to/your/model.onnx'
    session = ort.InferenceSession(model_path)

    # Prepare input data as a numpy array matching the model's expected shape/dtype
    input_name = session.get_inputs()[0].name
    input_data = np.zeros((1, 3, 224, 224), dtype=np.float32)  # placeholder input

    # Run inference (None = return all outputs)
    result = session.run(None, {input_name: input_data})
    print(result)


# Define the DAG
with DAG(
    dag_id='onnx_inference_dag',
    start_date=datetime(2023, 1, 1),
    schedule='@once',
) as dag:
    # Define the task
    inference_task = PythonOperator(
        task_id='onnx_inference_task',
        python_callable=run_onnx_inference,
    )
```
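If this were built into Airflow itself (for example as part of a provider), a dedicated
operator could hide the session handling behind a few parameters. Below is a rough,
hypothetical sketch of what that might look like; the `OnnxInferenceOperator` name, its
parameters, and the XCom behavior are assumptions for illustration, not an existing
Airflow API:
```python
# Hypothetical sketch only: OnnxInferenceOperator, its parameters, and the
# XCom behavior are assumptions for illustration, not an existing Airflow API.
import numpy as np
import onnxruntime as ort
from airflow.models import BaseOperator


class OnnxInferenceOperator(BaseOperator):
    """Run inference with an ONNX model and return the outputs (pushed to XCom)."""

    template_fields = ("model_path",)

    def __init__(self, *, model_path, input_fn, output_names=None, **kwargs):
        super().__init__(**kwargs)
        self.model_path = model_path
        # Callable taking the task context and returning {input_name: np.ndarray}
        self.input_fn = input_fn
        # None means "return all model outputs"
        self.output_names = output_names

    def execute(self, context):
        session = ort.InferenceSession(self.model_path)
        inputs = self.input_fn(context)
        outputs = session.run(self.output_names, inputs)
        # Convert to plain lists so the return value is JSON-serializable for XCom
        return [o.tolist() if isinstance(o, np.ndarray) else o for o in outputs]
```
In a DAG this could then be used as
`OnnxInferenceOperator(task_id='onnx_inference', model_path='/path/to/your/model.onnx', input_fn=build_inputs)`,
with `input_fn` supplying the numpy input arrays.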
Looking forward to any suggestions.
### Use case/motivation
Direct support for ONNX in Airflow's DAG-based orchestration could manage the
entire lifecycle of data processing and model inference in one place,
providing a more cohesive and manageable workflow.
### Related issues
_No response_
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)