tlopex opened a new pull request, #18313:
URL: https://github.com/apache/tvm/pull/18313
This PR allows calling Python functions directly from Relax IR, enabling
integration between Relax computations and Python/PyTorch operations.
### Usage Example
```python
from tvm.script import ir as I
from tvm.script import relax as R
# BasePyModule is introduced by this PR; its exact import path is assumed.

@I.ir_module
class MyModule(BasePyModule):
    @I.pyfunc
    def pytorch_add(self, x, y):
        return x + y

    @R.function
    def compute(
        x: R.Tensor((5,), "float32"), y: R.Tensor((5,), "float32")
    ) -> R.Tensor((5,), "float32"):
        result = R.call_py_func(
            "pytorch_add", (x, y), out_sinfo=R.Tensor((5,), "float32")
        )
        return result
```
**This PR** represents the final milestone in the Relax-Python integration
design and completes the full feature set for Python/Relax interoperability.
The complete Relax-Python integration ecosystem is built on the following
key design principles:
### Cross-level Calls
- **Two-way interoperability**: Python functions can invoke Relax/TIR/packed
functions, and Relax functions can invoke Python functions via `R.call_py_func`
- **Seamless data conversion**: DLPack enables efficient conversion between
TVM Tensors and PyTorch Tensors with minimal overhead
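
To make the DLPack path concrete, here is a minimal round-trip sketch. It
uses the public `tvm.nd.from_dlpack`/`NDArray.to_dlpack` and
`torch.utils.dlpack` APIs, not the PR's internal conversion code:

```python
import torch
from torch.utils import dlpack as torch_dlpack
import tvm

# PyTorch -> TVM: export a DLPack capsule and wrap it as a TVM NDArray
# (shares storage, no copy).
t = torch.arange(5, dtype=torch.float32)
tvm_tensor = tvm.nd.from_dlpack(torch_dlpack.to_dlpack(t))

# TVM -> PyTorch: TVM NDArrays export DLPack capsules as well.
back = torch_dlpack.from_dlpack(tvm_tensor.to_dlpack())
assert torch.equal(t, back)
```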
### Just-in-time (JIT) Compilation
- **Delayed compilation**: TIR and Relax functions are compiled only when
the IRModule is instantiated (see the sketch after this list)
- **Flexible integration**: Allows late-stage modifications and seamless
integration with the Python runtime
- **Relax VM execution**: Compiled Relax functions are executed using a
Relax VM created at instantiation time
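
A hypothetical end-to-end sketch of this flow, reusing `MyModule` from the
usage example above; the `BasePyModule` constructor signature shown here is
an assumption, not confirmed API:

```python
import torch
import tvm

# Hypothetical: instantiation JIT-compiles the module's TIR/Relax functions
# and creates the Relax VM (constructor signature assumed).
mod = MyModule(device=tvm.cpu())

# Calling the Relax function runs it on the VM, which dispatches
# `pytorch_add` back into Python via R.call_py_func.
out = mod.compute(torch.rand(5), torch.rand(5))
```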
### Conversion between Relax and Python Functions
- **IRModule printer**: Converts Relax functions into executable Python code
for debugging and deployment
- **Operator mapping**: High-level Relax operators (e.g., `R.nn.relu`) are
mapped to corresponding PyTorch APIs (e.g., `F.relu`), as illustrated after
this list
- **Multi-stage conversion**: Can happen at any stage of compilation, from
early Relax functions to fully lowered TIR modules
- **DLPack integration**: Handles `call_tir` and `call_dps_packed` by
converting PyTorch tensors to/from DLPack format
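
For intuition, the operator mapping boils down to a lookup table along the
lines of the following; this is an illustrative sketch, and the key format
and table contents are assumptions rather than the PR's actual implementation:

```python
import torch
import torch.nn.functional as F

# Illustrative: Relax operator name -> PyTorch callable emitted when a
# Relax function is printed as executable Python code.
RELAX_TO_TORCH = {
    "relax.nn.relu": F.relu,
    "relax.add": torch.add,
    "relax.matmul": torch.matmul,
}
```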