mshr-h commented on code in PR #18429:
URL: https://github.com/apache/tvm/pull/18429#discussion_r2513619153
##########
python/tvm/relax/frontend/torch/exported_program_translator.py:
##########
@@ -1149,14 +1170,22 @@ def from_exported_program(
from torch import fx # type: ignore
# Create input variables.
-        parameter_buffer_constant_vars, user_input_vars = self.create_input_vars(exported_program)
+ (
+ parameter_buffer_constant_vars,
+ user_input_vars,
+ range_constraints,
+ ) = self.create_input_vars(exported_program)
inputs_vars = user_input_vars.copy()
inputs_vars.update(parameter_buffer_constant_vars)
# Initialize the block builder with a function and a dataflow block.
self.block_builder = relax.BlockBuilder()
func_name = "main"
        func_attrs = {"num_input": len(user_input_vars)} if keep_params_as_input else None
+ if range_constraints:
+ if func_attrs is None:
+ func_attrs = {}
+ func_attrs["shape_var_constraints"] = range_constraints
Review Comment:
Please use `tir_var_upper_bound` to annotate the upper bound.
I grepped the TVM codebase and found no lower-bound annotation, so I don't
think we need to keep lower bounds at the moment. If we have a real use case
for them, it's fine to keep them in the Relax module.
https://github.com/apache/tvm/blob/main/src/relax/transform/static_plan_block_memory.cc#L62-L66
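A minimal sketch of what the suggested change could look like. The `(lo, hi)` tuple format for `range_constraints` and the helper name `to_func_attrs` are assumptions for illustration; only the `tir_var_upper_bound` attribute name comes from the linked `static_plan_block_memory.cc` code:

```python
# Hypothetical sketch of the reviewer's suggestion: fold the upper bounds
# from the exported program's range constraints into the
# `tir_var_upper_bound` function attribute, and drop lower bounds.
# The (lo, hi) tuple format for `range_constraints` is an assumption.

def to_func_attrs(range_constraints, func_attrs=None):
    """Return func_attrs with `tir_var_upper_bound` set from upper bounds."""
    upper_bounds = {
        name: hi
        for name, (lo, hi) in range_constraints.items()
        if hi is not None  # keep only variables with a known upper bound
    }
    if upper_bounds:
        func_attrs = dict(func_attrs or {})
        func_attrs["tir_var_upper_bound"] = upper_bounds
    return func_attrs
```

With this shape, a function with no bounded shape variables keeps its original (possibly `None`) attribute dict, matching the existing `keep_params_as_input` behavior in the diff above.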
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]