Mousius commented on code in PR #12789:
URL: https://github.com/apache/tvm/pull/12789#discussion_r999171602
##########
python/tvm/micro/model_library_format.py:
##########
@@ -277,6 +294,37 @@ def _create_empty_entry(target_device_type):
main_func_metadata.io_sizes[target]
)
+        # Now, we also add the information about the size of each input and output of the main
+        # function (in bytes)
+        input_dict = {}
+        for input_param in main_func_metadata.relay_primfuncs[target].params:
+            if hasattr(input_param, "checked_type"):
+                input_dict[input_param.name_hint] = int(
+                    _shape_to_size(input_param.checked_type.shape, input_param.checked_type.dtype)
+                )
+            else:
+                # TODO: maybe fill checked_type here?
+                input_dict[input_param.name_hint] = 0
+        target_main_entries[int(target.kind.device_type)]["inputs"] = input_dict
+
+        output_dict = {}
+        # For output, we dont have the name of the output, so we enumerate them
+        if isinstance(main_func_metadata.relay_primfuncs[target].ret_type, tvm.ir.type.TupleType):
+            for i, output_type in enumerate(
+                main_func_metadata.relay_primfuncs[target].ret_type.fields
+            ):
+                if hasattr(output_type, "shape"):
+                    output_dict[i] = int(_shape_to_size(output_type.shape, output_type.dtype))
+                else:
+                    output_dict[i] = 0
+        else:
+            output_type = main_func_metadata.relay_primfuncs[target].ret_type
+            if hasattr(output_type, "shape"):
+                output_dict[0] = int(_shape_to_size(output_type.shape, output_type.dtype))
+            else:
+                output_dict[0] = 0
Review Comment:
This logic is duplicated between the input and output paths; we can replace it with a function:
```python
def _create_type_metadata(input_type):
return {
"size": int(
_shape_to_size(input_type.shape, input_type.dtype)
),
"dtype": str(input_type.dtype)
}
```
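As a sketch of how that helper could be applied end to end, here is a self-contained, hypothetical version. The `_TensorType` stub and the simplified `_shape_to_size` below are stand-ins for the real TVM types and helper in `model_library_format.py`, so the byte arithmetic and names are assumptions for illustration only:

```python
import re

class _TensorType:
    """Stand-in for a TVM TensorType: just a shape tuple and a dtype string."""
    def __init__(self, shape, dtype):
        self.shape, self.dtype = shape, dtype

def _shape_to_size(shape, dtype):
    # Simplified stand-in for TVM's _shape_to_size:
    # bytes = product(shape) * (bit width of dtype / 8)
    bits = int(re.search(r"(\d+)", dtype).group(1))
    size = max(bits // 8, 1)
    for dim in shape:
        size *= int(dim)
    return size

def _create_type_metadata(input_type):
    # The suggested shared helper, usable for both inputs and outputs.
    return {
        "size": int(_shape_to_size(input_type.shape, input_type.dtype)),
        "dtype": str(input_type.dtype),
    }

a = _create_type_metadata(_TensorType((1, 2), "uint8"))
b = _create_type_metadata(_TensorType((1, 2), "float32"))
print(a, b)  # sizes of 2 and 8 bytes, matching the test expectations below
```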
##########
python/tvm/micro/model_library_format.py:
##########
@@ -277,6 +294,37 @@ def _create_empty_entry(target_device_type):
main_func_metadata.io_sizes[target]
)
+        # Now, we also add the information about the size of each input and output of the main
+        # function (in bytes)
+        input_dict = {}
+        for input_param in main_func_metadata.relay_primfuncs[target].params:
+            if hasattr(input_param, "checked_type"):
+                input_dict[input_param.name_hint] = int(
+                    _shape_to_size(input_param.checked_type.shape, input_param.checked_type.dtype)
+                )
+            else:
+                # TODO: maybe fill checked_type here?
+                input_dict[input_param.name_hint] = 0
+        target_main_entries[int(target.kind.device_type)]["inputs"] = input_dict
+
+        output_dict = {}
+        # For output, we dont have the name of the output, so we enumerate them
+        if isinstance(main_func_metadata.relay_primfuncs[target].ret_type, tvm.ir.type.TupleType):
Review Comment:
Given that tuples can be nested more deeply than this handles, should we use a recursive function to generate the outputs? We could re-purpose this function (which is no longer used):
https://github.com/apache/tvm/blob/main/python/tvm/micro/model_library_format.py#L293
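A self-contained sketch of what such a recursive walk over nested tuple outputs could look like. The `_TensorType`/`_TupleType` stubs and the simplified `_shape_to_size` are stand-ins for the real TVM types (`tvm.ir.type.TupleType`, etc.) and helper, so all names here are assumptions:

```python
import re

class _TensorType:  # stand-in for a TVM TensorType
    def __init__(self, shape, dtype):
        self.shape, self.dtype = shape, dtype

class _TupleType:  # stand-in for tvm.ir.type.TupleType
    def __init__(self, fields):
        self.fields = fields

def _shape_to_size(shape, dtype):
    # Simplified stand-in: bytes = product(shape) * (bit width / 8).
    bits = int(re.search(r"(\d+)", dtype).group(1))
    size = max(bits // 8, 1)
    for dim in shape:
        size *= int(dim)
    return size

def _flatten_outputs(ret_type, out=None, index=0):
    """Recursively walk possibly-nested tuple outputs, emitting one
    enumerated entry per leaf tensor."""
    if out is None:
        out = {}
    if isinstance(ret_type, _TupleType):
        # Recurse into each field, threading the running output index through.
        for field in ret_type.fields:
            _, index = _flatten_outputs(field, out, index)
        return out, index
    out[str(index)] = int(_shape_to_size(ret_type.shape, ret_type.dtype))
    return out, index + 1

# A tuple nested one level deeper than the non-recursive code handles.
nested = _TupleType([
    _TensorType((1, 2), "float32"),
    _TupleType([_TensorType((4,), "uint8")]),
])
outputs, _ = _flatten_outputs(nested)
print(outputs)  # {"0": 8, "1": 4}
```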
##########
tests/python/unittest/test_micro_model_library_format.py:
##########
@@ -208,7 +208,12 @@ def @main(%a : Tensor[(1, 2), uint8], %b : Tensor[(1, 2), float32], %c : Tensor[
{
"constants_size_bytes": json_constants_size_bytes,
"device": 1,
+ "inputs": {
+ "a": 2,
+ "b": 8,
+ },
Review Comment:
For future-proofing, it'd be good to record more than just the size here, e.g. the dtype as well?
```suggestion
"inputs": {
"a": {"dtype": "uint8", "size_in_bytes": 2},
"b": {"dtype": "float32", "size_in_bytes": 8},
},
```
##########
python/tvm/micro/model_library_format.py:
##########
@@ -277,6 +294,37 @@ def _create_empty_entry(target_device_type):
main_func_metadata.io_sizes[target]
)
+        # Now, we also add the information about the size of each input and output of the main
+        # function (in bytes)
+        input_dict = {}
+        for input_param in main_func_metadata.relay_primfuncs[target].params:
+            if hasattr(input_param, "checked_type"):
Review Comment:
I don't see a test case for when this is not set? How do we reproduce it?
##########
tests/python/unittest/test_micro_model_library_format.py:
##########
@@ -208,7 +208,12 @@ def @main(%a : Tensor[(1, 2), uint8], %b : Tensor[(1, 2), float32], %c : Tensor[
{
"constants_size_bytes": json_constants_size_bytes,
"device": 1,
+ "inputs": {
+ "a": 2,
+ "b": 8,
+ },
"io_size_bytes": 18,
+ "outputs": {"0": 8},
Review Comment:
This should match the name of the output in the generated C code (`output` for a single output, `outputX` for multiple, or the actual name if we can get at it)
```suggestion
"outputs": {"output": 8},
```
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]