Thrsu opened a new issue, #15021:
URL: https://github.com/apache/tvm/issues/15021

   
   I encountered an error when trying to load a TFLite model using the `relay.frontend.from_tflite` function. The conversion fails with `KeyError: 14` while the frontend is decoding the model's tensor types.
   
   ### Expected behavior
   
   The TFLite model should be imported into Relay without errors.
   
   ### Actual behavior
   
   ```
   Traceback (most recent call last):
     File "single_test.py", line 42, in <module>
       mod, params = relay.frontend.from_tflite(tflite_model, shape_dict)
     File "/workplace/software/tvm/tvm/python/tvm/relay/frontend/tflite.py", line 4178, in from_tflite
       _shape_dict, _dtype_dict = _input_type(model)
     File "/workplace/software/tvm/tvm/python/tvm/relay/frontend/tflite.py", line 4135, in _input_type
       dtype_dict[input_name] = _decode_type(tensor_type)
     File "/workplace/software/tvm/tvm/python/tvm/relay/frontend/tflite.py", line 4115, in _decode_type
       return _tflite_m[n]
   KeyError: 14
   ```
   
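   For context, `_decode_type` looks the numeric tensor-type code up in a fixed dict (`_tflite_m`). In the TFLite flatbuffer schema, code 14 appears to be `VARIANT` (assumption based on the upstream `schema.fbs`; the table below is a reference sketch, not copied from TVM), a type the frontend has no entry for, so the lookup raises a bare `KeyError`:

   ```python
   # TFLite TensorType codes as defined in TensorFlow Lite's schema.fbs
   # (reference sketch; values assumed from the upstream schema).
   TFLITE_TENSOR_TYPES = {
       0: "FLOAT32", 1: "FLOAT16", 2: "INT32", 3: "UINT8",
       4: "INT64", 5: "STRING", 6: "BOOL", 7: "INT16",
       8: "COMPLEX64", 9: "INT8", 10: "FLOAT64", 11: "COMPLEX128",
       12: "UINT64", 13: "RESOURCE", 14: "VARIANT", 15: "UINT32",
   }

   # The failing lookup in _decode_type was for code 14:
   print(TFLITE_TENSOR_TYPES[14])  # VARIANT
   ```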
   ### Environment
   
   - TVM: 0.13.dev0
   - TensorFlow: 2.12.0
   - TFLite: 2.10.0
   - Keras: 2.12.0
   
   ### Steps to reproduce
   
   ```python
   from tensorflow import keras
   import numpy as np
   import tensorflow as tf
   from tensorflow.keras import layers, models
   import tflite
   
   import tvm
   import tvm.relay as relay
   
   np.random.seed(2023)
   
   input_shape = (1, 2, 5, 5, 2)
   input_data = 10 * np.random.random(list(input_shape))
   input_data -= 0.5
   input_data = input_data.astype("float32")
   
   kwargs = {
       "data_format": "channels_last",
       "return_sequences": False,
       "filters": 2,
       "kernel_size": (3, 3),
       "padding": "same",
       "dropout": 0.1,
       "recurrent_dropout": 0.1,
   }
   layer_cls = keras.layers.ConvLSTM2D
   layer = layer_cls(**kwargs)
   weights = layer.get_weights()
   layer.set_weights(weights)
   x = layers.Input(shape=input_shape[1:], dtype="float32")
   y = layer(x)
   model = models.Model(x, y)
   
   converter = tf.lite.TFLiteConverter.from_keras_model(model)
   converter.target_spec.supported_ops = [
       tf.lite.OpsSet.TFLITE_BUILTINS,
       tf.lite.OpsSet.SELECT_TF_OPS,
   ]
   converter._experimental_lower_tensor_list_ops = False
   tflite_model = converter.convert()
   interpreter = tf.lite.Interpreter(model_content=tflite_model)
   interpreter.allocate_tensors()
   
   input_details = interpreter.get_input_details()
   input_name = input_details[0]['name'].split(':')[0]
   
   target = 'llvm'
   ctx = tvm.cpu(0)
   
   shape_dict = {input_name: input_shape}
   
   tflite_model = tflite.Model.GetRootAsModel(tflite_model, 0)
   mod, params = relay.frontend.from_tflite(tflite_model, shape_dict)
   with tvm.transform.PassContext(opt_level=3):
       model = relay.build_module.create_executor(
           'vm', mod, ctx, target, params
       ).evaluate()
   tvm_output = model(tvm.nd.array(input_data)).numpy()
   ```
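   Note that the repro sets `converter._experimental_lower_tensor_list_ops = False`, which keeps TensorList ops in the graph; their tensors are serialized with the `VARIANT` type, which is presumably what the frontend trips over. Whether or not that type is ever supported, `_decode_type` could fail with a readable message instead of a bare `KeyError`. A minimal sketch of that idea (the dict below is an illustrative subset mirroring TVM's `_tflite_m`, not the actual source):

   ```python
   # Illustrative subset of the frontend's type mapping (hypothetical
   # values mirroring TVM's _tflite_m, not copied from the source tree).
   _tflite_m = {
       0: "float32", 1: "float16", 2: "int32", 3: "uint8",
       4: "int64", 6: "bool", 7: "int16", 9: "int8",
   }

   def decode_type(n):
       """Translate a TFLite type code to a dtype string, raising a
       descriptive error for codes the frontend cannot lower
       (e.g. 14 / VARIANT) instead of a bare KeyError."""
       try:
           return _tflite_m[n]
       except KeyError:
           raise NotImplementedError(
               f"TFLite tensor type code {n} is not supported by the Relay frontend"
           ) from None

   print(decode_type(0))  # float32
   ```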
   
   ### Triage
   
   - needs-triage
   - frontend:tflite
   

