shalushajan95 opened a new issue #6265:
URL: https://github.com/apache/incubator-tvm/issues/6265


   Hi,
   
   While converting the tflite model to a Relay module using the following code:
   
   ```
   mod, params = relay.frontend.from_tflite(tflite_model,
                                      shape_dict={input_tensor: input_shape},
                                     dtype_dict={input_tensor: input_dtype})
   ```
   I am getting this error: `KeyError: 'InceptionResnetV1/Logits/Flatten/flatten/Reshape/shape/1'`
   
   The tflite model (VGG FaceNet) is a quantized int8 model that I converted from a frozen pb model using the TF Lite Python API (with the help of a representative dataset) in TensorFlow 2.3. Below is the code I used to convert to a quantized int8 tflite model:
   
   ```
   converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
       model,
       input_arrays={"input": [160, 160, 3]},
       output_arrays={"embeddings"})
   converter.inference_type = tf.int8
   converter.optimizations = [tf.lite.Optimize.DEFAULT]

   # Enforce full-int8 quantization (except inputs/outputs which are always float)
   converter.representative_dataset = rep_data_gen
   converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
   converter.inference_input_type = tf.int8
   converter.inference_output_type = tf.int8

   quantized_model = converter.convert()
   ```
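   For context, the `rep_data_gen` assigned to `converter.representative_dataset` above is not shown in this issue. A representative-dataset callable for post-training quantization is usually a generator function that yields one list of input tensors per calibration sample; a hypothetical sketch, assuming a single 160x160x3 float input as in `input_arrays` above, might look like:

   ```python
   import numpy as np

   def rep_data_gen(num_samples=10):
       # Yield one calibration sample per iteration; TFLite calls this
       # generator during conversion to collect activation ranges.
       for _ in range(num_samples):
           # Leading batch dimension of 1; random values stand in for
           # real preprocessed calibration images here.
           sample = np.random.rand(1, 160, 160, 3).astype(np.float32)
           yield [sample]
   ```

   In practice the random arrays would be replaced with real preprocessed images from the training distribution, since the calibration data determines the quantization ranges.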
   In TensorFlow 2.2, however, the same TF Lite Python API gives me a quantized float model instead of a quantized int8 model.
   
   Please comment on this issue.
   
   Thank you
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

