aaltonenzhang opened a new issue #7198: URL: https://github.com/apache/tvm/issues/7198
While testing TensorFlow models from TF Hub (https://tfhub.dev/), I ran into several issues importing TensorFlow IRs:

1. When importing a SavedModel via `TFParser.parse()` and `from_tensorflow()`, for some models the tags seem inconsistent with those in the actual SavedModel.
2. Importing a model from a checkpoint does not seem to be supported yet; will TVM support it in the future? And what if no metadata is exported with the checkpoint — is it mandatory to modify the Python code for every case that lacks metadata?
3. Why are only constant values supported for the `dims` parameter of the `Fill` operator? Will non-constant values be supported later? I hit this with efficientnet after converting the SavedModel to TFLite format.
4. `Function not found - __inference_signature_wrapper_4615`.

The per-model results are listed below:

| model name | import result |
| -- | -- |
| efficientnet | For dims parameter of Fill operator, only constant values are supported |
| retinanet | StatefulPartitionedCall:6 is not in graph |
| albert | StatefulPartitionedCall:6 is not in graph |
| bert | StatefulPartitionedCall:6 is not in graph |
| ncf | Function not found - __inference_signature_wrapper_4615 |
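For context, the import flow in question looks roughly like the sketch below. `TFParser` and `from_tensorflow` are TVM relay frontend entry points; the model directory and shape dict are placeholders, and the comment about the hard-coded serving tag is my understanding of the tag mismatch, not a confirmed diagnosis:

```python
def import_saved_model(model_dir, shape_dict=None):
    """Sketch of the SavedModel import flow described in point 1.

    `model_dir` is a placeholder path to a TF SavedModel directory.
    TFParser freezes the SavedModel into a GraphDef, which is then
    handed to the relay frontend. As far as I can tell, TFParser
    assumes the model was exported with the default serving tag,
    which may be why models exported with other tags fail to load.
    """
    # Imports are deferred so the sketch can be shown without
    # requiring TVM/TensorFlow at definition time.
    from tvm.relay.frontend.tensorflow_parser import TFParser
    from tvm import relay

    graph_def = TFParser(model_dir).parse()
    mod, params = relay.frontend.from_tensorflow(graph_def, shape=shape_dict)
    return mod, params
```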