ekalda commented on pull request #8368: URL: https://github.com/apache/tvm/pull/8368#issuecomment-871580756
> > This patch adds infrastructure to directly generate TFLite model buffers by using flatc, the flatbuffers command line tool. This gives us more freedom in creating the models for testing since we don't have to rely on any of the converters.
> >
> > * Add classes and helper functions to create the model buffer
> > * Add some convolution tests that test TFLite2 models in int8 with per channel and per tensor quantization, and remove the orphaned Keras tests
> >
> > Co-authored with @NicolaLancellotti
>
> Hi @ekalda, I am just unable to see the need for such changes. As per my understanding, TFLite framework behaviour is not something we should control in TVM. Model buffers should be created using the standard APIs in TFLite. We should not use a custom one to validate our requirements, which may result in failure of the complete TFLite frontend parser.
>
> Maybe if you share what the actual motivation for this change was, we can discuss the solution better. Thanks!

Hi @ANSHUMAN87, see this RFC for some more motivation: https://discuss.tvm.apache.org/t/rfc-tflite-frontend-create-models-for-frontend-testing-by-directly-writing-tflite-buffers/9811

The gist is that the current converters that convert into TFLite are just not flexible enough when it comes to creating one-operator models with various properties (e.g. different fused activations). We have found that writing the buffers directly is the most convenient, fast and debuggable way of consistently generating one-operator models with the desired properties.

As to whether the models created like that are valid TFLite models: since we use the TFLite schema to create the buffers, all models created this way are valid TFLite models, and if the TVM frontend fails to parse them, that indicates a problem with TVM's TFLite frontend parser.

Also tagging @mbaret @FrozenGene @manupa-arm @anijain2305 @leandron

--
This is an automated message from the Apache Git Service.
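To illustrate the general approach (a minimal sketch, not the actual helper classes from this patch): a one-operator model can be written as a JSON document whose field names follow the TFLite flatbuffer schema (`schema.fbs`), and then compiled into a binary `.tflite` buffer with `flatc`, bypassing any converter. The exact tensor/operator layout below is an assumed example for demonstration only.

```python
import json

# Sketch of a one-operator (ADD) TFLite model described as JSON.
# The keys mirror field names from the TFLite flatbuffer schema
# (operator_codes, subgraphs, tensors, buffers, ...); the concrete
# values here are illustrative assumptions, not part of the patch.
model = {
    "version": 3,
    "operator_codes": [{"builtin_code": "ADD"}],
    "subgraphs": [{
        "tensors": [
            {"shape": [1, 8], "type": "INT8", "buffer": 0, "name": "in0"},
            {"shape": [1, 8], "type": "INT8", "buffer": 0, "name": "in1"},
            {"shape": [1, 8], "type": "INT8", "buffer": 0, "name": "out"},
        ],
        "inputs": [0, 1],
        "outputs": [2],
        "operators": [{
            "opcode_index": 0,
            "inputs": [0, 1],
            "outputs": [2],
            # A fused activation is just another field in the options,
            # which is exactly the kind of property converters make
            # awkward to control.
            "builtin_options_type": "AddOptions",
            "builtin_options": {"fused_activation_function": "RELU"},
        }],
    }],
    "buffers": [{}],  # buffer 0: empty placeholder (no constant data)
}

with open("model.json", "w") as f:
    json.dump(model, f, indent=2)

# The JSON can then be turned into a binary model buffer with flatc, e.g.:
#   flatc -b --strict-json /path/to/schema.fbs model.json
# producing model.tflite directly from the schema, with no converter involved.
```

Because the buffer is generated straight from the schema, every property of the model (quantization parameters, fused activations, per-channel scales) is under direct, debuggable control in the test code.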
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: [email protected] For queries about this service, please contact Infrastructure at: [email protected]
