lhutton1 commented on a change in pull request #10345:
URL: https://github.com/apache/tvm/pull/10345#discussion_r813695552
##########
File path: tests/python/contrib/test_ethosu/test_codegen.py
##########
@@ -1167,5 +1167,24 @@ def leaky_relu_func(x):
_compare_tvm_with_tflite(leaky_relu_func, [ifm_shape], accel_type)
+@pytest.mark.parametrize("accel_type", ACCEL_TYPES)
+@pytest.mark.parametrize("units", [32, 64])
+@pytest.mark.parametrize("use_bias", [True, False])
+@pytest.mark.parametrize("activation_function", ["RELU", "NONE"])
+def test_tflite_fully_connected(
+ accel_type,
+ units,
+ use_bias,
+ activation_function,
+):
+ @tf.function
+ def fully_connected():
+ return tf.keras.layers.Dense(
Review comment:
I'm not too familiar with the Keras API, but I'm not sure this will work.
One thing we could do instead is use `tf.matmul`, which gets legalized to a
fully connected op in TFLite under the conditions we'll use it for. E.g.
something like this would be a starting point:
```python
@tf.function
def dense_layer(x):
w = tf.constant(
np.random.uniform(size=[units, units]),
dtype=tf.float32,
)
return tf.matmul(x, w)
_compare_tvm_with_tflite(dense_layer, [(1, units)], accel_type)
```
Happy to keep the Keras implementation if we get it working though, just
wanted to offer an alternative :)
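For reference, the fully connected op this suggestion targets reduces to a matmul plus an optional bias and activation, which is why `tf.matmul` can legalize to it. A quick NumPy sketch of that reference computation (the seed, `units` value, and variable names are just illustrative, following the shapes in the `dense_layer` example above):

```python
import numpy as np

# Illustrative shapes matching the dense_layer suggestion:
# input x is (1, units), weights w are (units, units).
units = 32
rng = np.random.default_rng(0)
x = rng.uniform(size=(1, units)).astype(np.float32)
w = rng.uniform(size=(units, units)).astype(np.float32)
b = rng.uniform(size=(units,)).astype(np.float32)

# Fully connected = matmul + bias, here with the RELU variant of the
# parametrized activation_function; "NONE" would skip the np.maximum.
y = np.maximum(x @ w + b, 0.0)
```

This also mirrors what the `use_bias` and `activation_function` parametrizations in the test vary.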