@KellenSunderland That problem was probably due to a mismatch of NVIDIA 
driver versions. After fixing it and installing the latest pip package 
`mxnet-tensorrt-cu90`, TensorRT is now usable. I tried the `lenet` test 
case and measured the inference time:

```
MXNet costs time: 1.380867, TensorRT costs time: 1.026270.
```
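
For context, the numbers above come from simply wrapping the two forward passes in a wall-clock timer. Below is a minimal sketch of that kind of harness; the `mxnet_forward`/`tensorrt_forward` callables are placeholders for the actual executors in the `lenet` test case, not the exact test code:

```python
# Minimal timing-harness sketch (assumption: not the exact lenet test script).
# The two callables stand in for the MXNet and TensorRT-enabled forward passes.
import time

def measure(forward, iterations=100):
    """Total wall-clock seconds for `iterations` calls to `forward()`."""
    start = time.time()
    for _ in range(iterations):
        forward()
    return time.time() - start

def mxnet_forward():
    # Placeholder: run one forward pass on the plain MXNet executor here.
    pass

def tensorrt_forward():
    # Placeholder: run one forward pass on the TensorRT-enabled executor here.
    pass

print("MXNet costs time: %f, TensorRT costs time: %f."
      % (measure(mxnet_forward), measure(tensorrt_forward)))
```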

Seems like it works fine! Thanks for your awesome efforts!
