## Description

Running a forward pass with MXNet takes much more time than running the same forward pass with OpenCV.

Hi, I'm trying to run my own pre-trained model CPU-only on a CentOS 8 virtual machine (the model was trained with a GPU on Ubuntu 18.04).

Our whole project needs to be written in C++, so my previous solution was to convert the (symbol, params) files to ONNX and load the model with OpenCV in our code.
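
For reference, the OpenCV path looks roughly like this (a minimal sketch; the file names and the 224x224 input size are placeholders, not our real preprocessing):

```cpp
// Minimal sketch of the OpenCV + ONNX path. "model.onnx", "test.jpg" and the
// 224x224 input size are placeholders, not the real preprocessing.
#include <opencv2/dnn.hpp>
#include <opencv2/imgcodecs.hpp>

int main() {
  cv::dnn::Net net = cv::dnn::readNetFromONNX("model.onnx");

  cv::Mat img = cv::imread("test.jpg");
  // Build a 1x3x224x224 NCHW blob, scaled to [0, 1] with BGR -> RGB swap.
  cv::Mat blob = cv::dnn::blobFromImage(img, 1.0 / 255.0, cv::Size(224, 224),
                                        cv::Scalar(), /*swapRB=*/true, /*crop=*/false);

  net.setInput(blob);
  cv::Mat out = net.forward();  // single forward pass
  return 0;
}
```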

We decided to migrate our code from the OpenCV-based solution to an MXNet C++ build.

Here comes the problem: running a forward pass and getting the prediction output with the MXNet C++ API is much slower than with OpenCV. We are using exactly the same network; the only difference is the model format (symbol & params vs. ONNX).
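
The MXNet side is loaded from the exported symbol/params files; a stripped-down sketch of what the code does (not our exact code; the file names, the `data` input name, and the 1x3x224x224 shape are assumptions):

```cpp
// Minimal sketch of the MXNet cpp-package path. The file names, the "data"
// input name and the 1x3x224x224 shape are assumptions about our setup.
#include <map>
#include <string>
#include <vector>
#include "mxnet-cpp/MxNetCpp.h"

using namespace mxnet::cpp;

int main() {
  Context ctx = Context::cpu();

  // Load the network definition and the trained weights.
  Symbol net = Symbol::Load("model-symbol.json");
  std::map<std::string, NDArray> raw, args, auxs;
  NDArray::Load("model-0000.params", nullptr, &raw);
  for (const auto &kv : raw) {
    if (kv.first.rfind("arg:", 0) == 0)
      args[kv.first.substr(4)] = kv.second.Copy(ctx);
    else if (kv.first.rfind("aux:", 0) == 0)
      auxs[kv.first.substr(4)] = kv.second.Copy(ctx);
  }

  // Dummy input just for the sketch; the real code copies the preprocessed image here.
  std::vector<float> input(1 * 3 * 224 * 224, 0.0f);
  args["data"] = NDArray(Shape(1, 3, 224, 224), ctx);
  args["data"].SyncCopyFromCPU(input.data(), input.size());

  Executor *exec = net.SimpleBind(ctx, args, std::map<std::string, NDArray>(),
                                  std::map<std::string, OpReqType>(), auxs);
  exec->Forward(false);            // inference only, no gradients
  NDArray::WaitAll();              // MXNet's engine is asynchronous: wait for the result
  NDArray out = exec->outputs[0].Copy(Context::cpu());
  out.WaitToRead();

  delete exec;
  MXNotifyShutdown();
  return 0;
}
```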

## What I have tried to solve it

1. Built MXNet from source with the C++ bindings (cpp-package).
2. Also tested the pip-installed version of MXNet 1.6.0.

## Environment

System: CentOS 8.2
MXNet version: 1.8.0
CPU: Intel Core i5-9500F
RAM: 32 GiB

## Processing times

With OpenCV + ONNX: under 1 second
With MXNet + symbol & params: 8 seconds
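
A sketch of how such a wall-clock timing can be taken (not the exact measurement code; note that with MXNet, `NDArray::WaitAll()` must sit inside the timed region because the engine runs asynchronously):

```cpp
// Sketch of a wall-clock timing around the forward pass. The lambda below is a
// placeholder; in the real code it wraps net.forward() (OpenCV) or
// exec->Forward(false); NDArray::WaitAll(); (MXNet).
#include <chrono>
#include <functional>
#include <iostream>

double time_forward_ms(const std::function<void()> &run_forward) {
  auto t0 = std::chrono::steady_clock::now();
  run_forward();                   // the whole inference, including any synchronization
  auto t1 = std::chrono::steady_clock::now();
  return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

int main() {
  std::cout << time_forward_ms([] { /* placeholder forward pass */ }) << " ms\n";
  return 0;
}
```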




