BenLag2906 opened a new issue #9308: batch size in inference URL: https://github.com/apache/incubator-mxnet/issues/9308 I work with your version of mxnet under Windows in C++. I have heard it is possible to run inference/prediction with a batch size greater than one; this practice is usually reserved for training. Could you please provide some explanation of how to do this? I want to speed up inference. I have seen it done in Python, but with MXPredForward in C++ I don't see any solution.
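With the C Predict API, batched inference is done by declaring the batch size as the leading dimension of the input shape when the predictor is created with MXPredCreate; a single MXPredForward call then scores every sample in the input buffer at once. Below is a minimal sketch of this, assuming a model with a single input named "data" that takes 3x224x224 images; the file names (`model-symbol.json`, `model-0000.params`) and the `LoadFile` helper are placeholders for your own model-loading code, not part of the MXNet API.

```cpp
// Minimal sketch of batched inference with the MXNet C Predict API.
// Assumes a model whose single input is named "data"; adjust the
// shape constants and file names to match your exported model.

#include <mxnet/c_predict_api.h>

#include <cstdio>
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

// Hypothetical helper: read a whole file into a byte buffer.
static std::vector<char> LoadFile(const std::string& path) {
  std::ifstream in(path, std::ios::binary);
  return std::vector<char>(std::istreambuf_iterator<char>(in),
                           std::istreambuf_iterator<char>());
}

int main() {
  // Hypothetical file names -- substitute your own exported model.
  std::vector<char> symbol_json = LoadFile("model-symbol.json");
  std::vector<char> param_bytes = LoadFile("model-0000.params");
  symbol_json.push_back('\0');  // MXPredCreate expects a C string.

  // The batch size is simply the first dimension of the input shape.
  const mx_uint batch_size = 8;
  const mx_uint channels = 3, height = 224, width = 224;  // assumed shape

  const char* input_key = "data";
  const char** input_keys = &input_key;
  const mx_uint input_shape_indptr[] = {0, 4};
  const mx_uint input_shape_data[] = {batch_size, channels, height, width};

  PredictorHandle pred = nullptr;
  MXPredCreate(symbol_json.data(), param_bytes.data(),
               static_cast<int>(param_bytes.size()),
               1 /* dev_type: cpu */, 0 /* dev_id */,
               1 /* num input nodes */, input_keys,
               input_shape_indptr, input_shape_data, &pred);

  // One contiguous buffer holds all batch_size images back to back.
  std::vector<mx_float> batch(batch_size * channels * height * width, 0.0f);
  // ... copy your preprocessed image data into `batch` here ...

  MXPredSetInput(pred, "data", batch.data(),
                 static_cast<mx_uint>(batch.size()));
  MXPredForward(pred);  // one forward pass scores the whole batch

  // The output's leading dimension is again the batch size.
  mx_uint* out_shape = nullptr;
  mx_uint out_dim = 0;
  MXPredGetOutputShape(pred, 0, &out_shape, &out_dim);
  mx_uint out_size = 1;
  for (mx_uint i = 0; i < out_dim; ++i) out_size *= out_shape[i];

  std::vector<mx_float> out(out_size);
  MXPredGetOutput(pred, 0, out.data(), out_size);
  std::printf("got %u output values for %u inputs\n", out_size, batch_size);

  MXPredFree(pred);
  return 0;
}
```

The results for sample `i` start at offset `i * (out_size / batch_size)` in the output buffer. Since the batch size is fixed at predictor creation, one reasonable pattern is to create the predictor once with the largest batch you expect and pad the final partial batch, rather than recreating the predictor for each batch size.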
