Chris Olivier commented on MXNET-11:

Yes, that's a great performance improvement! We definitely have requests to run
parallel inference within the same process, and having each thread load the
model separately isn't a realistic solution. Do you know whether other
libraries can do this?

> Multithreaded Inference
> -----------------------
>                 Key: MXNET-11
>                 URL: https://issues.apache.org/jira/browse/MXNET-11
>             Project: Apache MXNet
>          Issue Type: Epic
>          Components: MXNet Engine
>            Reporter: Chris Olivier
>            Priority: Major
>              Labels: inference
> Add the ability to do multithreaded inference without using fork() or using 
> multiple copies of a given model
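
The capability requested above — one shared model instance serving many inference threads, with no fork() and no per-thread copies — can be sketched in plain Python. This is an illustrative stand-in only: the `Model` class and its `predict` method are hypothetical, not MXNet APIs; the point is the sharing pattern, where weights are loaded once and read concurrently.

```python
import threading

# Hypothetical stand-in for a loaded model (not an MXNet API).
# One instance is created and shared read-only across all threads,
# instead of each thread (or forked process) loading its own copy.
class Model:
    def __init__(self, weights):
        self.weights = weights  # loaded once, shared by every thread

    def predict(self, x):
        # Pure read of shared state: safe to call concurrently.
        return sum(w * v for w, v in zip(self.weights, x))

model = Model([0.5, 1.5, 2.0])  # single shared copy of the weights
results = {}

def worker(tid, x):
    # Every thread calls predict() on the same model instance.
    results[tid] = model.predict(x)

threads = [threading.Thread(target=worker, args=(i, [i, i, i]))
           for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Making this pattern safe inside a real engine is the hard part of the epic: the framework's executor and any mutable per-call state must be thread-safe, not just the weight tensors.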

This message was sent by Atlassian JIRA
