samskalicky commented on issue #12255: Pretty high cpu load when import mxnet
URL: https://github.com/apache/incubator-mxnet/issues/12255#issuecomment-443045452

Hey guys, we need to start getting consistent in what and how we're testing this. Here's what I tried:

1. Spin up a C5.large instance with DLAMI v19.0
2. `pip install mxnet-cu90 --user`
3. `time python -c "import mxnet"`

Here was the output:

```
real 0m24.955s
user 0m0.827s
sys  0m0.189s
```

Doing it a second time resulted in the following (and for every subsequent run too):

```
real 0m0.855s
user 0m0.798s
sys  0m0.123s
```

Timing the script above for running 8 processes in parallel resulted in:

```
$ time python test.py
time consumes: 4.612242221832275
time consumes: 4.625608682632446
time consumes: 4.641973257064819
time consumes: 4.690966844558716
time consumes: 4.7061262130737305
time consumes: 4.699116945266724
time consumes: 4.703948259353638
time consumes: 4.718777418136597

real 0m4.770s
user 0m8.823s
sys  0m0.650s
```

So overall it took 4.77 seconds, and each process finished in less than that.

I terminated that instance, spun up a new one, and after installing the pip wheel I modified the `__init__.py` file in `/home/ec2-user/.local/lib/python3.6/site-packages/mxnet` (where packages are installed with `--user`) to get an idea of which imports were causing the most delay. Here's what I found were the biggest offenders:

```python
from .context import Context, current_context, cpu, gpu, cpu_pinned
from . import engine
from .base import MXNetError
from . import base
from . import contrib
from . import ndarray
from . import ndarray as nd
from . import name
```
time = 20.388737678527832

```python
from . import random as rnd
from . import random
from . import optimizer
from . import model
from . import metric
from . import notebook
from . import initializer
```
time = 0.5453829765319824

```python
from . import image
from . import image as img
from . import test_utils
from . import rnn
from . import gluon
```
time = 0.4957273006439209
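The `test.py` used for the 8-process run isn't shown in the comment; a minimal sketch of what such a script might look like, assuming each worker spawns a fresh interpreter (so every import is cold) and prints its own wall-clock time in the same "time consumes:" format:

```python
# Hypothetical reconstruction of test.py (the original was not posted).
# Each call spawns a fresh Python child process, so the module is imported
# cold every time; running 8 in parallel reproduces the contention scenario.
import subprocess
import sys
import time
from concurrent.futures import ThreadPoolExecutor


def timed_import(module):
    """Time a fresh child interpreter importing `module` (wall clock)."""
    start = time.time()
    subprocess.run([sys.executable, "-c", "import " + module], check=True)
    elapsed = time.time() - start
    print("time consumes:", elapsed)
    return elapsed


if __name__ == "__main__":
    # 8 parallel cold imports, as in the numbers above.
    with ThreadPoolExecutor(max_workers=8) as pool:
        list(pool.map(timed_import, ["mxnet"] * 8))
```

Threads (rather than multiprocessing) are enough here because the real work happens in the child interpreters; the parent only waits on them.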
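The exact edits made to `__init__.py` aren't shown either; a generic version of the same idea, timing each module import with `importlib` instead of hand-editing the package (the function name `time_imports` is mine, not from the source):

```python
# Generic per-module import timer, sketching the instrumentation approach
# described above: wrap imports in wall-clock timestamps to find the slow ones.
import importlib
import time


def time_imports(module_names):
    """Import each named module and return {name: seconds} timings.

    Note: already-cached modules (in sys.modules) return near-zero times,
    so run this in a fresh interpreter for cold-import numbers.
    """
    timings = {}
    for name in module_names:
        start = time.time()
        importlib.import_module(name)
        timings[name] = time.time() - start
    return timings
```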
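As an aside, if the test environment has Python 3.7 or newer, CPython ships a built-in import profiler that gives this breakdown without editing `__init__.py` (the instance above runs 3.6, so this is only an option on newer DLAMIs):

```shell
# -X importtime writes per-module self/cumulative import times to stderr.
python3 -X importtime -c "import mxnet" 2> importtime.log
# Sort by the cumulative-time column (field 2 of the '|'-separated output).
sort -t'|' -k2 -rn importtime.log | head
```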
