jonatan1626 commented on issue #17331: [mxnet 2.0] [item 2.4] Turning on large
tensor support by default
URL:
https://github.com/apache/incubator-mxnet/issues/17331#issuecomment-582752477
@apeforest Oh sorry, so I'm multiplying by -1 only for the samples/second
column to keep the meaning
jonatan1626 commented on issue #17331: [mxnet 2.0] [item 2.4] Turning on large
tensor support by default
URL:
https://github.com/apache/incubator-mxnet/issues/17331#issuecomment-582702619
@eric-haibin-lin Yes, I am calculating this by: 1 - ( / ).
For the samples/sec I am multiplying by -1.
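A minimal sketch of the relative-change calculation described above, under the assumption that the elided formula is `1 - (measured / baseline)` and that the -1 flip is applied to throughput metrics (samples/sec, where higher is better) so that a positive result consistently means an improvement. The function name and the numbers are illustrative, not from the benchmark tables.

```python
def relative_change(baseline, measured, higher_is_better=False):
    """Relative change vs. a baseline: positive = improvement.

    Assumed form of the formula in the comment above: 1 - (measured / baseline).
    For throughput-style metrics (samples/sec), higher is better, so the
    sign is flipped (the "-1 multiply") to keep the meaning consistent.
    """
    change = 1 - (measured / baseline)
    return -change if higher_is_better else change

# Latency-style metric (lower is better): 90 ms vs a 100 ms baseline
print(relative_change(100.0, 90.0))                          # ~0.1, i.e. ~10% faster

# Throughput-style metric (higher is better): 110 vs 100 samples/sec
print(relative_change(100.0, 110.0, higher_is_better=True))  # ~0.1 after the -1 flip
```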
jonatan1626 commented on issue #17331: [mxnet 2.0] [item 2.4] Turning on large
tensor support by default
URL:
https://github.com/apache/incubator-mxnet/issues/17331#issuecomment-582689067
Training benchmarks comparing LT_MKL with just MKL enabled.
Speed is measured in seconds per epoch.
jonatan1626 commented on issue #17331: [mxnet 2.0] [item 2.4] Turning on large
tensor support by default
URL:
https://github.com/apache/incubator-mxnet/issues/17331#issuecomment-582591625
Inference benchmarks comparing LT_MKL with just MKL enabled.
All times are in ms.
MXNet Type |