[GitHub] [incubator-mxnet] jonatan1626 commented on issue #17331: [mxnet 2.0] [item 2.4] Turning on large tensor support by default

2020-02-05 Thread GitBox
jonatan1626 commented on issue #17331: [mxnet 2.0] [item 2.4] Turning on large tensor support by default URL: https://github.com/apache/incubator-mxnet/issues/17331#issuecomment-582752477 @apeforest Oh sorry, so I'm multiplying by -1 only for the samples/second column to keep the meaning

[GitHub] [incubator-mxnet] jonatan1626 commented on issue #17331: [mxnet 2.0] [item 2.4] Turning on large tensor support by default

2020-02-05 Thread GitBox
jonatan1626 commented on issue #17331: [mxnet 2.0] [item 2.4] Turning on large tensor support by default URL: https://github.com/apache/incubator-mxnet/issues/17331#issuecomment-582702619 @eric-haibin-lin Yes, I am calculating this as: 1 - ( / ). For the samples/sec column I am multiplying by -1
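The calculation described above can be sketched as follows. This is an illustrative reconstruction, not code from the thread: the exact numerator and denominator of the elided ratio are not shown in the comment, so the assumption here is that the change is computed as 1 - (large-tensor result / baseline result), with the sign flipped for throughput metrics like samples/second so that a positive number consistently means the large-tensor build performs better.

```python
def relative_change(baseline, with_large_tensor, higher_is_better=False):
    """Return the relative change 1 - (with_large_tensor / baseline).

    Assumed interpretation of the thread's "1 - ( / )" formula:
    - For latency-style metrics (e.g. seconds per epoch, lower is
      better), a positive result means the large-tensor build is faster.
    - For throughput metrics (e.g. samples/second, higher is better),
      the result is multiplied by -1 so positive still means better,
      matching the sign flip described in the comment.
    """
    change = 1 - (with_large_tensor / baseline)
    return -change if higher_is_better else change


# Hypothetical numbers for illustration; the thread's benchmark
# values are not reproduced here.
print(relative_change(100.0, 80.0))                          # latency dropped: improvement
print(relative_change(100.0, 120.0, higher_is_better=True))  # throughput rose: improvement
```

Flipping the sign for throughput columns keeps one reading across the whole table: positive values favor the large-tensor build, negative values indicate a regression.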

[GitHub] [incubator-mxnet] jonatan1626 commented on issue #17331: [mxnet 2.0] [item 2.4] Turning on large tensor support by default

2020-02-05 Thread GitBox
jonatan1626 commented on issue #17331: [mxnet 2.0] [item 2.4] Turning on large tensor support by default URL: https://github.com/apache/incubator-mxnet/issues/17331#issuecomment-582689067 Training benchmarks comparing LT_MKL with just MKL enabled. Speed measured in seconds per epoch.

[GitHub] [incubator-mxnet] jonatan1626 commented on issue #17331: [mxnet 2.0] [item 2.4] Turning on large tensor support by default

2020-02-05 Thread GitBox
jonatan1626 commented on issue #17331: [mxnet 2.0] [item 2.4] Turning on large tensor support by default URL: https://github.com/apache/incubator-mxnet/issues/17331#issuecomment-582591625 Inference benchmarks comparing LT_MKL with just MKL enabled. All times in ms. MXNet Type |