[ https://issues.apache.org/jira/browse/MXNET-4?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Chris Olivier updated MXNET-4:
------------------------------
Summary: Refactor Random and ParallelRandom resources to use MKL for MKL builds
(was: Performance: Refactor Random and ParallelRandom resources to use MKL for MKL builds)
> Refactor Random and ParallelRandom resources to use MKL for MKL builds
> ----------------------------------------------------------------------
>
> Key: MXNET-4
> URL: https://issues.apache.org/jira/browse/MXNET-4
> Project: Apache MXNet
> Issue Type: Improvement
> Reporter: Chris Olivier
> Priority: Major
> Labels: mkl, performance
>
> Refactor Random and ParallelRandom resources to use MKL for MKL builds
> This covers generators such as RngUniform, similar to what is already done
> for the dropout operator.
> The implementation may need to allocate some temporary memory, generate
> random numbers in batches, and then serve them out from that buffer.
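> A minimal sketch of that batching idea, assuming MKL's VSL API
> (vslNewStream/vsRngUniform); the class name and batch size below are
> hypothetical, not existing MXNet code:
> {code:cpp}
> #include <mkl_vsl.h>
> #include <cstdint>
> #include <vector>
>
> // Illustrative only: batch-generate uniform floats with one MKL call,
> // then serve them out one at a time from the internal buffer.
> class BatchedUniform {
>  public:
>   explicit BatchedUniform(std::uint32_t seed, MKL_INT batch = 65536)
>       : buf_(batch), pos_(batch) {  // pos_ == size forces an initial refill
>     vslNewStream(&stream_, VSL_BRNG_MT19937, seed);
>   }
>   ~BatchedUniform() { vslDeleteStream(&stream_); }
>
>   float Next() {
>     if (pos_ == static_cast<MKL_INT>(buf_.size())) Refill();
>     return buf_[pos_++];
>   }
>
>  private:
>   void Refill() {
>     // One vectorized MKL call fills the whole batch at once.
>     vsRngUniform(VSL_RNG_METHOD_UNIFORM_STD, stream_,
>                  static_cast<MKL_INT>(buf_.size()), buf_.data(),
>                  0.0f, 1.0f);
>     pos_ = 0;
>   }
>
>   VSLStreamStatePtr stream_;
>   std::vector<float> buf_;
>   MKL_INT pos_;
> };
> {code}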
> The Random classes could also export a "fill buffer with randoms" function,
> which is a common use case and maps more directly onto the MKL API.
> Care must be taken regarding MKL's fixed output types for some of its API
> functions.
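> To illustrate the fixed-output-type point: MKL exposes a separate generator
> per element type (vsRngUniform for float, vdRngUniform for double), so a
> "fill buffer" entry point would likely need per-type overloads. A hedged
> sketch; the FillUniform name is hypothetical:
> {code:cpp}
> #include <mkl_vsl.h>
>
> // float variant: MKL fixes vsRngUniform's output type to float.
> inline int FillUniform(VSLStreamStatePtr s, MKL_INT n, float* out,
>                        float a = 0.0f, float b = 1.0f) {
>   return vsRngUniform(VSL_RNG_METHOD_UNIFORM_STD, s, n, out, a, b);
> }
>
> // double variant: vdRngUniform is the double-output uniform generator.
> inline int FillUniform(VSLStreamStatePtr s, MKL_INT n, double* out,
>                        double a = 0.0, double b = 1.0) {
>   return vdRngUniform(VSL_RNG_METHOD_UNIFORM_STD, s, n, out, a, b);
> }
>
> // There is no half-precision variant in VSL, so an fp16 request would
> // have to generate floats and convert, which is the caveat noted above.
> {code}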
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)