True as of Theano 0.9; from memory, 0.8 does not include a CPU CorrMM.
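
For the forward-prop-only case in the original question, you can check which
implementation the optimizer picked by printing the compiled graph. A minimal
sketch (the shapes here are made up for illustration):

    import numpy as np
    import theano
    import theano.tensor as T
    from theano.tensor.nnet import conv2d

    floatX = theano.config.floatX
    x = T.tensor4('x')  # (batch, channels, rows, cols)
    w = theano.shared(np.random.randn(16, 3, 5, 5).astype(floatX), name='w')
    y = conv2d(x, w, filter_shape=(16, 3, 5, 5), border_mode='valid')

    f = theano.function([x], y)    # forward pass only, no gradients
    theano.printing.debugprint(f)  # on the CPU backend this should show a CorrMM node

    out = f(np.random.randn(10, 3, 32, 32).astype(floatX))  # batch size 10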

If you are really interested in CPU speed, Intel has a fork of Theano that
they have optimized for CPU:

http://github.com/intel/Theano

It will probably be merged into Theano's master at some point, but there is
no timeline.
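
To try the OpenMP path Jesse mentions below, the flags have to be in place
before Theano is imported. Roughly (a sketch, adjust the thread count to your
machine):

    import os
    os.environ.setdefault('OMP_NUM_THREADS', '4')         # number of OpenMP threads
    os.environ.setdefault('THEANO_FLAGS', 'openmp=True')  # enable OpenMP in the C ops

    import theano
    print(theano.config.openmp)  # True if the flag was picked up

The same thing can be done from the shell:
THEANO_FLAGS=openmp=True OMP_NUM_THREADS=4 python your_script.py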

Fred

On Tue, Mar 21, 2017 at 9:18 PM Jesse Livezey <[email protected]>
wrote:

> That is correct as of Theano 0.8 (I think).
>
> If you use the bleeding-edge version of Theano, you can let CorrMM use
> OpenMP to parallelize across batches. If you have more than 2 cores, this
> should give additional speedup. GPUs are generally going to be much faster
> than CPUs; with large batches and lots of cores, CPUs can catch up a bit,
> but GPUs are still going to be faster.
>
>
> On Monday, March 20, 2017 at 11:59:52 PM UTC-7, C. Ng wrote:
>
> Hi,
>
> Just want to confirm that theano.tensor.nnet.conv2d uses CorrMM (not the
> legacy convolution) by default in CPU mode?
>
> I was hoping that forward prop (doing inference only, no training) using
> the CPU for convolution might be as fast as the GPU (using CorrMM), given
> my batch size is only 10. But using the GPU is still quite a bit faster.
>

-- 

--- 
You received this message because you are subscribed to the Google Groups 
"theano-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/d/optout.
