Again, with Theano "0.9.0.dev-ca213aa43e78ab3fb074a7c679907ea4d5412ed1" I get:
Traceback (most recent call last):
  File "tt.py", line 9, in <module>
    a = theano.gpuarray.host_from_gpu(a_gpu)
  File "/share/apps/barber/system/lib/python3.6/site-packages/theano/gof/op.py", line 615, in __call__
    node = self.make_node(*inputs, **kwargs)
  File "/share/apps/barber/system/lib/python3.6/site-packages/theano/gpuarray/basic_ops.py", line 549, in make_node
    raise TypeError(x)
TypeError: <TensorType(float32, matrix)>
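
For reference, host_from_gpu's make_node only accepts variables whose type is
already GpuArrayType (that is exactly the check raising above), so this
TypeError means a_gpu was still a plain CPU TensorType when the transfer was
inserted. A minimal sketch of the intended construction, assuming the default
gpuarray context is initialized via device=cuda*:

import theano
import theano.gpuarray

# build the GPU type once, then instantiate a *variable* of that type
gpu_fmatrix = theano.gpuarray.GpuArrayType(dtype='float32',
                                           broadcastable=(False, False))
a_gpu = gpu_fmatrix()                      # GpuArrayType variable, not T.fmatrix()
a = theano.gpuarray.host_from_gpu(a_gpu)   # accepted by make_node
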
On Wednesday, 10 May 2017 04:11:46 UTC+1, Adam Becker wrote:
>
> Hmm ... my bad. I thought givens would work.
>
> Anyway, this trick would work:
>
> import theano
> from theano.gpuarray.basic_ops import infer_context_name
>
> gpu_fmatrix = theano.gpuarray.GpuArrayType(dtype='float32',
>                                            broadcastable=(False, False))
> a_gpu = gpu_fmatrix()  # instantiate a variable of the GPU type
>
> # insert transfer
> a = theano.gpuarray.host_from_gpu(a_gpu)
> # define graph as usual
> b = a + 2.
>
> # compiles function, but takes GpuArray as input
> fn = theano.function([a_gpu], b)
> theano.printing.debugprint(fn)
>
> # compiles function that takes GpuArray as input/output
> ctx_name = infer_context_name(a_gpu)
> b_gpu = theano.gpuarray.as_gpuarray_variable(b, ctx_name)
> fn2 = theano.function([a_gpu], b_gpu)
> theano.printing.debugprint(fn2)
>
>
> Console output:
>
> HostFromGpu(gpuarray) [id A] '' 1
>  |GpuElemwise{add,no_inplace} [id B] '' 0
>  | |GpuArrayConstant{[[ 2.]]} [id C]
>  | |<GpuArrayType<None>(float32, matrix)> [id D]
>
> GpuElemwise{add,no_inplace} [id A] '' 0
>  |GpuArrayConstant{[[ 2.]]} [id B]
>  |<GpuArrayType<None>(float32, matrix)> [id C]
>
> The above works because the optimizer removes the redundant GPU -> CPU ->
> GPU transfer pair. The downside is that this approach doesn't work with
> optimizer=None in the config.
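>
> A minimal sketch of actually calling fn2 with GPU-resident data. Hedged:
> get_context below is my assumption about how to fetch the context Theano
> registered for the default device, so double-check it for your version.
>
> import numpy as np
> import pygpu
> from theano.gpuarray.type import get_context
>
> ctx = get_context(None)                         # Theano's default-device context
> x = np.random.rand(3, 4).astype('float32')
> x_gpu = pygpu.gpuarray.asarray(x, context=ctx)  # move to GPU once
>
> out_gpu = fn2(x_gpu)        # GpuArray in, GpuArray out, no host round trip
> print(np.asarray(out_gpu))  # copy back to host only when you need to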
>
> On Wednesday, May 10, 2017 at 5:02:20 AM UTC+8, Alexander Botev wrote:
>>
>> That does not seem to work. So I have this:
>>
>> a = T.fmatrix()
>> ctx = pygpu.init(theano.config.device)
>> theano.gpuarray.reg_context("mine", ctx)
>> a_gpu = theano.gpuarray.GpuArrayType(a.dtype, a.broadcastable, "mine")
>> f2 = theano.function([a_gpu], a + T.constant(2), givens={a: a_gpu})
>> return f1, f2
>>
>>
>> However, Theano complains about:
>>
>> TypeError: Unknown parameter type: <class
>> 'theano.gpuarray.type.GpuArrayType'>
>>
>> If instead of [a_gpu] I use [a], it complains that the givens is
>> overwriting an input:
>>
>> RuntimeError: You are trying to replace variable '<TensorType(float32,
>> matrix)>' through the `givens` parameter, but this variable is an input to
>> your function. Replacing inputs is currently forbidden because it has no
>> effect. One way to modify an input `x` to a function evaluating f(x) is to
>> define a new input `y` and use `theano.function([y], f(x), givens={x:
>> g(y)})`. Another solution consists in using `theano.clone`, e.g. like this:
>> `theano.function([x], theano.clone(f(x), replace={x: g(x)}))`.
>>
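>> For what it's worth, the error message's own theano.clone suggestion would
>> look something like this here. A sketch, not tested; it assumes the "mine"
>> context registered above, and note the trailing () so a_gpu is a *variable*
>> of the GpuArrayType rather than the type object itself:
>>
>> import theano
>> import theano.gpuarray
>> import theano.tensor as T
>>
>> a = T.fmatrix()
>> a_gpu = theano.gpuarray.GpuArrayType(a.dtype, a.broadcastable, "mine")()
>> b = theano.clone(a + T.constant(2),
>>                  replace={a: theano.gpuarray.host_from_gpu(a_gpu)})
>> f2 = theano.function([a_gpu], b)
>>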
>>
>> On Tuesday, 9 May 2017 15:19:10 UTC+1, Adam Becker wrote:
>>>
>>> In the main graph, replace the input variables with variables of type
>>> theano.gpuarray.GpuArrayType (this can be done with the givens parameter
>>> of theano.function). Then feed a pygpu.gpuarray.GpuArray object directly
>>> to the compiled function. pygpu.gpuarray.asarray can be used to move a
>>> numpy array to the GPU.
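>>>
>>> A small sketch of the data-movement part (assuming a context obtained
>>> from pygpu.init, as in the snippet further up in this thread):
>>>
>>> import numpy as np
>>> import pygpu
>>>
>>> ctx = pygpu.init('cuda0')                       # or theano.config.device
>>> x = np.ones((3, 4), dtype='float32')
>>> x_gpu = pygpu.gpuarray.asarray(x, context=ctx)  # now a GpuArray on the GPU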
>>>
>>> On Tuesday, May 9, 2017 at 5:01:42 PM UTC+8, Alexander Botev wrote:
>>>>
>>>> Actually, one thing I've just realized is that to do this consistently I
>>>> need access to the underlying pygpu Context that Theano uses. Is there
>>>> any way to get it?
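>>>>
>>>> If I remember the gpuarray backend right, the context Theano registered
>>>> for the default device can be fetched roughly like this -- worth
>>>> verifying against your Theano version, since this API has moved around:
>>>>
>>>> from theano.gpuarray.type import get_context
>>>> ctx = get_context(None)  # None is the name of the default context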
>>>>
>>>> On Tuesday, 9 May 2017 09:53:02 UTC+1, Alexander Botev wrote:
>>>>>
>>>>> So recently I was wondering: after compiling a Theano function, is
>>>>> there any way for it to accept as input something like a libgpuarray
>>>>> array, or anything else that already lives on the GPU, rather than
>>>>> numpy arrays / native lists / native numbers? I know that when the
>>>>> function is compiled for the GPU, the computation graph usually
>>>>> contains a transfer op. Is there a way to avoid that transfer?
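>>>>>
>>>>> To make the question concrete, this is the kind of graph I mean (a
>>>>> sketch; the node names in the comments are what I'd expect from the
>>>>> gpuarray backend with device=cuda*, not output pasted from a run):
>>>>>
>>>>> import theano
>>>>> import theano.tensor as T
>>>>>
>>>>> x = T.fmatrix()
>>>>> f = theano.function([x], x + 2.)
>>>>> theano.printing.debugprint(f)
>>>>> # typically shows HostFromGpu(gpuarray) at the top and a
>>>>> # GpuFromHost(gpuarray) feeding the GpuElemwise -- those are the
>>>>> # transfer ops I'd like to avoid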
>>>>>
>>>>>
>>>>>