Note: I made an issue about this:

https://github.com/Theano/Theano/issues/6287

Fred

On Mon, Jul 3, 2017 at 7:51 AM Frédéric Bastien <frederic.bast...@gmail.com>
wrote:

> This is still experimental and we don't have to work on it now.
>
> For multiple GPUs, you should do data parallelism. There are 3 frameworks
> that can help you: theano-mpi, platoon, and synkronous.
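>
> Roughly, what data parallelism means here (a minimal, framework-agnostic
> sketch in plain numpy, not the API of any of those libraries): each GPU gets
> its own slice of the minibatch, computes gradients on that slice, and the
> gradients are averaged before the shared parameters are updated.
>
> import numpy as np
>
> def grad(w, x, y):
>     # gradient of the squared loss 0.5 * ||x.dot(w) - y||^2 w.r.t. w
>     return x.T.dot(x.dot(w) - y)
>
> rng = np.random.RandomState(0)
> x = rng.randn(64, 10).astype('float32')   # one full minibatch
> y = rng.randn(64).astype('float32')
> w = np.zeros(10, dtype='float32')         # parameters shared by all workers
>
> # split the minibatch across two workers (one per GPU in practice) and
> # average their gradients before applying the update
> g0 = grad(w, x[:32], y[:32])
> g1 = grad(w, x[32:], y[32:])
> w -= 0.01 * (g0 + g1) / 2.0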
>
> Fred
>
> On Sat, Jul 1, 2017 at 4:33 PM Ramana Subramanyam <vxrram...@gmail.com>
> wrote:
>
>> Hi,
>> The error that I reported was solved using the
>> flag optimizer_excluding=fusion. However, when I try to use multiple GPUs,
>> I get this error:
>>
>> ERROR (theano.gof.opt): Optimization failure due to:
>> LocalOptGroup(local_abstractconv_cudnn,local_abstractconv_gw_cudnn,local_abstractconv_gi_cudnn,local_abstractconv_gemm,local_abstractconv3d_gemm,local_abstractconv_gradweights_gemm,local_abstractconv3d_gradweights_gemm,local_abstractconv_gradinputs_gemm,local_abstractconv3d_gradinputs_gemm)
>> ERROR (theano.gof.opt): node: AbstractConv2d{convdim=2, border_mode=(4,
>> 3), subsample=(1, 1), filter_flip=False, imshp=(None, None, None, None),
>> kshp=(None, None, None, None), filter_dilation=(1, 1)}(X,
>> CIFAR10.pixelCNN.pxCNN.vstack1.filter)
>> ERROR (theano.gof.opt): TRACEBACK:
>> ERROR (theano.gof.opt): Traceback (most recent call last):
>>   File
>> "/home/akshat/anaconda2/envs/ramana-test/lib/python2.7/site-packages/theano/gof/opt.py",
>> line 1982, in process_node
>>     replacements = lopt.transform(node)
>>   File
>> "/home/akshat/anaconda2/envs/ramana-test/lib/python2.7/site-packages/theano/gof/opt.py",
>> line 1335, in transform
>>     new_repl = opt.transform(node)
>>   File
>> "/home/akshat/anaconda2/envs/ramana-test/lib/python2.7/site-packages/theano/gpuarray/dnn.py",
>> line 2816, in local_abstractconv_cudnn
>>     ctx = infer_context_name(*node.inputs)
>>   File
>> "/home/akshat/anaconda2/envs/ramana-test/lib/python2.7/site-packages/theano/gpuarray/basic_ops.py",
>> line 122, in infer_context_name
>>     raise ValueError("Could not infer context from inputs")
>> ValueError: Could not infer context from inputs
>>
>> I used these THEANO_FLAGS:
>> contexts=dev0->cuda1;dev1->cuda3,floatX=float32,optimizer_excluding=fusion.
>> The same flags work well with the import and the sample code on this page
>> <http://deeplearning.net/software/theano/tutorial/using_multi_gpu.html>.
>> This is my first time using multiple GPUs; I apologise if I have made some
>> trivial mistake.
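>>
>> For reference, the pattern on that tutorial page looks roughly like this (a
>> minimal sketch, assuming the contexts flag above so that dev0 and dev1 are
>> defined): each shared variable gets an explicit target context, which is
>> what lets Theano infer which GPU an op should run on.
>>
>> import numpy
>> import theano
>> import theano.tensor as T
>>
>> # one pair of matrices per context named in the contexts flag
>> v01 = theano.shared(numpy.random.random((1024, 1024)).astype('float32'),
>>                     target='dev0')
>> v02 = theano.shared(numpy.random.random((1024, 1024)).astype('float32'),
>>                     target='dev0')
>> v11 = theano.shared(numpy.random.random((1024, 1024)).astype('float32'),
>>                     target='dev1')
>> v12 = theano.shared(numpy.random.random((1024, 1024)).astype('float32'),
>>                     target='dev1')
>>
>> # each dot product runs on the GPU that holds its inputs
>> f = theano.function([], [T.dot(v01, v02), T.dot(v11, v12)])
>> f()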
>>
>> Ramana
>>
>>
>> On Tuesday, June 27, 2017 at 11:50:12 PM UTC+5:30, Ramana Subramanyam
>> wrote:
>>>
>>> Hi Fred,
>>> Since there wasn't any \n in the output, it was all on the same line. You
>>> have to scroll left/right on this link <http://dpaste.com/0SSEM4E>. I am
>>> pasting a smaller copy of it below:
>>>
>>>
>>> (Composite{Switch((LT(i0, i1), i1, i0)}(Composite{Switch(GE(i0, i1), i1,
>>> i0)}(i0, i1), i2), i3), Composite{Switch(LT(i0, i1), i1,
>>> i0)}(Composite{Switch(GE(i0, i1), i1, i0)}(i0, i1), i2), i3) + i4)}(i8,
>>> Composite{((i0 + i1) - i2)}(i2, Composite{Switch(LT(Composite{Switch(GE(i0,
>>> i1), i1, i0)}(Composite{Switch(LT(i0, i1), i2, i0)}(Composite{((i0 + i1) -
>>> i2)}(i0, i1, i2), i3, i4), i5), i3), i3, Composite{Switch(GE(i0, i1), i1,
>>> i0)}(Composite{Switch(LT(i0, i1), i2, i0)}(Composite{((i0 + i1) - i2)}(i0,
>>> i1, i2), i3, i4), i5))}(i1, Composite{Switch(LT(Composite{Switch(GE(i0,
>>> i1), i1, i0)}(Composite{Switch(LT(i0, i1), i2, i0)}(Composite{((i0 + i1) -
>>> i2)}(i0, i1, i2), i3, i4), i5), i3), i3, Composite{Switch(GE(i0, i1), i1,
>>> i0)}(Composite{Switch(LT(i0, i1), i2, i0)}(Composite{((i0 + i1) - i2)}(i0,
>>> i1, i2), i3, i4), i5))}(i2, i3, i4, i5, i6, Composite{((i0 + i1) - i2)}(i7,
>>> i3, i4)), Composite{(Switch(LT(Composite{Switch(LT(i0, i1), i1,
>>> i0)}(Composite{Switch(GE(i0, i1), i1, i0)}(i0, i1), i2), i3),
>>> Composite{Switch(LT(i0, i1), i1, i0)}(Composite{Switch(GE(i0, i1), i1,
>>> i0)}(i0, i1), i2), i3) + i4)}(i8, Composite{((i0 + i1) - i2)}(i7, i3, i4),
>>> i5, Composite{Switch(LT(Composite{Switch(GE(i0, i1), i1,
>>> i0)}(Composite{Switch(LT(i0, i1), i2, i0)}(Composite{((i0 + i1) - i2)}(i0,
>>> i1, i2), i3, i4), i5), i3), i3, Composite{Switch(GE(i0, i1), i1,
>>> i0)}(Composite{Switch(LT(i0, i1), i2, i0)}(Composite{((i0 + i1) - i2)}(i0,
>>> i1, i2), i3, i4), i5))}(i2, i3, i4, i5, i6, Composite{((i0 + i1) - i2)}(i7,
>>> i3, i4)), i9), i5, i6, Composite{((i0 + i1) - i2)}(i2,
>>> Composite{Switch(LT(Composite{Switch(GE(i0, i1), i1,
>>> i0)}(Composite{Switch(LT(i0, i1), i2, i0)}(Composite{((i0 + i1) - i2)}(i0,
>>> i1, i2), i3, i4), i5), i3), i3, Composite{Switch(GE(i0, i1), i1,
>>> i0)}(Composite{Switch(LT(i0, i1), i2, i0)}
>>>
>>> On Tuesday, June 27, 2017 at 11:44:11 PM UTC+5:30, nouiz wrote:
>>>>
>>>> The output you gave doesn't show the infinite loop being printed. Can you
>>>> give me a longer output?
>>>>
>>>> On Tue, Jun 27, 2017 at 2:08 PM Ramana Subramanyam <vxrr...@gmail.com>
>>>> wrote:
>>>>
>>>>> Hi,
>>>>> I was trying to benchmark the reimplementation of PixelCNN code
>>>>> <https://github.com/kundan2510/pixelCNN> by Kundan Kumar. I ran into the
>>>>> following error while compiling the validation function. I deleted the
>>>>> cache and restarted, but the error persists. The function that raises the
>>>>> error is pretty much the same as the previous Theano function that
>>>>> compiled successfully (same in terms of input and output), except that
>>>>> the one that gives the error doesn't have the updates parameter. This is
>>>>> the traceback of the error, http://dpaste.com/33R4231. If I don't stop
>>>>> the script (KeyboardInterrupt doesn't work, so I stopped it using
>>>>> Ctrl+Z), it continues to print infinitely on the terminal, just like this
>>>>> <http://dpaste.com/0SSEM4E>. I implemented it from scratch (but with some
>>>>> modifications) and all the functions compiled well for me. I'd be glad
>>>>> for any help :-)
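>>>>>
>>>>> For context, this is roughly what I mean by "with and without the updates
>>>>> parameter" (a minimal sketch with hypothetical names, not the actual
>>>>> PixelCNN code):
>>>>>
>>>>> import numpy
>>>>> import theano
>>>>> import theano.tensor as T
>>>>>
>>>>> x = T.matrix('x')
>>>>> w = theano.shared(numpy.ones((3, 3), dtype='float32'), name='w')
>>>>> cost = T.sqr(T.dot(x, w)).sum()
>>>>> grad_w = T.grad(cost, w)
>>>>>
>>>>> # training function: compiled with an updates parameter
>>>>> train = theano.function([x], cost, updates=[(w, w - 0.01 * grad_w)])
>>>>>
>>>>> # validation function: same input and output, but no updates
>>>>> validate = theano.function([x], cost)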
>>>>>
>>>>> Regards,
>>>>> Ramana
>>>>>
>>
>
